A few months ago, I saw a photo of myself used by another Facebook account with "R.I.P. (Rest in Peace)" superimposed on it. Apparently, it was done by a woman I had argued with on Facebook over some issues. I immediately reported the photoshopped photo and the profiles of those who shared it and used it as their profile picture. But when the report response from Facebook arrived, it said that the use of my photo to threaten my life did not violate the "Community Standards."
My story is just one of many instances in which Facebook tolerates harassment on its platform, whether or not the victim reports it, and whether or not the reported content suggests violence.
Facebook prides itself on being a social media platform that aims to build communities and bring people together. As the tech giant's popularity has grown, people from all walks of life have established Facebook identities over the past years. But as a community, Facebook has also seen people behave badly on its platform. Indeed, Facebook has become an alternative universe for its users, an online microcosm of society.
Bullies, racists, sexists, sexual harassers, even terrorists exist on Facebook. Their presence is why Facebook established a set of "Community Standards" for users to follow, so that everyone can have a harmonious experience on the platform.
“We recognize how important it is for Facebook to be a place where people feel empowered to communicate, and we take our role in keeping abuse off our service seriously. That’s why we have developed a set of Community Standards that outline what is and is not allowed on Facebook,” the company says on its website.
“The goal of our Community Standards is to encourage expression and create a safe environment. We base our policies on input from our community and from experts in fields such as technology and public safety.”
To facilitate its policing of inappropriate and harassing content on the platform, Facebook allows users to report posts so that the company can take down problematic material.
But Facebook is not doing a good job of responding to these reports or acting on them.
There have been several instances where Facebook did not respond to reports, even in cases involving actual danger. Early this month, during the Christchurch shooting, a live video of the attack was broadcast on the social media site, and Facebook did not take it down until more than 15 minutes into the video. In its defense, Facebook said its response was late because no one had reported the video's existence.
However, Jared Holt, a reporter for Right Wing Watch, said he was alerted to the live stream and reported it immediately during the attack. “I was sent a link to the 8chan post by someone who was scared shortly after it was posted. I followed the Facebook link shared in the post. It was mid-attack, and it was horrifying. I reported it,” Holt tweeted. “Either Facebook is lying, or their system wasn’t functioning properly.”
“I definitely remember reporting this, but there’s no record of it on Facebook. It’s very frustrating,” Holt said. “I don’t know that I believe Facebook would lie about this, especially given the fact law enforcement is likely asking them for the info, but I’m so confused as to why the system appears not to have processed my flag.”
Because Facebook failed, for whatever reason, to respond to Holt's report in time, a video of the attack, an affront to the people of New Zealand and to Muslims around the world, is now circulating on the internet.
In an environment where content, problematic or not, can be broadcast to a very wide audience, time is of the essence. The longer a harassing post or image stays in circulation, the more people can see it, aggravating the harassment that has already taken place.
There are also stories of people who said they were unable to get Facebook to take down fake news or other harassing content unless a surge of reports came in from many users. Conversely, content reported by many people has been taken down even when it did not actually violate the Community Standards.
This is very problematic, and Facebook should do better. Facebook's inability to respond appropriately to reports made by its users not only puts some people in real danger; it also sends a message that Facebook does not care about the safety and security of its users.
At the end of the day, Facebook has the responsibility to protect its users from harassment. The company may have acknowledged that responsibility by setting its Community Standards, but it has done a poor job of upholding them, and it has to improve. /apr