Facebook has pushed back against the data privacy criticism aimed at the social networking site. Its response to the allegations? A new artificial intelligence tool intended to strengthen the site's safeguards against the misuse of users' data and images.
The media giant announced on Friday that it would launch an artificial intelligence-powered tool intended to help Facebook detect "revenge porn," which has devastating consequences for those who appear in the photos. The technology is designed to proactively detect nude images or intimate videos shared without permission across Facebook and Instagram.
Revenge porn, as defined by TechCrunch, refers to revealing or sexually explicit images or videos of a person posted on the Internet, usually by a former sexual partner, without the subject's consent, in order to cause them distress or embarrassment.
The new technology supplements Facebook's recent program requiring trained representatives to review offending images posted or uploaded to Facebook.
In a blog post, the social networking giant explained the role of this AI technology so that users could grasp how it works. According to the company, machine learning and artificial intelligence now let Facebook proactively detect near-nude images or videos as they spread online. In other words, Facebook can recognize such content first and remove the post automatically before it reaches the public.
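The detect-then-review flow described above can be sketched roughly as follows. This is a minimal illustration, not Facebook's actual system: the classifier here is a stub standing in for a trained machine-learning model, and all names, thresholds, and statuses are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical confidence cutoff above which content is hidden and queued.
REVIEW_THRESHOLD = 0.8

@dataclass
class Post:
    post_id: str
    image_bytes: bytes

def nudity_score(image: bytes) -> float:
    """Stub for a trained ML classifier returning the probability that
    an image is near-nude content. A real system would run a neural
    network here; this placeholder just looks for a marker byte string."""
    return 0.9 if b"nude" in image else 0.1

def moderate(post: Post, review_queue: list) -> str:
    """Score an upload; high-confidence matches are hidden and routed
    to human reviewers, mirroring the flow the article describes."""
    score = nudity_score(post.image_bytes)
    if score >= REVIEW_THRESHOLD:
        review_queue.append(post.post_id)  # a human reviewer decides next
        return "hidden_pending_review"
    return "published"

queue: list = []
print(moderate(Post("p1", b"vacation photo"), queue))   # published
print(moderate(Post("p2", b"nude image data"), queue))  # hidden_pending_review
```

The key design point is that the model only gates content for review; the final removal decision stays with a human, as the next paragraph explains.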
Now, the question is: who reviews this content, and who decides whether the images or videos are sexually explicit? Generally, a member of Facebook's community operations team reviews the material flagged by the technology, and if it is found to violate the rules, the team has the power to remove the content and disable the account responsible.
The new AI technology differs from Facebook's earlier pilot of photo-matching technology, which allowed people to send their intimate photos and videos directly to Facebook so that the site could generate a digital fingerprint of each image and stop it from being shared online. The new system does not require the victim's involvement, which matters because victims are sometimes too afraid to report the abuse themselves, and in many cases they do not even know that their photos or videos are circulating online.
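The earlier pilot's "digital fingerprint" idea is usually implemented with perceptual hashing: the reported image is reduced to a compact hash, and new uploads are compared against the stored hashes rather than the images themselves, so near-duplicates still match. A minimal sketch of that idea, using a simple average hash over an 8x8 grayscale grid (a generic technique for illustration only; Facebook's actual matching algorithm is not public):

```python
def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grid of grayscale
    values: each bit is 1 if that pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(image_hash, blocklist, max_distance=5):
    """Block an upload if its hash is close to any reported image's hash."""
    return any(hamming(image_hash, h) <= max_distance for h in blocklist)

# Toy 8x8 "images": a reported image, a lightly edited copy, and an
# unrelated gradient image.
reported = [[10] * 8 for _ in range(4)] + [[200] * 8 for _ in range(4)]
slightly_edited = [[12] * 8 for _ in range(4)] + [[198] * 8 for _ in range(4)]
unrelated = [[i * 4 for i in range(8)] for _ in range(8)]

blocklist = {average_hash(reported)}
print(matches_blocklist(average_hash(slightly_edited), blocklist))  # True
print(matches_blocklist(average_hash(unrelated), blocklist))        # False
```

Because the hash tolerates small pixel changes, re-encoded or lightly cropped copies of a reported image can still be caught, which is the property such fingerprinting systems rely on.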
Moreover, this newly developed program is a pointed answer to Apple and to lawmakers who accused Facebook of improperly paying iPhone users 20 dollars to install an app that could access personal information from their phones. The affair became a worldwide issue that put the networking site in the spotlight and later sparked a feud with Apple.
The issue compounds other controversies, such as accusations of Facebook's apathy toward data privacy. Critics blame the site for failing to protect its users' personal information, a failure they say reflects lax policies on data sharing.
Amid all these accusations, Facebook stands by its principles, maintaining that it has not and will not violate its users' data or become a perpetrator of data hacking. Facebook will also launch a support hub called 'Not Without My Consent' on its Safety Center page for people whose intimate images have been shared without their consent. The hub will give victims access to organizations and resources that can support them while they file reports with Facebook.
Facebook also noted that addressing a sensitive matter such as revenge porn can reduce the mental health problems it currently causes, including anxiety, depression and, at worst, suicidal tendencies.
In the following months, Facebook also plans to build victim support toolkits and will work with the Revenge Porn Helpline (UK), the Cyber Civil Rights Initiative (US), the Digital Rights Foundation (Pakistan), SaferNet (Brazil) and Professor Lee Ji-Yeon (South Korea) to curb the spread of non-consensual intimate imagery globally.
A Reuters tally showed that the Menlo Park, California-based company relies on at least five outsourcing vendors in at least eight countries for content review. As of December 2018, an estimated 15,000 people, a mix of contractors and employees, were working as content reviewers.