YouTube Recommendation Algorithm Remains a Threat as Pedophiles Continue to Exploit It

Barely three months after YouTube was exposed in February as harboring a “soft-core pedophile ring,” The New York Times reports today that the problem persists through the platform’s recommendation algorithm.

YouTube is a massive social media platform that offers endless hours of video content uploaded by millions of people around the world through individual accounts called “channels.” It lets people watch videos according to their liking. However, it has also attracted users with disturbing intentions, including pedophiles who flock to the platform for content that fulfills their fetishes.

In February, YouTuber Matt Watson uploaded a video detailing how the “soft-core pedophile ring” operated on YouTube. He explained that pedophiles posted the initials “CP” (for child pornography) along with video timestamps in the comments, allowing others to skip to the compromising moments of otherwise innocuous videos of children playing in a pool or doing a “bikini haul.”

“On its own, each video might be perfectly innocent, a home movie, say, made by a child. Any revealing frames are fleeting and appear accidental. But, grouped together, their shared features become unmistakable,” the Times states.

The timestamps shared by these “CP” accounts mark the parts of videos where genitals are exposed, or where a child is in a compromising position, such as doing splits or accidentally lifting her shirt too high and exposing her nipples. Some girls in these videos are as young as five years old, yet their videos amass hundreds of thousands, if not millions, of views, with even more comments.

In February, YouTube immediately acknowledged the severity of the issue and moved to address it, deleting hundreds of thousands of accounts tied to pedophilic activity and disabling the comments section on videos featuring children.

“YouTube’s terms of service state that children under the age of 13 aren’t allowed to have their own accounts, but many of these innocuous videos are uploaded by older family members. Many children are also key components to an entire genre on YouTube known as “family vlogging.” Creators like The Ace Family (16.4 million subscribers), Tydus and Cor (2.8 million subscribers), Daily Bumps (4.6 million subscribers), and Roman Atwood Vlogs (15.2 million subscribers), put their kids front and center,” The Verge explains.

YouTube’s initial decision to disable comments on videos involving minors affected the majority of “family vloggers.” The volume of traffic and comments on a video is one of the signals the platform’s recommendation algorithm weighs when deciding which videos a viewer might want to watch next.
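As a rough, hypothetical illustration of why comment counts matter, consider a ranker that scores candidate videos by engagement. YouTube’s actual system is proprietary; the signals, weights, and names below are invented for demonstration only:

```python
# Hypothetical engagement-weighted ranking; YouTube's real system is
# proprietary, and these signals and weights are invented assumptions.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    views: int
    comments: int
    watch_time_hours: float

def engagement_score(v: Video) -> float:
    # Comments weigh heavily here, so a video with comments disabled
    # (comments == 0) scores lower than an otherwise similar one.
    return 0.4 * v.views + 50.0 * v.comments + 10.0 * v.watch_time_hours

def rank_candidates(candidates: list[Video]) -> list[Video]:
    # Highest-scoring videos surface first as "up next" suggestions.
    return sorted(candidates, key=engagement_score, reverse=True)

family_vlog = Video("morning routine", views=120_000, comments=900,
                    watch_time_hours=4_000)
no_comments = Video("bike trip", views=150_000, comments=0,
                    watch_time_hours=5_000)  # comments disabled

for v in rank_candidates([family_vlog, no_comments]):
    print(v.title, round(engagement_score(v)))
```

Under this toy weighting, turning comments off is enough to drop an otherwise more popular video in the ranking, which is why the February comment ban worried family vloggers.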

YouTube’s automated recommendation system, which drives most of the platform’s billions of views by suggesting what users should watch next, sits at the center of what Watson calls a “wormhole effect.”

Mainly, YouTube hooks viewers by recommending videos that are related to, but different from, the one currently being watched. For example, a person watching an Ace Family video about getting the kids ready for school might next be recommended a video of kids playing in a park. In benign contexts, this keeps viewers consuming more content without tiring of a one-dimensional set of videos.
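To make that chaining dynamic concrete, here is a minimal sketch that assumes related videos are chosen by tag overlap; the catalog, tags, and similarity measure are all invented for illustration, and production recommenders use far richer signals:

```python
# Minimal sketch of related-video chaining (the "wormhole effect").

def similarity(tags_a: set[str], tags_b: set[str]) -> float:
    # Jaccard overlap: thematically close videos score highest.
    if not tags_a or not tags_b:
        return 0.0
    return len(tags_a & tags_b) / len(tags_a | tags_b)

def next_video(current: str, catalog: dict[str, set[str]],
               seen: set[str]) -> str:
    # Suggest the most similar video the viewer hasn't watched yet.
    return max(
        (v for v in catalog if v not in seen),
        key=lambda v: similarity(catalog[current], catalog[v]),
    )

catalog = {
    "family morning routine": {"family", "kids", "vlog", "morning"},
    "kids playing in the park": {"kids", "park", "vlog", "outdoors"},
    "pool day with the kids": {"kids", "pool", "vlog", "summer"},
}

watching = "family morning routine"
seen = {watching}
for _ in range(2):
    watching = next_video(watching, catalog, seen)
    seen.add(watching)
    print("Up next:", watching)
```

Because each suggestion stays thematically close to the last, a viewer drifts video to video down an ever-narrower theme, which is the dynamic Watson and the Times describe.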

The moment YouTube starts recommending seemingly random videos of children, pedophiles have an opening to take advantage. According to the Times, YouTube has inadvertently assembled a convenient archive of children in various compromising situations for pedophiles to consume. Parents upload these videos with innocent intentions, such as sharing how their kids enjoyed a moment, much like a home video.

“It’s YouTube’s algorithm that connects these channels,” said Jonas Kaiser, one of three researchers at Harvard’s Berkman Klein Center for Internet and Society who stumbled onto the videos while looking into YouTube’s impact in Brazil. “That’s the scary thing.”

Notably, it’s not pedophiles pointing one another to specific videos; it’s YouTube’s algorithm that suggests videos seemingly popular with other pedophiles, many of which have hundreds of thousands of views and dozens of disturbing comments.

The extraordinary view counts and comments indicated that the system had found an audience for the videos and was “keeping that audience engaged,” said Yasodara Córdova, a researcher who has studied the distribution of online pornography.

On the other hand, completely disabling recommendations for channels featuring minors would hit family vloggers the hardest. Established channels could lose millions of views, while up-and-coming family-oriented channels could find it hard to grow on YouTube, where being a creator has become a source of income for many.

YouTube is currently working on a balance that would eliminate the threat posed by pedophiles while considering the position of family vloggers on its platform.
