YouTube and the Monetization of Inappropriate Child Content
Matt Watson, a YouTuber, posted a video on February 17 detailing what he calls a “softcore pedophile ring” operating on YouTube. He posted a link to the video on Reddit, where it drew comments of outrage and concern.
Since then, YouTube has received backlash for recommending videos to users that sexually exploit children. The story comes just a year after a similar scandal involving YouTube’s monetization of inappropriate child content.
In his video, Watson gives an extensive walkthrough of the YouTube “wormhole,” showing how searching “bikini haul” leads to recommended videos of young girls with titles like “gymnastics video” or “popsicle challenge.” After two clicks, every recommended video in his sidebar features young girls with inappropriate thumbnails. The videos Watson clicks on are filled with sexually charged comments from pedophiles, ranging from eggplant emojis to timestamps marking moments where the girls are in sexualized positions.
Worse, the recommended videos are not only uploaded by the young girls themselves. Pedophiles download these videos and re-upload them to their own channels, where large followings of other sexual predators deepen the “wormhole.” Links to child pornography sites and chatrooms are shared among these pedophile communities on YouTube.
Some of these heinous videos carry ads from companies like Banggood.com.
On Wednesday, February 20, TechCrunch reported that companies such as McDonald’s, Epic, and Disney have suspended ads on all YouTube videos in what has been called the “Adpocalypse.” Companies such as Peloton and Grammarly are calling on YouTube to address the issue and find a solution.
“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts. We continue to invest heavily in technology, teams and partnerships with charities to tackle this issue. We have strict policies that govern where we allow ads to appear and we enforce these policies vigorously. When we find content that is in violation of our policies, we immediately stop serving ads or remove it altogether,” a YouTube spokesperson said in response.
The spokesperson also acknowledged the difficulty of deciding what YouTube’s algorithms should moderate with regard to child content, noting that more than 400 hours of video are uploaded every minute while the platform has only 10,000 human reviewers. YouTube’s perceived inaction has sparked the hashtag #YouTubeWakeUp, and creators across the platform are protesting by halting their channels or threatening to move to other sites.
“Send it to local news outlets, send it to YouTube, send it to Buzzfeed, whoever,” Watson said near the end of his video.