Companies are pulling their advertising campaigns from YouTube following reports that a network of pedophiles is openly active in the comments of videos featuring young children, Bloomberg reported on Wednesday. Disney and Nestlé are reportedly among those who pulled their money after a YouTube video detailing the ongoing problem surfaced.
A video released by YouTube creator Matt Watson on Sunday described what he calls a "soft-core pedophile ring" operating in the comments of videos of children, especially young girls. These videos, monetized by the company, are flooded with comments from apparent pedophiles who trade contact information and links to child pornography. They also leave timestamps marking what Watson described as "points in the video where little girls are in compromising positions, in sexually implicit positions."
Watson described the YouTube algorithm that surfaces these videos as a "wormhole" of exploitative content. Once a YouTube user clicks on a few of these videos, their suggested-content column becomes flooded almost entirely with videos of children.
Wired was able to replicate Watson's claims, saying that the videos it came across often showed little girls playing, swimming, or eating popsicles, and in some cases more graphic content. Once a few of these videos had been viewed, Wired said, YouTube's interface recommended videos that appeared to be popular with other pedophiles. In many cases, the outlet reported, these videos of children had collected hundreds of thousands or even millions of views and carried pre-roll ads.
Businesses are now distancing themselves from the controversy. "I can confirm that all Nestlé companies in the US have stopped advertising on YouTube," a Nestlé spokesperson said in an emailed statement to Gizmodo. Bloomberg cited sources claiming that Disney followed suit, though the company did not immediately return a request for comment.
A spokesperson for Epic Games, the developer behind Fortnite, told Wired that the company had paused advertising through its ad agency and had "contacted YouTube to determine what actions will be taken to remove this type of content from their services." Grammarly told Wired that it had also contacted YouTube.
Disturbing and predatory comments on YouTube videos of children prompted a similar response from advertisers in 2017. The company said at the time that it was working to fix the problem, but it appears to remain a persistent issue on the site.
A YouTube spokesperson said the company is working to solve the problem and has disabled comments on millions of videos of children. The company has also terminated more than 400 accounts belonging to commenters on these videos, and removed some videos thought to put young people at risk. The spokesperson added that YouTube reports any illegal comments to the National Center for Missing and Exploited Children.
"Any content – including commentary – that puts minors at risk is disgusting and we have clear guidelines prohibiting it on YouTube," a YouTube spokesperson said in a statement via email. "We took immediate action by deleting accounts and channels, reporting illegal activity to authorities, and turning off comments on millions of videos, including minors. There is more to do, and we continue to work to speed up and combat the abuse.