In yet another step to scrape pedophiles off the bottom of its shoe, YouTube announced on Monday that it’s banning youngsters from live-streaming without adult supervision and that it’s limiting recommendations of videos that depict “minors in risky situations.”
In February, YouTube disabled comments on millions of videos featuring minors, in response to reports that creeps were leaving disgustingly sexual comments on videos featuring kids doing things like yoga or gymnastics, or playing games such as Twister.
At the same time, YouTube also implemented a classifier – a machine learning tool that helps to identify specific types of content – that it says helped it remove a significant number of violative comments.
It didn’t catch them all. On Monday, the New York Times published a writeup of research into YouTube’s automated recommendation system, which suggests what to watch next and drives most of YouTube’s billions of views. Months after the move to disable comments on kids’ videos, the system was still suggesting videos of partially clothed kids (think two-piece swimsuits) to users who had watched “other videos of prepubescent, partially clothed children.”
Three researchers at Harvard’s Berkman Klein Center for Internet and Society – Jonas Kaiser, Yasodara Córdova and Adrian Rauchfleisch – stumbled onto the videos while looking into YouTube’s impact in Brazil, the Times reports.
They set up a server to open videos, then followed YouTube’s top recommendations for what to watch next. What they found was that you don’t need to look for videos of children to end up watching them: rather, you get sucked into a “black hole” where the platform offers a progression of recommendations that spirals in until you’re watching videos of young children. From the Times:
So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children’s clothes. Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split.
YouTube won’t stop recommending kid videos
When alerted to the clusters of videos created by YouTube’s recommendation system, the platform took many down. But it hasn’t made the one change that the researchers say would fix the problem: turning off its recommendation system on videos of children, even though the platform can automatically identify such videos.
YouTube says that doing so would hurt the video creators who rely on those clicks. As it is, creators have repeatedly felt punished for doing nothing wrong. The first instance came after the so-called Adpocalypse 1.0, when major advertisers yanked their ads from YouTube videos that espoused extremism and hate speech. YouTube responded by abruptly rolling out changes to its automated processes for placing ads across the platform.
The other move that caused creators to complain came after Adpocalypse 2.0 in February, when advertisers again pulled their ads in response to pedophiles filling up the comments sections, prompting YouTube to disable comments.
Though it’s shying away from ticking off content creators yet again, YouTube did say that it would limit recommendations on videos that it thinks put children at risk.
It said that it started reducing recommendations of “borderline content” earlier this year. Borderline content itself isn’t violative, YouTube said, but it’s aware that it could put minors at risk of online or offline exploitation. The platform says it’s already applied the changes to tens of millions of videos.
No more live-streaming without an adult nearby
In Monday’s announcement, YouTube said that it had updated its live-streaming policy to disallow “younger minors” from live-streaming unless there’s clearly an adult nearby. Channels that don’t comply could lose their ability to live-stream.
The platform also launched new classifiers to find and remove the now-violative content.
From the blog post:
Responsibility is our number one priority, and chief among our areas of focus is protecting minors and families.
With this update, we’ll be able to better identify videos that may put minors at risk and apply our protections […] across even more videos.