YouTube's Algorithm Keeps Suggesting Home Videos Of Kids To Pedophiles
SANTA CLARA (KPIX 5) -- YouTube is coming under fire following a new report that says the site's recommendation algorithm suggests home videos featuring children to pedophiles.
Citing an "extra cautious approach" when dealing with kids' safety, the San Bruno-based company has announced expanded efforts to protect underage users from exploitation and abuse.
Harvard researcher Jonas Kaiser was researching the effects of YouTube on Brazilian society when he made the disturbing discovery.
"50 or so channels that had basically content that was all sexual in nature or sexualized, some of them over age, some of them definitely underage. And from there, we then looked at YouTube's recommendations from these channels and basically stumbled into this world of child exploitation videos," Kaiser explained.
Kaiser was referring to "outlier" videos, seemingly innocent clips of "children being children" at swim meets, gymnastics competitions, or ballet performances, but with unusually high view counts, often 250,000 or more. The high view counts were all the more noticeable given that other videos uploaded by the same user would draw just a few dozen views.
"The only real difference with this video with more views was the child had less clothing on," Kaiser said. "Some of these videos were picked out for whichever reason and then recommended through the algorithm."
"This just shines a spotlight on how algorithms work, that they will just keep showing people what they want to see. In this case, it caters to pedophiles," Kaiser said.
Kaiser's research was featured in the New York Times with the headline "On YouTube's Digital Playground, An Open Gate For Pedophiles."
That story came out shortly after a report earlier in the year by Wired magazine, which also found pedophiles communicating with each other in the comments sections of some videos on the site.
KPIX 5 reached out to YouTube, which responded by pointing to a lengthy blog post saying it had disabled comments on videos featuring minors, reduced recommendations of videos featuring minors in risky situations, and deleted 800,000 videos for violating its child safety policies.
Irina Raicu, the director of the Internet Ethics Program at Santa Clara University, says YouTube has an obligation to educate users about the risks and dangers of uploading videos to its site.
"I think if they really mean that their goal is to protect children, they're not doing nearly enough, and they're not moving quickly enough," Raicu said. "Should there be something that YouTube does to inform people that this is what's going to happen? I absolutely think they do have that responsibility."
Kaiser said he's encouraged by YouTube's response after passing along his research findings, but he also said he'll never post photos or videos of his children online.
"For now, they are implementing changes which have been heartening, and we'll see how much good this will bring and how it will protect children. But, the first steps seem to be good," Kaiser said.