YouTube algorithms consistently push eating disorder and self-harm content to teen girls, new study finds
Anna Mockel was 14 and suddenly obsessed with losing weight. It was spring 2020, and she had just graduated eighth grade remotely. Housebound and nervous about the transition to high school that coming fall, she sacrificed innumerable hours that COVID lockdown summer shuffling between social media apps.
Anna spent a lot of time on YouTube "not searching for anything in particular," just watching what popped up in her feed. She remembers the spiraling thoughts started when she'd watch videos featuring girls who were a bit older and invariably skinny. The more Anna watched, the more these videos would clog her feed, and the more determined she was to look like the girls in the videos.
As she clicked and tapped, YouTube's "Up Next" panel of recommended videos started morphing from content featuring skinny girls to "how-tos" on losing weight. Diet and exercise videos began to dominate Anna's account. As she kept watching, she says, the content intensified, until her feed was flooded with videos glorifying skeletal-looking bodies and hacks for sustaining a 500-calorie daily diet. (The recommended daily caloric intake for adolescent girls is 2,200 calories.)
"I didn't know that that was even a thing online," Anna says of the eating disorder content recommended to her. "A lot of it just came up in my feed, and then I gravitated towards that because it's what was already going on for me."
Anna copied what she saw, restricted her diet and began losing weight at an alarming pace. At 14, she says that she was aware of eating disorders but "didn't connect the dots" until she was diagnosed with anorexia. Over the next two years, she would endure two hospitalizations and spend three months at a residential treatment center before beginning her recovery at age 16.
Now 18 and a high school senior, she asserts that social media, YouTube in particular, perpetuated her eating disorder.
"YouTube became this community of people who are competitive with eating disorders," she says. "And it kept me in the mindset that [anorexia] wasn't a problem because so many other people online were doing the same thing."
Now, new research confirms this content was served to Anna intentionally. A report released Tuesday by the Center for Countering Digital Hate (CCDH) asserts that when YouTube users show signs of interest in diet and weight loss, almost 70% of the videos the platform's algorithms push to them contain content likely to worsen or create anxieties about body image.
What's more, the videos average 344,000 views each, nearly 60 times the views of the average YouTube video, and carry ads from major brands like Nike, T-Mobile and Grammarly. It's unclear whether the companies are aware of the ad placements.
"We cannot continue to let social media platforms experiment on new generations as they come of age," says James P. Steyer, Founder and CEO of Common Sense Media, a nonprofit dedicated to educating families about online safety.
He says these platforms are designed to keep viewers' attention even if that means amplifying harmful content to minors.
The report, titled "YouTube's Anorexia Algorithm," examines the first 1,000 videos that a teen girl would receive in the "Up Next" panel when watching videos about weight loss, diet or exercise for the first time.
To collect the data, CCDH's researchers created a YouTube profile of a 13-year-old girl and carried out 100 searches on the video-sharing platform using popular eating disorder keywords such as "ED WIEIAD" (eating disorder, what I eat in a day), "ABC diet" (anorexia boot camp diet) and "safe foods" (a reference to foods with few or no calories). The research team then analyzed the top 10 recommendations YouTube's algorithm pushed to the "Up Next" panel.
The results indicated that almost two-thirds (638) of the recommended videos pushed the hypothetical 13-year-old user further into eating disorder or problematic weight-loss content. One-third (344) of YouTube's recommendations were deemed harmful by the CCDH, meaning the content either promoted or glamorized eating disorders, contained weight-based bullying or showed imitable behavior. Fifty of the videos, the study found, involved self-harm or suicide content.
"There's this anti-human culture created by social media platforms like YouTube," says Imran Ahmed, founder and CEO of the Center for Countering Digital Hate. "Kids today are essentially reeducated by algorithms, by companies teaching and persuading them to starve themselves."
Ahmed says the study illustrates the systemic nature of the issue: YouTube, which is owned by Google, is violating its own policies by allowing this content on the platform.
YouTube is the most popular social media site among teens in the U.S., ahead of TikTok and Instagram, according to the Pew Research Center. Three-quarters of U.S. teens say they use the platform at least once a day. YouTube does not require users to create an account to view content.
The Social Media Victims Law Center, a Seattle-based law firm founded in response to the 2021 Facebook Papers, has filed thousands of lawsuits against social media companies, including YouTube. More than 20 of those suits allege that YouTube is designed to be intentionally addictive and perpetuate eating disorders in its users, particularly among teen girls.
The law firm connected 60 Minutes with a 17-year-old client whose experience mirrors Anna's.
"YouTube taught me how to have an eating disorder," says the 17-year-old, whose lawsuit accuses YouTube of knowingly perpetuating anorexia. She says she created a YouTube account when she was 12. She'd log on to watch dog videos and gymnastics challenges and cooking tutorials. Then, she says, she started seeing videos of girls dancing and exercising. She'd click. YouTube recommended more videos of girls doing more extreme exercises, which turned into videos of diets and weight loss. She kept watching; she kept clicking.
She says her feed became a funnel for eating disorder content, a stream of influencers promoting extreme diets and ways to "stay skinny." She spent five hours a day on YouTube, learning terms like "bulimia" and "ARFID" (avoidant/restrictive food intake disorder). She learned what it meant to "purge" and "restrict" food; she became deeply concerned about caloric intake and her BMI (body mass index).
When she was in seventh grade, she stopped eating. She was diagnosed with anorexia shortly after, and over the next five years, she says she'd spend more time out of school than in it. Now a junior in high school, she's been hospitalized five times and spent months at three residential treatment centers trying to recover from the eating disorder.
"It's just taken my life away pretty much," she reflects.
Asked why algorithms are employed not to protect young users but to intentionally recommend eating disorder content, YouTube declined to comment.
The video-sharing site says it "continually works with mental health experts to refine [its] approach to content recommendations for teens." In April 2023, the platform expanded its policies on eating disorders and self-harm content, adding the ability to age-restrict videos that contain "educational, documentary, scientific or artistic" disordered eating or that discuss "details which may be triggering to at-risk viewers." Under this policy, these videos may be unavailable to viewers under 18.
YouTube has taken steps to block certain search terms like "thinspiration," a word used to find footage of emaciated bodies. However, the CCDH study found that such videos still appear in the "Up Next" panel. And users have learned that by substituting a zero for the letter "O" or an exclamation point for the letter "I," these terms remain searchable on YouTube. One video noted in the report as glorifying skeletal body shapes had 1.1 million views at the time of the analysis; it now has 1.6 million.
As part of the research, CCDH flagged 100 YouTube videos promoting eating disorders, weight-based bullying or showing imitable behavior. YouTube removed or age-restricted only 18 of those videos.