YouTube algorithms systematically spread content about eating disorders and self-harm to teenage girls, new study finds

Anna Mockel was 14 when she became suddenly obsessed with losing weight. It was spring 2020, and she had just graduated from eighth grade remotely. Homebound and nervous about the transition to high school that fall, she spent countless hours during the summer of COVID lockdown jumping from one social media app to another.

Anna spent a lot of time on YouTube "not looking for anything in particular," just watching what appeared in her feed. She remembers the spiraling thoughts that began when she watched videos featuring girls who were a little older and still skinny. The more Anna watched, the more these videos clogged her feed, and the more determined she became to look like the girls in the videos.

As she clicked and typed, YouTube's "Up Next" panel of recommended videos began to shift from content featuring skinny girls to "how-to" weight loss videos. Diet and exercise videos came to dominate Anna's account. As she kept watching, she says, the content intensified, until her feed was flooded with videos glorifying skeletal-looking bodies and tips for maintaining a daily diet of 500 calories. (The recommended daily calorie intake for adolescent girls is 2,200.)

“I didn’t even know it existed online,” Anna says of the eating disorder content she was recommended. “A lot of things popped up in my feed and then I gravitated towards that because that’s what was already happening for me.”

Anna copied what she saw, restricted her diet and began losing weight at an alarming rate. At 14, she says, she was aware that eating disorders existed, but "didn't make the connection" until she was diagnosed with anorexia. Over the next few years, she would undergo two hospitalizations and spend three months in a residential treatment center before beginning her recovery at age 16.

Now 18 and a high school student, she says social media, YouTube in particular, perpetuated her eating disorder.

"YouTube became this community of people competing in eating disorders," she says. "And it reinforced for me that [anorexia] wasn't a problem, because so many other people online were doing the same thing."

Now, new research confirms that content like this was intentionally served to Anna. A report released Tuesday by the Center for Countering Digital Hate (CCDH) finds that when YouTube users show signs of interest in diet and weight loss, nearly 70% of the videos pushed by the platform's algorithms contain content likely to aggravate or create anxieties about body image.

Additionally, the videos average 344,000 views each, nearly 60 times that of an average YouTube video, and are adorned with ads from big brands like Nike, T-Mobile, and Grammarly. It is unclear whether the companies are aware of the ad placements.

“We cannot continue to let social media platforms experiment on new generations as they grow up,” says James P. Steyer, founder and CEO of Common Sense Media, a nonprofit organization dedicated to educating families about online safety.

He says these platforms are designed to hold viewers’ attention, even if that means amplifying content harmful to minors.

The report, titled "YouTube's Anorexia Algorithm," examines the first 1,000 videos a teenage girl would receive in the "Up Next" panel after first watching videos about weight loss, dieting, or exercise.

To collect the data, CCDH researchers created a YouTube profile for a 13-year-old girl and conducted 100 searches on the video-sharing platform using popular eating disorder keywords such as "ED WIEIAD" (eating disorder, what I eat in a day), the "ABC diet" (anorexia boot camp diet), and "safe foods" (a reference to foods containing little or no calories). The research team then analyzed the top 10 recommendations YouTube's algorithm pushed to the "Up Next" panel.

The results indicated that nearly two-thirds (638) of the recommended videos pushed the hypothetical 13-year-old user further toward eating disorder or problematic weight loss content. A third (344) of YouTube's recommendations were judged harmful by the CCDH, meaning the content promoted or glorified eating disorders, contained weight-based bullying, or demonstrated copycat behavior. Fifty of the videos involved self-harm or suicide content, according to the study.

“There is this anti-human culture created by social media platforms like YouTube,” says Imran Ahmed, founder and CEO of the Center for Countering Digital Hate. “Children today are essentially being re-educated by algorithms, by companies teaching them and persuading them to starve.”

Ahmed says the study illustrates the systemic nature of the problem: Google-owned YouTube is violating its own policies by allowing this content on the platform.

YouTube is the most popular social network among teenagers in the United States, ahead of TikTok and Instagram, according to the Pew Research Center. Three-quarters of American teenagers say they use the platform at least once a day. YouTube does not require a user to create an account to view content.

The Social Media Victims Law Center, a Seattle-based law firm founded in response to the 2021 Facebook Papers, has filed thousands of lawsuits against social media companies, including YouTube. More than 20 of these lawsuits allege that YouTube is intentionally designed to create addiction and perpetuate eating disorders among its users, particularly teenage girls.

The law firm connected 60 Minutes with a 17-year-old client. Her experience mirrors Anna’s.

"YouTube taught me how to have an eating disorder," says the 17-year-old, whose lawsuit accuses YouTube of knowingly perpetuating anorexia. She says she created a YouTube account when she was 12, logging in to watch dog videos, gymnastics challenges and cooking. Then, she says, she started seeing videos of girls dancing and exercising. YouTube recommended more videos of girls doing more extreme exercises, which turned into videos about diets and weight loss.

She says her feed became a funnel for eating disorder content: a stream of influencers promoting extreme diets and ways to "stay thin." She spent five hours a day on YouTube, learning terms like "bulimia" and "ARFID" (avoidant/restrictive food intake disorder). She learned what it meant to "purge" and "restrict" food, and she became deeply concerned about her calorie intake and BMI (body mass index).

When she was in seventh grade, she stopped eating. She was diagnosed with anorexia shortly afterward, and over the next five years, she says, she spent more time out of school than in it. Now in high school, she has been hospitalized five times and has spent months in three residential treatment centers trying to recover from her eating disorder.

“It almost cost me my life,” she reflects.

When asked why its algorithms are used not to protect young users but to recommend eating disorder content to them, YouTube declined to comment.

The video-sharing site says it "continuously works with mental health experts to refine [its] approach to content recommendations for teens." In April 2023, the platform expanded its policies on eating disorder and self-harm content, adding the ability to age-restrict videos that feature eating disorder content in an "educational, documentary, scientific or artistic" context, or that discuss "details that may be triggering for at-risk viewers." Under this rule, these videos may be unavailable to viewers under 18.

YouTube has taken steps to block certain search terms like "thinspiration," a word used to find images of emaciated bodies. However, the CCDH study found that such videos still appear in the "Up Next" panel. And users have learned that by swapping the letter "O" for a zero, or the letter "I" for an exclamation point, those terms can still be searched on YouTube. A video described in the report as glorifying skeletal body shapes had been viewed 1.1 million times at the time of analysis; it now has 1.6 million.

As part of the research, CCDH flagged 100 YouTube videos promoting eating disorders, weight-based bullying, or showing copycat behavior. YouTube removed or age-restricted only 18 of these videos.