TikTok Algorithm Promoted Pro-Anorexia and Self-Harm Content to Teen Users
High
Research by the Center for Countering Digital Hate revealed that TikTok's algorithm rapidly promoted pro-anorexia and self-harm content to teen users within minutes of engagement, leading to state attorney general investigations and ongoing litigation.
Category
Safety Failure
Industry
Technology
Status
Ongoing
Date Occurred
Jan 1, 2022
Date Reported
Dec 14, 2022
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
other
Harm Type
physical
People Affected
13,000,000
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
pending
Regulatory Body
State Attorneys General Coalition
TikTok · eating_disorders · self_harm · teens · algorithm · recommendation_system · mental_health · content_moderation · CCDH
Full Description
In December 2022, the Center for Countering Digital Hate (CCDH) published research demonstrating that TikTok's recommendation algorithm systematically promoted dangerous content related to eating disorders and self-harm to teenage users. The study created fake accounts posing as 13-year-old users and tracked how quickly the platform's algorithm began recommending harmful content after users showed initial interest in weight loss or mental health topics.
The research found that TikTok served suicide-related content to the simulated teen accounts within 2.6 minutes of account creation and eating disorder content within 8 minutes. The algorithm rapidly escalated from relatively benign diet content to extremely dangerous material, including videos promoting severe caloric restriction, purging behaviors, and self-injury techniques. On average, the accounts were shown a video related to mental health or body image every 39 seconds, creating an overwhelming stream of potentially triggering material.
TikTok's algorithmic amplification created what researchers termed a 'rabbit hole' effect, where users expressing any interest in weight loss or mental health topics were immediately funneled toward increasingly extreme content. The platform's 'For You Page' algorithm, designed to maximize user engagement and time-on-platform, appeared to treat user distress signals as engagement opportunities rather than warning signs requiring intervention.
Following the CCDH report, multiple state attorneys general launched investigations into TikTok's practices, with particular focus on the platform's impact on teen mental health. Several lawsuits were filed on behalf of families claiming TikTok's algorithm contributed to their children's eating disorders and self-harm behaviors. TikTok responded by announcing policy changes including expanded content warnings and restrictions on diet-related content for users under 18, though critics argued these measures were insufficient given the algorithmic amplification documented in the research.
The incident highlighted broader concerns about recommendation algorithms optimized for engagement without adequate consideration of user wellbeing, particularly for vulnerable populations. With over 150 million U.S. users and approximately 13 million teens actively using the platform, the potential scope of harm raised significant questions about algorithmic accountability and platform responsibility for content amplification patterns that could influence user behavior in dangerous ways.
Root Cause
TikTok's recommendation algorithm was trained to maximize engagement without adequate safeguards against harmful content promotion. The system rapidly amplified dangerous content to users who showed initial interest, creating harmful feedback loops that pushed increasingly extreme material to vulnerable teenagers seeking eating disorder or self-harm content.
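The feedback-loop dynamic described above can be illustrated with a toy ranker. This is a minimal sketch, not TikTok's actual system: all function names, topic labels, and weights are hypothetical. It shows how scoring candidates purely by predicted engagement, boosted by overlap with a user's recent interactions, causes a single sensitive interest signal to pull the most extreme related item to the top of the feed.

```python
# Illustrative sketch only: a toy engagement-maximizing ranker demonstrating
# how a feedback loop can escalate content. All names, topics, and weights
# are hypothetical assumptions for this example.

def rank_by_engagement(candidates, user_history):
    """Rank candidate videos purely by predicted engagement.

    Topic overlap with the user's recent history multiplies the score,
    so any interaction with a sensitive topic pulls more of that topic.
    """
    def score(video):
        overlap = len(set(video["topics"]) & set(user_history))
        return video["predicted_watch_time"] * (1 + overlap)
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"id": 1, "topics": ["cooking"], "predicted_watch_time": 30},
    {"id": 2, "topics": ["weight_loss"], "predicted_watch_time": 45},
    {"id": 3, "topics": ["weight_loss", "extreme_dieting"], "predicted_watch_time": 60},
]

# A single engagement signal ("weight_loss") is enough to rank the most
# extreme candidate first: the distress signal is treated as pure engagement.
feed = rank_by_engagement(candidates, user_history=["weight_loss"])
```

Because nothing in the objective distinguishes a distress signal from an ordinary interest, the most extreme matching item wins on raw score, which is the amplification pattern the CCDH study documented.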
Mitigation Analysis
Implementation of content provenance tracking could have identified harmful content sources and patterns. Human review of flagged eating disorder content, coupled with algorithmic bias testing specifically for vulnerable populations, would have revealed the amplification of dangerous material. Real-time monitoring of recommendation patterns for minors and circuit breakers to prevent rapid escalation to extreme content could have significantly reduced harm.
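A circuit breaker of the kind proposed above could be sketched as a post-ranking filter. This is a hypothetical design under stated assumptions, not a documented TikTok mechanism: it assumes sensitive topics are labeled upstream, and the topic set, window size, and threshold are illustrative placeholders.

```python
# Hypothetical circuit-breaker sketch for a recommendation pipeline.
# Assumes sensitive content is already labeled upstream; the topic set,
# window size, and cap below are illustrative, not real platform values.

SENSITIVE_TOPICS = {"eating_disorder", "self_harm", "extreme_dieting"}
MAX_SENSITIVE_PER_WINDOW = 2   # cap on sensitive items per rolling window
WINDOW = 10                    # rolling window of recently served items

def apply_circuit_breaker(feed, is_minor):
    """Filter a ranked feed so a minor never sees a dense run of sensitive items."""
    if not is_minor:
        return feed
    served = []
    for video in feed:
        recent = served[-WINDOW:]
        sensitive_count = sum(
            1 for v in recent if SENSITIVE_TOPICS & set(v["topics"])
        )
        if SENSITIVE_TOPICS & set(video["topics"]) and \
                sensitive_count >= MAX_SENSITIVE_PER_WINDOW:
            continue  # breaker trips: drop the item instead of escalating
        served.append(video)
    return served
```

The design choice here is that the breaker acts after ranking, so it caps escalation regardless of how aggressively the upstream model scores sensitive content; a real system would also log trips for the human review the analysis recommends.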
Lessons Learned
The incident demonstrates the critical need for algorithmic impact assessments focused on vulnerable populations, particularly when recommendation systems optimize for engagement metrics that may conflict with user safety. Platforms must implement specific protections against harmful content amplification for minors, including circuit breakers and human oversight of algorithmic recommendations in sensitive content areas.
Sources
Deadly by Design: How TikTok's algorithm promotes eating disorder and self-harm content
Center for Countering Digital Hate · Dec 14, 2022 · research report
TikTok served disordered eating content to teens within minutes, study finds
Washington Post · Dec 14, 2022 · news
Study finds TikTok quickly shows eating disorder content to teen accounts
NPR · Dec 14, 2022 · news