NEDA AI Chatbot Tessa Gave Harmful Diet Advice to Eating Disorder Support Seekers
High
NEDA's AI chatbot Tessa gave harmful diet advice, including calorie restriction, to eating disorder support seekers in May 2023, forcing its suspension within days.
Category
Medical Error
Industry
Healthcare
Status
Resolved
Date Occurred
May 31, 2023
Date Reported
Jun 1, 2023
Jurisdiction
US
AI Provider
Other/Unknown
Model
Tessa
Application Type
chatbot
Harm Type
physical
Human Review in Place
No
Litigation Filed
No
Tags
eating_disorder, mental_health, crisis_support, harmful_advice, vulnerable_population, chatbot, NEDA
Full Description
In May 2023, the National Eating Disorders Association (NEDA) launched an AI chatbot named Tessa to replace its human-staffed helpline as part of cost-cutting measures. The organization positioned Tessa as a body-positive chatbot designed to support individuals struggling with eating disorders. Within days of launch, however, the system began giving dangerous advice that directly contradicted eating disorder recovery principles.
Activist Sharon Maxwell documented several concerning interactions with Tessa on May 31, 2023. When Maxwell posed as a 13-year-old seeking advice, Tessa recommended a daily caloric deficit of 500-1,000 calories and offered specific weight-loss strategies, including tracking calories and macronutrients. This advice was especially dangerous given NEDA's mission to support eating disorder recovery, in which calorie restriction and weight-loss advice are considered harmful triggers.
The incident gained widespread attention when Maxwell shared screenshots of the harmful interactions on social media platforms. Mental health professionals and eating disorder advocates expressed outrage that an organization dedicated to eating disorder support would deploy technology that could actively harm vulnerable individuals. The advice provided by Tessa directly contradicted established eating disorder treatment protocols and could potentially trigger relapse in individuals seeking recovery.
NEDA responded swiftly to the crisis, suspending Tessa on June 1, 2023, just one day after the harmful interactions were publicized. The organization initially stated that the chatbot was temporarily unavailable for maintenance but later acknowledged the serious nature of the incident. NEDA faced significant criticism for the decision to replace human crisis counselors with AI technology without adequate safeguards for such a vulnerable population.
The incident highlighted broader concerns about the ethics of replacing human crisis support with AI systems. Mental health professionals argued that eating disorder support requires nuanced understanding, empathy, and clinical expertise that current AI systems cannot safely provide. The timing was particularly controversial because NEDA had recently laid off its human helpline staff, making the AI replacement appear to prioritize cost savings over user safety. Following the incident, NEDA announced it would reevaluate its use of AI technology and work to restore human-supported services.
Root Cause
The chatbot lacked adequate safeguards and training to recognize harmful content patterns. The system was not designed to detect and refuse requests that could trigger eating disorder behaviors.
Mitigation Analysis
Human oversight and specialized clinical review protocols were essential for this vulnerable population but were absent. Content filtering designed specifically for eating disorder triggers, extensive pre-launch testing with mental health professionals, and maintaining human backup support could have prevented this incident. Crisis support systems require clinical expertise that current AI cannot safely replace.
Lessons Learned
AI systems should never replace human crisis support without extensive clinical oversight and fail-safes. Organizations serving vulnerable populations must prioritize safety over cost efficiency when deploying AI technologies.
Sources
AI Chatbot Encouraged Eating Disorder Behaviors
VICE · Jun 1, 2023 · news
Eating disorder helpline disables chatbot for giving harmful advice
NBC News · Jun 2, 2023 · news