
AI Mental Health Apps Shared Sensitive User Data with Advertisers and Third Parties

High

Mozilla research revealed that popular AI-powered mental health apps, including BetterHelp, shared sensitive user therapy data with advertising platforms. The FTC ordered BetterHelp to pay $7.8M in a settlement over the privacy violations.

Category
Privacy Leak
Industry
Healthcare
Status
Resolved
Date Occurred
Jan 1, 2020
Date Reported
Mar 2, 2023
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
other
Harm Type
privacy
Estimated Cost
$7,800,000
People Affected
800,000
Human Review in Place
No
Litigation Filed
No
Regulatory Body
Federal Trade Commission
Fine Amount
$7,800,000
mental_health, privacy, data_sharing, FTC, advertising, HIPAA, digital_health, BetterHelp, Mozilla

Full Description

In March 2023, Mozilla Foundation's Privacy Not Included research published findings that most popular AI-powered mental health applications were sharing highly sensitive user data with third-party advertisers and data brokers. The investigation examined 32 mental health and prayer apps and found that 29 failed basic privacy and security standards. The research revealed that platforms including BetterHelp, Talkspace, and Youper had integrated Facebook pixels, Google Analytics, and other tracking technologies that automatically transmitted users' intake questionnaire responses, therapy session metadata, and descriptions of personal mental health struggles to advertising networks.

BetterHelp, the largest online therapy platform with over 4 million users, was found to be sharing particularly sensitive data, including users' health questionnaire answers about depression and anxiety, email addresses linked to therapy sessions, and IP addresses tied to mental health consultations. The company had promised users that their data would remain confidential and would not be used for advertising, but Mozilla's technical analysis revealed extensive sharing with Facebook, Snapchat, Criteo, and Pinterest for ad targeting.

The Federal Trade Commission launched an investigation into BetterHelp's practices following the Mozilla report and consumer complaints. In March 2023, the FTC announced a $7.8 million settlement with BetterHelp for sharing consumers' sensitive personal health information with third parties for advertising purposes. The settlement required BetterHelp to limit its data sharing, obtain explicit consent before sharing health information, and notify affected users of the privacy violations. The FTC found that from 2017 to 2020, BetterHelp had pushed sensitive data about users' mental health struggles to Facebook and other platforms despite promising users their data would remain private.
The Mozilla research also identified significant security vulnerabilities in these applications, including weak password requirements, lack of encryption for sensitive data, and failure to delete user data upon request. Many apps were found to be collecting unnecessary personal information including precise location data, contact lists, and device identifiers that were then shared with advertising networks. The investigation highlighted how AI-powered mental health tools, which often marketed themselves as private and secure alternatives to traditional therapy, were actually harvesting user data at unprecedented scale. The incident raised broader questions about the regulation of digital health tools and the adequacy of HIPAA protections for app-based mental health services. Unlike traditional healthcare providers, many mental health apps operate outside HIPAA's scope, leaving users vulnerable to data exploitation. The FTC settlement with BetterHelp represented one of the largest privacy enforcement actions in the digital health space and established important precedent for holding mental health platforms accountable for data sharing practices.

Root Cause

Mental health platforms integrated advertising tracking pixels and SDKs that automatically transmitted sensitive user data to third-party platforms. Companies failed to implement proper data minimization and consent mechanisms for health-related information.
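To illustrate the failure mode described above, here is a minimal, hypothetical sketch (field names and event shape are illustrative, not taken from any real SDK) of how a naive analytics helper can silently forward an entire intake record, health answers included, to a third-party tracker:

```python
# Hypothetical sketch of the root cause: an analytics helper that copies
# every intake field into the tracking event, so sensitive health answers
# ride along to the ad network. All names here are illustrative.

def build_tracking_payload(intake: dict) -> dict:
    """Naively copies the whole intake record into the analytics event."""
    return {
        "event": "intake_completed",
        # The bug: no filtering, so health data travels with the event.
        "properties": dict(intake),
    }

intake = {
    "email": "user@example.com",
    "depression_screen_score": 17,  # sensitive health answer
    "on_medication": True,          # sensitive health answer
}

payload = build_tracking_payload(intake)
```

Because the helper never distinguishes marketing-safe fields from health data, the leak happens without any engineer deliberately deciding to share it.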

Mitigation Analysis

Data minimization policies preventing sharing of health data, explicit opt-in consent for any third-party data sharing, regular privacy audits of tracking pixels and SDKs, and segregation of therapeutic data from marketing systems could have prevented this breach. Technical controls like differential privacy and on-device processing should be standard for mental health applications.

Lessons Learned

The incident demonstrates the need for stronger privacy regulations specific to digital health tools, particularly those operating outside traditional HIPAA protections. It highlights the importance of technical privacy audits for health applications and the risks of integrating advertising technologies with sensitive health data collection.

Sources

Mental Health and Prayer Apps Are Exceptionally Creepy
Mozilla Foundation · Mar 2, 2023 · research report
FTC to Ban BetterHelp from Revealing Consumers' Mental Health Data to Facebook and Others
Federal Trade Commission · Mar 2, 2023 · regulatory action