FTC Fines BetterHelp $7.8M for Sharing Mental Health Data with Advertisers
Severity: High
The FTC fined BetterHelp $7.8 million for sharing sensitive mental health data from over 7 million users with Facebook, Snapchat, and other advertisers for targeted marketing between 2017 and 2020, violating its privacy promises.
Category
Privacy Leak
Industry
Healthcare
Status
Resolved
Date Occurred
Dec 1, 2017
Date Reported
Mar 2, 2023
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
other
Harm Type
privacy
Estimated Cost
$7,800,000
People Affected
7,000,000
Human Review in Place
No
Litigation Filed
No
Regulatory Body
Federal Trade Commission (FTC)
Fine Amount
$7,800,000
Tags
mental_health · data_privacy · HIPAA · advertising · user_consent · health_data · FTC_enforcement
Full Description
BetterHelp, the world's largest online mental health platform, faced significant regulatory action in March 2023 when the Federal Trade Commission imposed a $7.8 million fine for sharing users' sensitive mental health data with major advertising platforms. The company, which provides online therapy and counseling services, violated its own privacy policies by transmitting highly personal information to Facebook, Snapchat, Criteo, and Pinterest for targeted advertising purposes.
The FTC's investigation revealed that between December 2017 and July 2020, BetterHelp collected extensive personal information from users including email addresses, IP addresses, health questionnaire responses indicating conditions like depression and anxiety, and details about therapy sessions. The company then shared this data through tracking pixels and other mechanisms with advertising platforms, often by hashing email addresses along with mental health indicators to create targeted advertising profiles.
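Ad-platform audience matching typically operates on hashed identifiers rather than raw emails. The sketch below illustrates the general mechanism described above; the specific hash algorithm (SHA-256, a common industry choice) and the segment label are assumptions for illustration, not details confirmed by the FTC complaint. The key point it demonstrates: hashing an email does not anonymize it, because the receiving platform can hash its own users' emails and match them.

```python
import hashlib

def hash_identifier(email: str) -> str:
    """Normalize and hash an email the way ad-platform audience-match
    APIs commonly expect: trim whitespace, lowercase, then SHA-256.
    Hashing is NOT anonymization -- the platform can compute the same
    hash for its own users and link the record to a real account."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# A hashed email transmitted alongside a health indicator still ties
# the condition to an identifiable account on the receiving platform.
payload = {
    "hashed_email": hash_identifier(" User@Example.com "),
    "audience": "anxiety_intake",  # hypothetical segment label
}
```

Because the transformation is deterministic, two parties who each hold the same email address will always derive the same hash, which is precisely what makes this useful for ad targeting and risky for privacy.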
The data sharing practices were particularly egregious because BetterHelp had explicitly promised users in its privacy policy that it would not sell or share their personal information with third parties for advertising purposes. Users seeking mental health support had reasonable expectations of privacy protection, especially given the sensitive nature of their therapy sessions and mental health conditions. The FTC found that BetterHelp's actions constituted deceptive practices under Section 5 of the FTC Act.
The settlement agreement, announced on March 2, 2023, required BetterHelp to pay the $7.8 million penalty and implement comprehensive privacy safeguards. The company must obtain explicit opt-in consent before sharing any health information with third parties, delete previously shared data where possible, and implement a comprehensive privacy program with regular audits. Additionally, BetterHelp is prohibited from misrepresenting its data practices and must provide clear disclosures about any data sharing.
The case affected approximately 7 million BetterHelp users who had provided sensitive mental health information with the expectation of privacy protection. Many users were unaware that their therapy-related data was being used for commercial advertising purposes, representing a significant breach of trust in the mental health services sector. The incident highlighted broader concerns about data privacy in digital health platforms and the need for stronger protections for sensitive health information.
The FTC's action against BetterHelp represents one of the largest privacy enforcement cases in the mental health sector and established important precedents for data protection in digital therapy platforms. The settlement also included provisions requiring BetterHelp to notify users about the data sharing that had occurred and provide them with information about how to limit future data collection by the advertising platforms that had received their information.
Root Cause
BetterHelp implemented tracking pixels and data sharing mechanisms that transmitted users' sensitive mental health information, including hashed email addresses paired with depression and anxiety indicators, to advertising platforms despite its privacy promises to users.
Mitigation Analysis
Proper data governance controls including data classification, privacy impact assessments, and consent management systems could have prevented this breach. Technical controls like data loss prevention, encryption of sensitive identifiers, and strict API access controls would have blocked unauthorized data sharing. Regular privacy audits and compliance monitoring would have detected the inappropriate data flows to advertising partners.
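One of the controls named above, consent management, can be enforced at the point where data leaves the system. The following is a minimal sketch of a default-deny consent gate; the field names, the `ConsentRecord` structure, and the sensitive-field list are hypothetical, not BetterHelp's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class ConsentRecord:
    """Per-user consent state. Sharing defaults to OFF so that
    health data never leaves without an explicit opt-in."""
    user_id: str
    opted_in_to_ad_sharing: bool = False

# Hypothetical classification of fields that count as health data.
SENSITIVE_FIELDS = {"diagnosis", "questionnaire_responses", "therapy_notes"}

def filter_outbound_payload(payload: dict, consent: ConsentRecord) -> dict:
    """Strip sensitive health fields from any payload bound for a
    third-party advertising endpoint unless the user has opted in.
    Applying this at the egress boundary blocks inappropriate data
    flows even when upstream marketing code requests the fields."""
    if consent.opted_in_to_ad_sharing:
        return payload
    return {k: v for k, v in payload.items() if k not in SENSITIVE_FIELDS}
```

The design choice worth noting is the default: consent is presumed absent until recorded, which matches the settlement's explicit opt-in requirement, and the filter sits on the outbound path rather than relying on every caller to remember the policy.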
Lessons Learned
This case demonstrates that privacy promises in healthcare applications must be backed by robust technical and operational controls, as regulatory penalties for health data misuse can be severe. Companies handling sensitive health data must implement comprehensive privacy-by-design approaches and ensure marketing practices align with privacy commitments to users.
Sources
FTC Bans BetterHelp from Revealing Consumers' Mental Health Data to Facebook, Snapchat
Federal Trade Commission · Mar 2, 2023 · regulatory action
BetterHelp to Pay $7.8 Million to Settle FTC Charges Over Sharing Health Data
Wall Street Journal · Mar 2, 2023 · news