Siri AI Assistant Recorded Private Conversations and Sent to Apple Contractors
Severity
High
Apple's Siri assistant inadvertently recorded private conversations due to false wake-word triggers; contractors regularly heard confidential medical information and intimate moments, leading to a $95M settlement.
Category
Privacy Leak
Industry
Technology
Status
Resolved
Date Occurred
Jul 1, 2019
Date Reported
Jul 26, 2019
Jurisdiction
US
AI Provider
Apple
Model
Siri
Application Type
Agent
Harm Type
Privacy
Estimated Cost
$95,000,000
Human Review in Place
Yes
Litigation Filed
Yes
Litigation Status
Settled
Tags
voice_assistant, privacy_violation, human_review, false_activation, wiretapping, class_action
Full Description
In July 2019, The Guardian published an investigation revealing that Apple contractors were routinely listening to confidential conversations recorded by Siri without users' knowledge or consent. A whistleblower contractor disclosed that they regularly heard private medical information, drug deals, business dealings, and couples having sex through recordings that were supposed to only capture intended Siri interactions.
The privacy breach occurred because Siri's wake-word detection frequently produced false positives, starting recordings when users had not intentionally activated the assistant. These accidental recordings were then sent to human contractors as part of Apple's quality assurance ("grading") program to improve Siri's performance. The contractors, working for GlobeTech in Cork, Ireland, were tasked with grading Siri's responses and determining whether activations were deliberate, and in the process were exposed to highly sensitive personal information.
According to the whistleblower, accidental activations made up a significant portion of the recordings reviewed, with Apple Watch and HomePod devices being particularly prone to false triggers due to their always-listening capability. The contractor reported hearing confidential medical discussions between patients and doctors, illegal drug transactions, business meetings, and intimate conversations between couples. Users were completely unaware that these private moments were being recorded and reviewed by human workers.
Following the Guardian's report, Apple suspended the grading program worldwide and conducted an internal review. The company later announced changes to its practices, including making human review of audio strictly opt-in and no longer retaining audio recordings by default. By then, however, the damage to user privacy had been done: potentially millions of private conversations had been recorded and reviewed over the years the program operated.
Shortly after the report, a class action lawsuit (Lopez v. Apple) was filed against Apple in the Northern District of California, alleging violations of privacy laws and wiretapping statutes. The plaintiffs argued that Apple failed to adequately inform users about the human review program and did not obtain proper consent for recording private conversations. In late 2024, Apple agreed to pay $95 million to settle the lawsuit, with eligible users able to claim up to $20 per Siri-enabled device, for a maximum of five devices.
Root Cause
Siri's wake-word detection produced frequent false activations that captured private conversations, and the resulting recordings were routed to human contractors for quality grading without adequate privacy safeguards or user consent.
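As a rough illustration of how false activations arise, consider a single permissive on-device detector versus a two-stage gate that must agree before any audio is kept. The sketch below is hypothetical Python, not Apple's implementation; all names, scores, and thresholds are assumptions for illustration.

```python
from dataclasses import dataclass

# Hypothetical two-stage wake-word gate. Thresholds and field names are
# illustrative assumptions, not Apple's implementation.

FIRST_STAGE_THRESHOLD = 0.5   # cheap always-on detector, prone to false positives
SECOND_STAGE_THRESHOLD = 0.9  # stricter verifier, run only after the first stage fires

@dataclass
class AudioClip:
    first_stage_score: float   # score from a small on-device keyword model
    second_stage_score: float  # score from a larger verification model

def should_start_session(clip: AudioClip) -> bool:
    """Open a listening session only when both detectors agree.

    Acting on the first-stage score alone is what produces accidental
    recordings; requiring a stricter second pass before any audio is
    stored or uploaded trades a few missed activations for far fewer
    false ones.
    """
    if clip.first_stage_score < FIRST_STAGE_THRESHOLD:
        return False  # detector never fired; drop the audio immediately
    return clip.second_stage_score >= SECOND_STAGE_THRESHOLD
```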
Mitigation Analysis
Stronger wake word detection algorithms could have reduced false activations. Clear user consent mechanisms and opt-out options for human review programs would have addressed privacy concerns. Data minimization practices, such as automatic deletion of accidental recordings and anonymization of review data, could have prevented exposure of sensitive personal information.
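To make these mitigations concrete, here is a minimal sketch of how a grading pipeline could enforce opt-in consent, delete likely-accidental clips, and anonymize what reviewers see. All function and field names are hypothetical, not drawn from Apple's systems.

```python
import uuid
from dataclasses import dataclass
from typing import Optional

CONFIDENCE_FLOOR = 0.8  # below this, treat the activation as accidental

@dataclass
class Recording:
    user_id: str
    device_id: str
    audio: bytes
    activation_confidence: float  # wake-word detector's score for this clip
    user_opted_in: bool           # explicit, revocable consent to human review

def prepare_for_review(rec: Recording) -> Optional[dict]:
    """Apply consent and data-minimization checks before human grading.

    Returns an anonymized review record, or None if the clip should be
    deleted rather than reviewed.
    """
    # 1. Consent: human review is opt-in only.
    if not rec.user_opted_in:
        return None

    # 2. Data minimization: likely-accidental activations are deleted,
    #    never graded, since they are the clips most likely to contain
    #    private conversation rather than an intended request.
    if rec.activation_confidence < CONFIDENCE_FLOOR:
        return None

    # 3. Anonymization: strip user and device identifiers so reviewers
    #    see only a random review ID and the audio itself.
    return {
        "review_id": str(uuid.uuid4()),
        "audio": rec.audio,
        "confidence": rec.activation_confidence,
    }
```

Gating on the detector's own confidence means the clips most likely to contain private conversation, the low-confidence accidental triggers, are exactly the ones that never reach a reviewer.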
Litigation Outcome
$95 million class action settlement reached in late 2024 for users whose private conversations were recorded and reviewed by human contractors
Lessons Learned
This incident highlights the critical need for robust consent mechanisms and privacy safeguards in AI systems that process personal data. False wake word detection in voice assistants poses significant privacy risks when combined with human review processes.
Sources
Apple contractors 'regularly hear confidential details' on Siri recordings
The Guardian · Jul 26, 2019 · news
Apple agrees to pay $95 million to settle Siri privacy lawsuit
Reuters · Jan 2, 2025 · news