
Amazon Alexa Contractors Listened to Private User Conversations Without Consent

High

Bloomberg revealed that Amazon employed thousands of contractors worldwide to listen to Alexa recordings from users' homes for speech-recognition training, exposing private conversations without adequate user consent and prompting privacy lawsuits.

Category
Privacy Leak
Industry
Technology
Status
Resolved
Date Occurred
Jan 1, 2019
Date Reported
Apr 10, 2019
Jurisdiction
US
AI Provider
Other/Unknown
Model
Alexa Voice Service
Application Type
agent
Harm Type
privacy
Estimated Cost
$50,000,000
People Affected
1,000,000
Human Review in Place
Yes
Litigation Filed
Yes
Litigation Status
settled
Regulatory Body
Federal Trade Commission
Tags
privacy, voice_assistant, human_review, data_collection, consent, smart_home, surveillance

Full Description

In April 2019, Bloomberg published an investigation revealing that Amazon employed thousands of full-time workers and contractors around the world to listen to voice recordings captured by Echo devices and the Alexa voice assistant. The program, which Amazon described as necessary for improving Alexa's speech recognition and natural language processing capabilities, involved teams in Boston, Costa Rica, India, and Romania who reviewed as many as 1,000 audio clips per nine-hour shift. These contractors heard a wide range of private conversations, including what appeared to be sexual encounters, family arguments, business deals, and other intimate moments that users never intended to share.

The contractors worked in poor conditions, often in windowless buildings, and were required to transcribe and annotate the recordings to help train Amazon's artificial intelligence algorithms. They were instructed to flag content that was "amusing" and shared these clips with colleagues via internal chat systems. Some recordings contained sensitive information including medical conditions, drug deals, and domestic disputes. The workers reported hearing distressing content, including potential criminal activity, but were given no clear guidance on how to handle such situations or whether to report them to authorities.

Amazon's response to the Bloomberg investigation was initially defensive, with the company stating that the human review process was critical to improving the customer experience and that only a small fraction of recordings were reviewed by humans. However, the company failed to adequately disclose this practice to users in clear terms. While Amazon's privacy policy mentioned that recordings might be used to improve services, it did not explicitly state that human contractors would be listening to private conversations in users' homes. The revelation sparked immediate backlash from privacy advocates, lawmakers, and users.
Multiple class-action lawsuits were filed against Amazon, alleging violations of wiretapping laws and privacy rights. The lawsuits claimed that Amazon violated state and federal privacy laws by recording and sharing private conversations without proper consent. Congressional representatives called for investigations, and the incident contributed to broader scrutiny of big tech companies' data collection practices. Amazon eventually settled the lawsuits for undisclosed amounts and implemented changes including clearer opt-out options for human review and improved user controls over voice recordings.

Root Cause

Amazon's Alexa system was designed to record and transmit user conversations to human contractors for speech recognition training without adequate user disclosure or consent mechanisms, and lacked proper anonymization of sensitive recordings.

Mitigation Analysis

Enhanced privacy controls including explicit opt-out mechanisms for human review, better data anonymization protocols, and clearer user consent processes could have prevented this breach. Implementing automatic detection of sensitive content before human review and stricter contractor access controls would have reduced exposure risk.
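As an illustration of the gating step described above, the following Python sketch shows how a review pipeline could check an explicit opt-out registry and screen machine transcripts for likely sensitive content before any clip is queued for human annotation. This is a minimal hypothetical example: the names (`VoiceClip`, `eligible_for_human_review`, `SENSITIVE_PATTERNS`) and the keyword-based filter are invented for this sketch and do not reflect Amazon's actual systems, where a production filter would use a trained classifier rather than regex patterns.

```python
import re
from dataclasses import dataclass

# Hypothetical sketch only: a pre-review gate combining an explicit
# opt-out check with a crude sensitive-content screen. Real systems
# would use a trained classifier, not keyword patterns.
SENSITIVE_PATTERNS = [
    re.compile(p, re.IGNORECASE)
    for p in (r"\bssn\b", r"\bdiagnos", r"\bprescription\b", r"\bpassword\b")
]


@dataclass
class VoiceClip:
    user_id: str
    transcript: str  # machine transcript produced before human review


def user_opted_out(user_id: str, opt_out_registry: set) -> bool:
    """Honor an explicit user opt-out before routing clips for review."""
    return user_id in opt_out_registry


def eligible_for_human_review(clip: VoiceClip, opt_out_registry: set) -> bool:
    """Block clips from opted-out users or with likely sensitive content."""
    if user_opted_out(clip.user_id, opt_out_registry):
        return False
    return not any(p.search(clip.transcript) for p in SENSITIVE_PATTERNS)
```

Under this design, a clip from an opted-out user or one whose transcript trips the sensitive-content screen never reaches a contractor, addressing two of the failure modes above at the cost of some lost training data.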

Litigation Outcome

Amazon settled multiple class-action lawsuits for undisclosed amounts and agreed to implement stronger privacy protections and user consent mechanisms.

Lessons Learned

This incident highlighted the need for transparent disclosure of human involvement in AI training processes and the importance of explicit user consent for sensitive data processing. It demonstrated that companies must balance AI improvement needs with robust privacy protections and user control.

Sources

Alexa has been eavesdropping on you this whole time
Washington Post · May 6, 2019 · news