Amazon Rekognition Facial Recognition System Sold to Police Despite Known Racial Bias
Critical
Amazon sold its Rekognition facial recognition system to police departments from 2016 to 2020 despite documented racial bias that caused higher error rates for people of color. The company implemented a moratorium in 2020 following protests and employee pressure.
Category
Bias
Industry
Government
Status
Resolved
Date Occurred
Jan 1, 2016
Date Reported
Jul 26, 2018
Jurisdiction
US
AI Provider
Other/Unknown
Model
Amazon Rekognition
Application Type
API integration
Harm Type
legal
People Affected
50,000
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
pending
facial_recognition · racial_bias · law_enforcement · Amazon · civil_rights · algorithmic_fairness · police_technology
Full Description
Amazon Web Services began marketing its Rekognition facial recognition service to law enforcement agencies in 2016, positioning it as a cost-effective solution for identifying suspects and monitoring public spaces. The service was adopted by multiple police departments including Orlando Police Department and Washington County Sheriff's Office in Oregon, with Amazon actively promoting its law enforcement applications at conferences and through direct sales efforts.
In July 2018, the American Civil Liberties Union ran a test that exposed significant racial bias in Amazon Rekognition. Using the system's default 80 percent confidence threshold, the ACLU compared photos of all 535 members of Congress against a database of 25,000 publicly available arrest photos. The system incorrectly matched 28 members of Congress to criminal mugshots, and 39% of those false positives were people of color, even though people of color made up only about 20% of Congress. Errors were particularly pronounced for darker-skinned women, who were misidentified at much higher rates.
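The scale of the disparity the ACLU reported can be checked with simple arithmetic. The figures below come from the test described above; the derived rates are a straightforward computation, not additional reported data:

```python
# Figures reported by the ACLU's July 2018 test of Amazon Rekognition.
total_members = 535           # members of Congress scanned
false_matches = 28            # incorrectly matched to arrest photos
poc_share_of_errors = 0.39    # ~39% of false matches were people of color
poc_share_of_congress = 0.20  # ~20% of Congress were people of color

# Overall false-match rate across Congress.
overall_error_rate = false_matches / total_members

# How overrepresented people of color were among the errors,
# relative to their share of Congress.
disparity_ratio = poc_share_of_errors / poc_share_of_congress

print(f"overall false-match rate: {overall_error_rate:.1%}")  # 5.2%
print(f"error overrepresentation: {disparity_ratio:.2f}x")    # 1.95x
```

In other words, a person of color in Congress was roughly twice as likely to appear among the false matches as their share of the population scanned would predict.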
Despite this public demonstration of bias and subsequent criticism from civil rights groups, Amazon continued to defend and sell Rekognition to law enforcement. Company executives argued that the ACLU's test used inappropriate confidence thresholds and that the technology was accurate when properly configured. However, internal Amazon documents later revealed that the company was aware of accuracy issues, particularly regarding racial bias, before the ACLU study. Amazon employees began organizing internally, with over 450 employees signing letters calling for the company to stop selling facial recognition to law enforcement.
The controversy intensified in 2020 following nationwide protests over police brutality and racial injustice after George Floyd's death. Amazon faced mounting pressure from employees, investors, and civil rights organizations. On June 10, 2020, Amazon announced a one-year moratorium on police use of Rekognition, stating it hoped Congress would implement appropriate rules for facial recognition technology. The company extended this moratorium indefinitely in May 2021, though it continues to allow use by organizations working to find missing children and combat human trafficking. The incident highlighted broader concerns about algorithmic bias in law enforcement technology and contributed to legislative efforts to regulate facial recognition systems.
Root Cause
Amazon's Rekognition facial recognition system demonstrated significantly higher error rates for darker-skinned individuals and women, with the system incorrectly matching 28 members of Congress to criminal mugshots, disproportionately affecting people of color. Despite internal and external documentation of these biases, Amazon continued marketing and selling the technology to law enforcement agencies.
Mitigation Analysis
Comprehensive bias testing across demographic groups before deployment could have identified the disparate error rates. Independent algorithmic auditing, mandatory human review of all matches before law enforcement action, and diverse training datasets could have reduced harm. Establishing ethical review boards and implementing confidence thresholds that account for demographic bias would have prevented many false positives.
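One mitigation named above, a stricter similarity threshold combined with mandatory human review, can be sketched as a simple filter. Amazon publicly recommended a 99% similarity threshold for law-enforcement use, while the ACLU test used the 80% default. The `Match` type and `actionable_matches` function below are illustrative constructs, not Rekognition's actual API:

```python
from dataclasses import dataclass

# Threshold Amazon recommended for law-enforcement use of Rekognition;
# the service's default, used in the ACLU test, was 80.
LAW_ENFORCEMENT_THRESHOLD = 99.0

@dataclass
class Match:
    """A hypothetical candidate match from a face-comparison service."""
    subject_id: str
    similarity: float  # similarity score, in percent

def actionable_matches(matches, threshold=LAW_ENFORCEMENT_THRESHOLD):
    """Drop matches below the threshold. Anything that survives should
    still require human review before any law-enforcement action."""
    return [m for m in matches if m.similarity >= threshold]

matches = [Match("a", 81.2), Match("b", 99.4), Match("c", 92.7)]
print([m.subject_id for m in actionable_matches(matches)])  # ['b']
```

At the 80% default, all three candidates above would have been surfaced as matches; at 99%, only one survives. Thresholding alone does not remove demographic bias from the underlying model, which is why the analysis above pairs it with human review and independent auditing.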
Litigation Outcome
Multiple lawsuits filed regarding wrongful arrests based on facial recognition technology
Lessons Learned
The incident demonstrates how algorithmic bias can perpetuate and amplify existing inequalities in criminal justice systems. It underscores the critical need for comprehensive bias testing before deploying AI systems in high-stakes applications and the importance of ongoing monitoring and accountability measures.
Sources
Amazon's Face Recognition Falsely Matched 28 Members of Congress With Mugshots
American Civil Liberties Union · Jul 26, 2018 · company statement
We are implementing a one-year moratorium on police use of Rekognition
Amazon · Jun 10, 2020 · company statement
Amazon extends moratorium on police use of facial recognition software
Reuters · May 18, 2021 · news