Macy's Facial Recognition System Falsely Identified Black Customer as Shoplifter

High

Macy's facial recognition system falsely identified a Black customer as a shoplifter, leading to wrongful detention and public humiliation in Houston, highlighting ongoing issues with AI bias in retail security.

Category
Bias
Industry
Retail
Status
Litigation Pending
Date Occurred
Feb 15, 2023
Date Reported
Mar 10, 2023
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
Other
Harm Type
Reputational
People Affected
1
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
Pending
facial_recognition, retail_security, racial_bias, false_positive, civil_rights, wrongful_detention, algorithmic_discrimination

Full Description

In February 2023, a Black man shopping at a Macy's location in Houston, Texas, was wrongfully detained by store security after the retailer's facial recognition system flagged him as a suspected shoplifter. The customer, who had never previously visited that particular store location, was approached by security personnel who accused him of theft based on the algorithmic match. Store employees detained the man, searched his person and belongings, and questioned him publicly in front of other customers, causing significant embarrassment and distress.

The incident is part of a documented pattern of facial recognition technology disproportionately misidentifying Black individuals. Studies by MIT researcher Joy Buolamwini and others have consistently shown that commercial facial recognition systems have significantly higher error rates for people with darker skin tones, particularly Black women. The National Institute of Standards and Technology confirmed in 2019 that most facial recognition systems demonstrate demographic bias, with false positive rates up to 100 times higher for Black individuals than for white men.

Macy's has faced previous criticism for its use of facial recognition technology in stores nationwide. The retailer uses the technology as part of its loss prevention strategy, maintaining databases of individuals suspected of shoplifting or other retail crimes. However, the lack of human oversight and verification procedures has led to multiple incidents of false identification, with Black customers bearing a disproportionate impact due to the technology's inherent bias.

Following the Houston incident, the affected customer filed a lawsuit against Macy's alleging discrimination, false imprisonment, and violation of civil rights. The case joins a growing number of legal challenges to retail facial recognition use, including similar suits against other major retailers. Civil rights organizations, including the ACLU, have documented numerous cases of facial recognition misidentification affecting Black Americans in retail, law enforcement, and other contexts.

The incident has contributed to legislative momentum for facial recognition regulation. Several states and municipalities have passed or are considering restrictions on facial recognition use, particularly in high-stakes applications. The case underscores the need for retail companies to implement rigorous bias testing, human review protocols, and transparency measures when deploying facial recognition systems for security purposes.

Root Cause

The facial recognition system exhibited racial bias, producing a false positive match that incorrectly identified an innocent Black customer as a suspected shoplifter from the store's loss prevention database.
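To make that failure mode measurable, the standard diagnostic is a disaggregated false positive audit: compute the false positive rate separately for each demographic group and compare. The following is a minimal sketch in Python; the function name, the tuple layout, and the group_of mapping are illustrative assumptions for the example, not any vendor's actual tooling.

from collections import defaultdict

def false_positive_rates(results, group_of):
    """Per-group false positive rate for a face matching system.

    results:  iterable of (subject_id, flagged, is_true_match) tuples,
              where flagged is the system's decision and is_true_match
              is ground truth (True only if the subject really is in
              the watchlist database).
    group_of: dict mapping subject_id to a demographic group label.
    """
    false_positives = defaultdict(int)  # true non-matches the system flagged
    non_matches = defaultdict(int)      # all true non-matches seen, per group
    for subject_id, flagged, is_true_match in results:
        if is_true_match:
            continue  # FPR is measured over true non-matches only
        group = group_of[subject_id]
        non_matches[group] += 1
        if flagged:
            false_positives[group] += 1
    return {g: false_positives[g] / non_matches[g] for g in non_matches}

Audited rates that differ by an order of magnitude or more across groups, as NIST documented in 2019, indicate a system that should not drive enforcement decisions on its own.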

Mitigation Analysis

Human review of facial recognition matches before taking enforcement action could have prevented the wrongful detention. Regular bias testing across racial demographics and higher confidence thresholds for matches could reduce false positives. Staff training on facial recognition limitations and proper verification procedures would help ensure that algorithmic results are never acted on without human judgment.
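As an illustration of the first two controls, here is a minimal sketch in Python of gating a match behind a confidence threshold and a mandatory human review queue. The threshold value, class names, and queue interface are hypothetical, not a description of Macy's deployment.

from dataclasses import dataclass, field

MATCH_THRESHOLD = 0.99  # illustrative; a real threshold must come from
                        # validated, per-demographic error rate testing

@dataclass
class MatchCandidate:
    person_id: str      # entry in the loss prevention database
    similarity: float   # model similarity score in [0, 1]

@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)

    def submit(self, candidate: MatchCandidate) -> None:
        self.pending.append(candidate)  # a trained reviewer must confirm

def handle_match(candidate: MatchCandidate, queue: ReviewQueue) -> str:
    """Discard low-confidence hits; route everything else to human review."""
    if candidate.similarity < MATCH_THRESHOLD:
        return "discarded"
    queue.submit(candidate)
    return "queued_for_human_review"

The design point is that no code path returns a detain decision: the algorithm can only discard a hit or escalate it to a person.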

Lessons Learned

Retail facial recognition systems require mandatory bias testing and human verification protocols to prevent discriminatory outcomes. The incident demonstrates how algorithmic bias can cause immediate civil rights violations and legal liability for companies that deploy facial recognition without adequate safeguards.

Sources

How Is Face Recognition Surveillance Technology Racist?
ACLU · Jun 16, 2020 · advocacy
Facial Recognition Is Accurate, if You're a White Guy
New York Times · Feb 9, 2018 · news