ShotSpotter AI Gunshot Detection System Generated False Alerts Leading to Wrongful Raids and Arrests in Chicago
Severity
High
Chicago's ShotSpotter AI gunshot detection system generated thousands of false alerts from 2017 to 2021, leading to unnecessary police raids and wrongful arrests. A MacArthur Justice Center study found that 89% of alerts turned up no evidence of a gun-related crime.
Category
Safety Failure
Industry
Government
Status
Resolved
Date Occurred
Jan 1, 2017
Date Reported
May 1, 2021
Jurisdiction
US
AI Provider
Other/Unknown
Model
ShotSpotter
Application Type
other
Harm Type
legal
Estimated Cost
$33,000,000
People Affected
40,000
Human Review in Place
Yes
Litigation Filed
Yes
Litigation Status
settled
Tags
law_enforcement, acoustic_detection, false_positives, civil_rights, chicago, gunshot_detection, police_technology, algorithmic_bias
Full Description
From 2017 to 2021, Chicago deployed ShotSpotter's AI-powered acoustic gunshot detection system across 117 square miles of the city, predominantly in Black and Latino neighborhoods. The system used an array of acoustic sensors and machine learning algorithms to detect and triangulate potential gunshots, automatically alerting police dispatchers within seconds. Chicago paid ShotSpotter approximately $33 million over this period for the technology and services.
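The triangulation described above generally works by comparing when the same impulsive sound reaches different sensors. The sketch below illustrates that time-difference-of-arrival idea with a toy grid search; the sensor layout, the constant speed of sound, and the search method are illustrative assumptions, not ShotSpotter's actual algorithm.

```python
# Minimal time-difference-of-arrival (TDOA) localization sketch.
# Sensor positions, speed of sound, and grid search are assumptions for illustration.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, assumed constant

def locate_source(sensor_xy, arrival_times, search_extent=500.0, step=5.0):
    """Grid-search the point whose predicted arrival-time differences best
    match the observed ones (least-squares residual)."""
    sensor_xy = np.asarray(sensor_xy, dtype=float)
    arrival_times = np.asarray(arrival_times, dtype=float)
    xs = np.arange(-search_extent, search_extent, step)
    ys = np.arange(-search_extent, search_extent, step)
    best_point, best_err = None, np.inf
    for x in xs:
        for y in ys:
            dists = np.hypot(sensor_xy[:, 0] - x, sensor_xy[:, 1] - y)
            predicted = dists / SPEED_OF_SOUND
            # Compare differences relative to the first sensor so the unknown
            # emission time cancels out.
            resid = (predicted - predicted[0]) - (arrival_times - arrival_times[0])
            err = float(np.sum(resid ** 2))
            if err < best_err:
                best_err, best_point = err, (x, y)
    return best_point

# Hypothetical example: four sensors at block corners, a source at (120, -80) metres.
sensors = [(0, 0), (400, 0), (0, 400), (400, 400)]
true_source = np.array([120.0, -80.0])
times = [np.hypot(*(true_source - np.array(s))) / SPEED_OF_SOUND for s in sensors]
print(locate_source(sensors, times))  # ~ (120.0, -80.0), within grid resolution
```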
The MacArthur Justice Center's comprehensive 2021 analysis of Chicago Police Department data revealed severe performance issues with the system. Of the more than 40,000 ShotSpotter alerts generated over 21 months, 89% resulted in police responses that found no gun-related crime evidence. The study found that only 1% of alerts led to evidence that a gun crime had been committed. The false alerts were caused by the AI system misidentifying sounds such as fireworks, construction equipment, car backfires, and even helicopters as gunshots.
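For scale, the study's headline figures translate directly into a volume of dead-end deployments. The back-of-the-envelope arithmetic below uses only the numbers cited above; the per-day breakdown is an illustrative derivation, not a figure from the study.

```python
# Back-of-the-envelope arithmetic using the figures cited above.
total_alerts = 40_000        # alerts over roughly 21 months
no_evidence_rate = 0.89      # share of alerts with no gun-crime evidence found
months = 21

dead_end_responses = total_alerts * no_evidence_rate
print(f"Dead-end police responses: {dead_end_responses:,.0f}")          # ~35,600
print(f"Per day (~{months * 30} days): {dead_end_responses / (months * 30):.0f}")  # ~56
```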
The most egregious case involved Michael Williams, a 65-year-old Black grandfather charged with murder based primarily on ShotSpotter evidence. Williams was arrested after the system classified sounds detected near his location as gunshots, despite the absence of other corroborating evidence. He spent nearly a year in jail before the charges were dropped once it became clear the ShotSpotter evidence was unreliable. The case highlighted how the AI system's false positives could lead to serious criminal charges against innocent people.
The system's failures disproportionately impacted communities of color, where the sensors were primarily deployed. Residents reported increased police harassment and unnecessary confrontations stemming from false ShotSpotter alerts. Civil rights advocates argued that the technology violated Fourth Amendment protections against unreasonable searches and seizures, as police were conducting investigatory stops based on algorithmically generated alerts with extremely high false positive rates.
Following mounting criticism from community groups, civil rights organizations, and academic researchers, Chicago Mayor Lori Lightfoot announced in May 2021 that the city would not renew its ShotSpotter contract. The decision came after the MacArthur Justice Center study, combined with reporting from the Chicago Sun-Times and WBEZ, demonstrated the system's poor performance and potential for constitutional violations. The contract officially expired in February 2024, ending Chicago's controversial use of AI-powered gunshot detection technology.
Root Cause
ShotSpotter's AI acoustic detection system had high false positive rates, misidentifying sounds like fireworks, car backfires, and construction noise as gunshots. The system's algorithms were poorly calibrated for urban environments and lacked sufficient training data to distinguish gunshots from similar acoustic signatures.
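One way a field false-positive rate this high can arise, even from a classifier that looks reasonably accurate in isolation, is the low base rate of real gunfire among the impulsive urban sounds the sensors pick up. The sketch below works through that arithmetic with assumed numbers; none of the rates are taken from ShotSpotter or the MacArthur study.

```python
# Illustrative base-rate arithmetic (assumed numbers, not study figures): a
# classifier can be fairly accurate per sound yet produce mostly-false alerts
# when true gunshots are rare among the impulses it evaluates.
def alert_precision(prevalence, sensitivity, false_positive_rate):
    """Share of alerts that correspond to real gunfire (positive predictive value)."""
    true_alerts = prevalence * sensitivity
    false_alerts = (1 - prevalence) * false_positive_rate
    return true_alerts / (true_alerts + false_alerts)

# Assume 1 in 50 loud impulses is gunfire, 95% detection rate, 10% false-positive rate.
print(f"{alert_precision(prevalence=0.02, sensitivity=0.95, false_positive_rate=0.10):.0%}")
# ~16% of alerts would reflect real gunfire under these assumed rates.
```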
Mitigation Analysis
While ShotSpotter employed human analysts to review audio before sending alerts to police, this review process was insufficient and often rushed. Better mitigation would have required more rigorous acoustic validation, independent auditing of alert accuracy, requiring corroborating evidence before police deployment, and transparency in algorithm performance metrics by neighborhood demographics.
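A minimal sketch of the corroboration requirement suggested above: an alert alone would not trigger a dispatch unless an independent signal, such as a matching 911 call or an analyst confirmation, agrees. The field names and threshold are hypothetical and do not reflect any existing ShotSpotter or CPD interface.

```python
# Hypothetical dispatch gate requiring corroboration before police are sent.
from dataclasses import dataclass

@dataclass
class Alert:
    confidence: float          # model's own score for the audio event
    analyst_confirmed: bool    # human reviewer agreed it sounds like gunfire
    matching_911_call: bool    # independent report from the same area and time

def should_dispatch(alert: Alert, min_confidence: float = 0.9) -> bool:
    """Dispatch only on a high-confidence alert plus at least one corroborating signal."""
    corroborated = alert.analyst_confirmed or alert.matching_911_call
    return alert.confidence >= min_confidence and corroborated

print(should_dispatch(Alert(0.95, analyst_confirmed=False, matching_911_call=False)))  # False
print(should_dispatch(Alert(0.95, analyst_confirmed=True, matching_911_call=False)))   # True
```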
Litigation Outcome
Multiple civil rights lawsuits were filed, including wrongful-arrest cases; some resulted in settlements and dropped charges.
Lessons Learned
The Chicago ShotSpotter deployment demonstrates the risks of deploying AI systems in law enforcement without adequate accuracy standards, independent oversight, or consideration of civil rights implications. High false positive rates in AI detection systems can create cascading harms when integrated into police operations.
Sources
Noise and Bias: The Challenge of ShotSpotter
MacArthur Justice Center · May 1, 2021 · academic paper
Lightfoot administration won't renew ShotSpotter contract
Chicago Sun-Times · May 16, 2021 · news
Chicago ends ShotSpotter contract amid questions about effectiveness
WBEZ Chicago · Feb 16, 2024 · news