ShotSpotter AI Gunshot Detection System Led to Wrongful Police Raids and Community Harm
ShotSpotter's AI gunshot detection system exhibited false positive rates of 86-95%, leading to wrongful police raids and discriminatory enforcement in predominantly Black neighborhoods across multiple US cities.
Severity
High
Category
Bias
Industry
Government
Status
Ongoing
Date Occurred
Jan 1, 2019
Date Reported
May 1, 2021
Jurisdiction
US
AI Provider
Other/Unknown
Model
ShotSpotter FLEX
Application Type
Embedded
Harm Type
Legal
Estimated Cost
$15,000,000
People Affected
50,000
Human Review in Place
Yes
Litigation Filed
Yes
Litigation Status
Pending
Tags
law_enforcement · acoustic_detection · false_positives · racial_bias · municipal_contracts · civil_rights
Full Description
ShotSpotter, an acoustic gunshot detection system deployed in over 100 US cities, uses AI algorithms to analyze audio captured by networks of microphones to identify potential gunshots and alert police. The system was marketed as providing rapid response capabilities to gun violence, with contracts worth millions of dollars in cities like Chicago, New York, and San Francisco.
Investigative reporting and academic studies beginning in 2021 revealed severe accuracy problems with the ShotSpotter system. The MacArthur Justice Center found that in Chicago, 89% of ShotSpotter alerts in 2019-2020 resulted in police finding no evidence of gunfire. A separate study by the City of Chicago Inspector General found that only 9% of ShotSpotter alerts led to evidence of a gun crime. The system frequently misidentified fireworks, car backfires, construction sounds, and other loud noises as gunshots.
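To make concrete what those percentages mean as an error metric, here is a minimal sketch in Python of turning alert-outcome logs into a corroborated-versus-unfounded rate. The function and the alert counts are illustrative assumptions, loosely shaped like the Chicago findings; the actual dispatch data are not reproduced here.

```python
def alert_precision(total_alerts: int, alerts_with_evidence: int) -> dict:
    """Summarize how often alerts were corroborated by evidence on scene.

    total_alerts: number of system-generated alerts dispatched to police
    alerts_with_evidence: alerts where responding officers found evidence
                          of gunfire (casings, victims, a gun-crime report)
    """
    corroborated_rate = alerts_with_evidence / total_alerts
    unfounded_rate = 1.0 - corroborated_rate  # share of alerts with no evidence found
    return {
        "corroborated_rate": corroborated_rate,
        "unfounded_rate": unfounded_rate,
    }

# Hypothetical counts for illustration: if roughly 9% of alerts lead to
# evidence of a gun crime, roughly 91% are unfounded.
example = alert_precision(total_alerts=10_000, alerts_with_evidence=900)
print(f"Corroborated: {example['corroborated_rate']:.0%}, "
      f"unfounded: {example['unfounded_rate']:.0%}")
```

Note that "no evidence found on scene" is a proxy for a false positive rather than proof of one, which is part of why the published figures vary across studies.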
The deployment pattern of ShotSpotter sensors disproportionately covered predominantly Black and Latino neighborhoods, leading to discriminatory policing impacts. These false alerts triggered aggressive police responses including SWAT raids, with officers arriving expecting active shooters. In several documented cases, innocent residents were subjected to forced entries, property damage, and traumatic encounters with heavily armed police responding to non-existent gunfire.
Chicago terminated its $33 million ShotSpotter contract in 2024 following years of criticism and community advocacy. Other cities, including San Antonio and Charlotte, also ended their contracts over similar accuracy and bias concerns. Multiple civil rights lawsuits have been filed alleging that ShotSpotter enabled discriminatory policing practices that violated the constitutional rights of residents in targeted neighborhoods.
The incident highlighted broader issues with deploying AI systems in law enforcement without adequate testing for bias and accuracy. ShotSpotter's proprietary algorithms were not subject to independent auditing, and the company reportedly modified audio classifications after police found no evidence of gunfire, raising questions about data integrity and feedback loops that could mask poor performance.
Root Cause
ShotSpotter's acoustic detection algorithms suffered from high false positive rates, with studies finding that 86-95% of alerts yielded no evidence of actual gunfire. The system's audio classification models struggled to distinguish gunshots from similar sounds such as fireworks, car backfires, and construction noise.
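This failure mode is exactly what evaluation on deliberately confusable sound classes is meant to surface. The following is a hedged sketch, not ShotSpotter's actual pipeline: any acoustic classifier exposing a predict(clip) method could be scored this way, and the class labels and data format are illustrative assumptions.

```python
from collections import Counter

# Placeholder class labels for a labeled evaluation set that deliberately
# includes sounds commonly confused with gunfire.
CONFUSABLE_CLASSES = ["gunshot", "fireworks", "car_backfire", "construction"]

def per_class_false_alarm_rate(model, labeled_clips):
    """labeled_clips: iterable of (audio_clip, true_label) pairs.

    Returns, for each non-gunshot class, the fraction of clips the model
    misclassified as 'gunshot' -- the per-class false alarm rate.
    """
    totals, false_alarms = Counter(), Counter()
    for clip, true_label in labeled_clips:
        totals[true_label] += 1
        if true_label != "gunshot" and model.predict(clip) == "gunshot":
            false_alarms[true_label] += 1
    return {
        label: false_alarms[label] / totals[label]
        for label in totals
        if label != "gunshot" and totals[label] > 0
    }
```

Reporting false alarm rates per confusable class, rather than a single aggregate accuracy number, makes it visible when fireworks or backfires dominate the errors.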
Mitigation Analysis
Better algorithmic testing across diverse acoustic environments could have revealed the high false positive rates. Stronger human verification protocols requiring multiple confirmatory signals before police deployment would have reduced harmful responses. Regular bias auditing of deployment patterns and outcomes could have identified discriminatory impacts on minority communities earlier.
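One way to operationalize the bias-auditing suggestion is to compare alert volume and unfounded-alert rates across neighborhoods alongside their demographics. A minimal sketch follows, assuming per-alert logs and a neighborhood demographics table are available; the column names are assumptions made for illustration.

```python
import pandas as pd

def disparity_report(alerts: pd.DataFrame, demographics: pd.DataFrame) -> pd.DataFrame:
    """Compare alert volume and unfounded-alert rates across neighborhoods.

    alerts: one row per alert with columns
        ['neighborhood', 'evidence_found'] (evidence_found is boolean).
    demographics: one row per neighborhood with columns
        ['neighborhood', 'pct_black_or_latino', 'population'].
    """
    per_hood = (
        alerts.groupby("neighborhood")
        .agg(alert_count=("evidence_found", "size"),
             unfounded_rate=("evidence_found", lambda s: 1 - s.mean()))
        .reset_index()
    )
    report = per_hood.merge(demographics, on="neighborhood")
    # Alerts per 1,000 residents shows where enforcement burden concentrates.
    report["alerts_per_1k"] = 1000 * report["alert_count"] / report["population"]
    return report.sort_values("alerts_per_1k", ascending=False)
```

Running a report like this on a recurring schedule, and correlating alerts_per_1k and unfounded_rate with neighborhood demographics, is the kind of routine check that could have flagged disparate impact before it accumulated.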
Litigation Outcome
Multiple civil rights lawsuits have been filed against Chicago and other cities alleging discriminatory policing based on flawed AI detections; the cases remain pending.
Lessons Learned
This case demonstrates the critical importance of independent validation of AI systems used in law enforcement, particularly regarding accuracy metrics and potential for disparate impact on minority communities. Proprietary algorithms deployed in public safety require transparency and oversight mechanisms to prevent discriminatory outcomes.
Sources
Chicago ends ShotSpotter contract amid criticism of gunshot detection system
Chicago Tribune · Feb 21, 2024 · news
Lessons from Chicago's Experience with ShotSpotter
MacArthur Justice Center · May 11, 2021 · academic paper