
South Wales Police Facial Recognition System Had 92% False Positive Rate at Champions League Final

High

South Wales Police's facial recognition system at the 2017 Champions League final falsely identified 2,470 innocent people, a 92% false positive rate. The Court of Appeal later ruled the force's facial recognition deployments unlawful, finding violations of human rights law.

Category
Safety Failure
Industry
Government
Status
Resolved
Date Occurred
Jun 3, 2017
Date Reported
Jul 5, 2018
Jurisdiction
UK
AI Provider
Other/Unknown
Application Type
embedded
Harm Type
legal
People Affected
2,470
Human Review in Place
Yes
Litigation Filed
Yes
Litigation Status
Judgment for plaintiff
Regulatory Body
Information Commissioner's Office
facial_recognition · false_positives · police_surveillance · civil_liberties · sports_events · mass_surveillance · algorithmic_bias · court_ruling

Full Description

On June 3, 2017, South Wales Police deployed facial recognition technology during the UEFA Champions League final between Real Madrid and Juventus at the National Stadium of Wales in Cardiff. The system was designed to identify persons of interest from police databases among the crowd of football supporters. The deployment failed badly: of approximately 2,680 total alerts, 2,470 were incorrect matches, a 92% false positive rate.

The facial recognition system compared faces captured by CCTV cameras against a database of individuals wanted by police or otherwise considered persons of interest. Although human officers reviewed the alerts, the sheer volume of false positives overwhelmed the verification process. The system's poor performance was attributed to challenging environmental conditions, including crowd density, variable lighting, and suboptimal camera angles, all typical of large public events.

Civil liberties campaigner Ed Bridges, supported by Liberty, challenged South Wales Police's use of facial recognition technology in court. The legal challenge argued that the deployments violated privacy rights, were discriminatory, and lacked a proper legal framework. Evidence presented showed the technology had particular accuracy problems with women and ethnic minorities, raising concerns about algorithmic bias.

In August 2020, the Court of Appeal ruled that South Wales Police's use of facial recognition was unlawful on multiple grounds, finding violations of human rights legislation, data protection law, and the public sector equality duty. The judgment criticized the lack of clear policies governing when and how the technology could be used, inadequate impact assessments, and the failure to properly consider the technology's potential discriminatory effects. The ruling had significant implications beyond Wales, as it was the first successful legal challenge to police use of facial recognition in the UK.
The judgment established important precedents about the need for proper legal frameworks, impact assessments, and safeguards when deploying biometric surveillance technologies in public spaces. South Wales Police subsequently suspended their facial recognition program pending policy reviews. The incident highlighted broader concerns about the deployment of facial recognition at entertainment venues and sporting events. Similar systems had been deployed at various US venues, including Madison Square Garden and some NFL stadiums, often without public disclosure. The Cardiff case became a landmark example of how facial recognition technology's limitations and biases can create mass false accusations when deployed in challenging real-world conditions.
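The headline 92% figure follows directly from the alert counts reported above. A quick arithmetic check (using the description's figures; the total is approximate):

```python
# Figures as reported in this incident description (total is approximate).
TOTAL_ALERTS = 2680        # total matches flagged by the system
FALSE_POSITIVES = 2470     # alerts later confirmed to be incorrect

false_positive_rate = FALSE_POSITIVES / TOTAL_ALERTS
print(f"False positive share of alerts: {false_positive_rate:.0%}")  # → 92%
```

Note this is the false-positive share of alerts raised, not the error rate across everyone scanned; the system screened far more faces than it flagged.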

Root Cause

The facial recognition system suffered from extremely high false positive rates due to poor algorithm accuracy, inadequate training data, and inappropriate deployment in crowded public settings with varying lighting conditions and camera angles.

Mitigation Analysis

While human officers reviewed alerts, the overwhelming volume of false positives (2,470 incorrect matches) made effective human oversight impossible. Stricter accuracy thresholds, demographic bias testing, and controlled deployment environments could have reduced false positives. Independent algorithmic auditing and a clearer legal framework for biometric surveillance might have prevented the unlawful deployment.
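The threshold trade-off mentioned above can be illustrated with a minimal sketch: raising the similarity score required to trigger an alert cuts false matches sharply, at the cost of missing some genuine ones. All scores and distributions below are synthetic and purely illustrative; they are not drawn from the South Wales Police system.

```python
import random

random.seed(0)

# Synthetic similarity scores for illustration only. Real face-matching
# systems emit a confidence score per comparison; we model genuine matches
# as scoring higher on average than the (far more numerous) non-matches.
genuine = [random.gauss(0.80, 0.08) for _ in range(200)]    # persons of interest
impostor = [random.gauss(0.55, 0.10) for _ in range(5000)]  # everyone else

def alert_counts(threshold):
    """Count true and false alerts raised at a given match threshold."""
    true_alerts = sum(score >= threshold for score in genuine)
    false_alerts = sum(score >= threshold for score in impostor)
    return true_alerts, false_alerts

for t in (0.6, 0.7, 0.8):
    tp, fp = alert_counts(t)
    share = fp / (tp + fp) if tp + fp else 0.0
    print(f"threshold={t:.1f}  true alerts={tp}  false alerts={fp}  "
          f"false-positive share={share:.0%}")
```

The pattern, not the specific numbers, is the point: a permissive threshold in a crowd of thousands produces a flood of false alerts that no human review team can keep up with, which is exactly the failure mode described above.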

Litigation Outcome

The Court of Appeal ruled in August 2020 that South Wales Police's use of facial recognition technology was unlawful, violating human rights and data protection laws.

Lessons Learned

The incident demonstrates that facial recognition technology remains unreliable for mass surveillance applications, particularly in challenging environments like crowded public events. It established important legal precedents requiring proper oversight, impact assessments, and legal frameworks before deploying biometric surveillance.

Sources

Major win against dystopian facial recognition
Liberty · Aug 11, 2020 · company statement
R (Bridges) v Chief Constable of South Wales Police
Courts and Tribunals Judiciary · Aug 11, 2020 · court filing