
Facial Recognition at London King's Cross Station Operated Without Public Knowledge

High

Facial recognition cameras at London's King's Cross development operated without public knowledge for 18 months, processing millions of people's biometric data in violation of GDPR before being discovered and shut down.

Category
surveillance
Industry
Other
Status
Resolved
Date Occurred
Jan 1, 2018
Date Reported
Aug 11, 2019
Jurisdiction
UK
AI Provider
Other/Unknown
Application Type
other
Harm Type
privacy
People Affected
30,000,000
Human Review in Place
Unknown
Litigation Filed
No
Regulatory Body
Information Commissioner's Office (ICO)
Tags
facial_recognition, gdpr, privacy, surveillance, biometric, public_space, consent, transparency

Full Description

In August 2019, it was revealed that facial recognition technology had been operating covertly at the King's Cross development in London, one of the UK's busiest transport hubs, for approximately 18 months. The system was deployed by developer Argent LLP across the 67-acre mixed-use development without any public notification or signage indicating that biometric surveillance was taking place. An estimated 30 million people pass through the King's Cross area annually, including the major railway station and surrounding commercial spaces.

The deployment came to light through investigative reporting by the Financial Times in August 2019. The revelation sparked immediate public outcry and privacy concerns, as visitors, commuters, workers, and residents had unknowingly had their biometric data captured and processed. The system was capable of identifying and tracking individuals across multiple camera locations throughout the development, creating detailed movement patterns and behavioral profiles.

Following the public disclosure, the Information Commissioner's Office (ICO) launched an immediate investigation into potential violations of the General Data Protection Regulation (GDPR), which had come into effect in May 2018. The investigation focused on whether Argent LLP had a lawful basis for processing biometric data, whether proper impact assessments had been conducted, and whether data subjects' rights had been respected. The regulator also examined the proportionality of using such invasive technology in a public space.

Under mounting pressure from regulators and the public, Argent LLP announced in August 2019 that it would cease using facial recognition technology across the King's Cross development. The company claimed the system had been used for security purposes and to improve visitor experience, but acknowledged that it had failed to properly inform the public about the data processing.
The ICO's investigation concluded that the deployment violated multiple GDPR principles, including lawfulness, transparency, and data minimization. While no formal fine was issued, the case became a landmark example of privacy violations in the deployment of AI surveillance technology and led to increased scrutiny of facial recognition use in public spaces across the UK.

Root Cause

Developer Argent LLP deployed facial recognition technology across the King's Cross development without establishing a lawful basis for processing, notifying the public, or implementing the consent mechanisms required under GDPR. The system operated covertly for approximately 18 months before it was publicly discovered.

Mitigation Analysis

Conducting data protection impact assessments (DPIAs) before deployment, posting mandatory public signage indicating biometric surveillance, implementing explicit opt-in consent mechanisms, and running regular compliance audits could have prevented this violation. Applying data minimization principles, requiring justification for any biometric processing, and enforcing automated deletion schedules would have reduced the scope of harm.
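As an illustration of the automated deletion schedule mentioned above, the following is a minimal sketch, not part of any actual King's Cross system, of retention-based purging of biometric records. All names (`BiometricRecord`, `purge_expired`) and the 30-day retention period are hypothetical assumptions chosen for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical retention limit: biometric records older than this are purged.
RETENTION = timedelta(days=30)

@dataclass
class BiometricRecord:
    subject_id: str
    captured_at: datetime  # must be timezone-aware

def purge_expired(records, now=None):
    """Return only records still inside the retention window.

    Anything older than RETENTION is dropped, i.e. scheduled for
    deletion, so biometric data is not kept indefinitely.
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r.captured_at <= RETENTION]
```

Running such a purge on a schedule (e.g. a daily job) is one common way to operationalize a data minimization policy: the retention period itself would still need to be justified in a DPIA.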

Lessons Learned

The incident highlighted the critical importance of transparency and legal compliance when deploying biometric AI systems in public spaces. It demonstrated that technical capability does not justify deployment without proper legal basis and public consultation, and reinforced that GDPR's consent and transparency requirements apply equally to AI-powered surveillance systems.

Sources