
AI Surveillance Cameras in Serbian Schools Monitored Student Behavior Without Proper Consent

High

AI surveillance cameras in Serbian schools monitored student emotions and behavior without proper consent from students or parents. Digital rights groups successfully challenged the practice, leading to removal of the surveillance system.

Category
Privacy Leak
Industry
Education
Status
Resolved
Date Occurred
Sep 1, 2022
Date Reported
Dec 15, 2022
Jurisdiction
Europe
AI Provider
Other/Unknown
Application Type
Embedded
Harm Type
Privacy
People Affected
6,000
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
Settled
Regulatory Body
Serbian Commissioner for Information of Public Importance and Personal Data Protection
surveillance, education, privacy, children, facial_recognition, emotion_detection, GDPR, Serbia, consent

Full Description

In September 2022, the Serbian Ministry of Education implemented an AI-powered surveillance system across multiple schools in the country as part of a pilot program intended to monitor student attendance, behavior, and emotional states. The system used facial recognition technology and emotion detection algorithms to track students throughout school premises, analyzing their movements, interactions, and facial expressions in real-time.

The surveillance program was deployed without obtaining proper informed consent from students or their parents, violating fundamental privacy principles under both Serbian national law and EU data protection regulations that Serbia has committed to align with as part of its EU accession process. The cameras were capable of identifying individual students, tracking their location within schools, and creating detailed behavioral profiles based on their emotional responses and social interactions.

Digital rights organizations, including the Belgrade-based SHARE Foundation and other civil liberties groups, filed formal complaints with Serbian data protection authorities in December 2022. They argued that the surveillance system violated children's rights to privacy and created a harmful environment where students were under constant monitoring. Parents and teachers also raised concerns about the psychological impact on students, noting increased anxiety and self-censorship behaviors.

The Serbian Commissioner for Information of Public Importance and Personal Data Protection launched an investigation into the surveillance program following the complaints. The investigation found significant violations of data protection laws, including lack of a proper legal basis for processing children's personal data, absence of meaningful consent, and failure to conduct required privacy impact assessments. The system also lacked adequate data security measures and transparency about how the collected information would be used, stored, or shared.
Following regulatory pressure and sustained advocacy by digital rights groups, the Serbian Ministry of Education agreed to suspend the surveillance program in early 2023. The AI cameras were removed from participating schools, and collected data was reportedly deleted, though verification of complete data destruction remains unclear. The incident highlighted the need for stronger protections for children's privacy rights in educational technology deployments.

Root Cause

Implementation of AI-powered facial recognition and emotion detection surveillance system in schools without obtaining proper informed consent from students and parents, and without adequate legal basis under GDPR and Serbian data protection laws.

Mitigation Analysis

This incident could have been prevented through proper privacy impact assessments, meaningful consent processes for all stakeholders including minors, and legal review ensuring GDPR compliance. Data minimization principles should have limited collection to only necessary data with clear educational purposes. Independent oversight and transparent policies were also missing.

Litigation Outcome

A legal challenge by digital rights groups led to suspension of the surveillance program and removal of the cameras.

Lessons Learned

This case demonstrates the critical importance of privacy-by-design principles when implementing AI surveillance in educational settings, particularly involving minors. It underscores the need for robust consent mechanisms, regulatory oversight, and meaningful stakeholder engagement before deploying invasive monitoring technologies.

Sources