AI Proctoring Software Privacy Violations and Student Surveillance Lawsuits
Severity
High
AI proctoring software used during COVID-19 remote learning recorded students in bedrooms and private spaces, leading to privacy lawsuits and many universities discontinuing the technology.
Category
Privacy Leak
Industry
Education
Status
Resolved
Date Occurred
Mar 1, 2020
Date Reported
Sep 15, 2020
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
other
Harm Type
privacy
Estimated Cost
$5,000,000
People Affected
500,000
Human Review in Place
Yes
Litigation Filed
Yes
Litigation Status
settled
Tags
proctoring, student_privacy, biometric_data, remote_learning, covid19, surveillance, class_action, BIPA
Full Description
During the COVID-19 pandemic shift to remote learning in March 2020, universities rapidly deployed AI-powered exam proctoring software including Proctorio, ExamSoft, Respondus Monitor, and HonorLock to maintain exam integrity. These systems required students to install software that captured continuous video and audio feeds of their testing environment, monitored eye movements and keyboard activity, and in some cases recorded entire rooms including bedrooms where students lived.
The proctoring software employed AI algorithms to flag 'suspicious' behavior such as looking away from the screen, background noise, or movement detected in the room. Flagged incidents were then reviewed by human monitors employed by the proctoring companies, who could view recordings of students in their most private spaces. Students reported feeling violated as recordings captured family members, roommates, and personal living conditions. The software also exhibited bias issues, flagging students with disabilities, students in noisy environments, and students of color at higher rates.
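To make the flag-then-review pipeline described above concrete, the following is a minimal, hypothetical sketch of how threshold-based rules of this kind route events to human reviewers. None of the thresholds, field names, or functions come from any vendor's actual product; they only illustrate why simple per-event rules over-flag students in noisy or shared housing and students whose gaze or motor behavior differs from the assumed norm.

```python
# Hypothetical illustration of a rule-based "suspicious behavior" flagging pipeline.
# Thresholds and names are invented for this sketch, not taken from any vendor.

from dataclasses import dataclass
from typing import List


@dataclass
class ExamEvent:
    timestamp: float          # seconds into the exam
    gaze_offscreen: bool      # eye-tracking model says the student looked away
    audio_level_db: float     # ambient microphone loudness
    motion_score: float       # 0..1 motion detected in the video frame


@dataclass
class Flag:
    timestamp: float
    reason: str


def flag_events(events: List[ExamEvent],
                audio_threshold_db: float = 55.0,
                motion_threshold: float = 0.4) -> List[Flag]:
    """Naive per-event rules of the sort described in the incident report.

    Because each rule fires on a fixed threshold, students in noisy
    environments or with conditions affecting gaze and motor control
    accumulate flags at a much higher rate.
    """
    flags: List[Flag] = []
    for e in events:
        if e.gaze_offscreen:
            flags.append(Flag(e.timestamp, "looked away from screen"))
        if e.audio_level_db > audio_threshold_db:
            flags.append(Flag(e.timestamp, "background noise"))
        if e.motion_score > motion_threshold:
            flags.append(Flag(e.timestamp, "movement detected in room"))
    return flags


def route_to_human_review(flags: List[Flag]) -> None:
    # In the deployed systems, each flag directed a third-party reviewer to the
    # recorded video and audio around that timestamp -- i.e., footage of the
    # student's private space.
    for f in flags:
        print(f"review clip at t={f.timestamp:.0f}s: {f.reason}")
```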
By September 2020, privacy advocates and student groups began filing class action lawsuits against proctoring companies and universities. The lawsuit against Proctorio alleged violations of the Illinois Biometric Information Privacy Act (BIPA) and federal privacy laws, claiming the company collected facial recognition data and other biometric information without proper consent. Similar suits were filed against other proctoring vendors across multiple states.
Universities began facing significant student and faculty backlash over the invasive nature of the technology. Students documented cases where they were required to show their entire bedroom to cameras, remove religious head coverings, or submit to room scans that captured personal items and living conditions. Disability rights advocates highlighted how the software discriminated against students with conditions affecting eye movement, attention, or motor skills.
The incident reached a tipping point when several high-profile institutions, including Georgetown Law School and the University of Illinois, announced they would discontinue or significantly limit their use of AI proctoring software. Class action settlements were reached with multiple proctoring companies, though specific terms were often sealed. The controversy sparked broader discussions about digital privacy rights in education and appropriate boundaries for AI surveillance in academic settings.
Root Cause
AI proctoring software required continuous video and audio monitoring of students' private spaces during remote exams, with recordings stored and reviewed by third-party human monitors without adequate privacy safeguards or student consent mechanisms.
Mitigation Analysis
Privacy-preserving exam integrity could be implemented through local device monitoring without cloud storage, encrypted recordings with automatic deletion, student consent mechanisms for specific monitoring features, and alternative assessment methods that reduce reliance on invasive surveillance. Differential privacy techniques and on-device AI could maintain exam security while protecting student privacy.
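As a rough illustration of the on-device approach mentioned above, here is a minimal sketch assuming a hypothetical design in which analysis runs locally, each monitoring feature requires explicit per-feature consent, raw recordings never leave the device, and any locally stored evidence is purged automatically after a short retention window. Class names, retention periods, and thresholds are illustrative assumptions, not a description of any existing product.

```python
# A minimal sketch of a privacy-preserving, on-device proctoring design.
# All names, thresholds, and retention periods are hypothetical.

import os
import time
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ConsentRecord:
    # Explicit per-feature opt-in, collected before the exam starts.
    features: Dict[str, bool] = field(default_factory=dict)

    def allows(self, feature: str) -> bool:
        return self.features.get(feature, False)


@dataclass
class LocalFlag:
    timestamp: float
    reason: str  # only this metadata would ever be shared with the institution


class OnDeviceProctor:
    RETENTION_SECONDS = 24 * 3600  # local evidence auto-deleted after 24 hours

    def __init__(self, consent: ConsentRecord, storage_dir: str):
        self.consent = consent
        self.storage_dir = storage_dir
        self.flags: List[LocalFlag] = []

    def observe(self, timestamp: float, gaze_offscreen: bool, motion_score: float) -> None:
        # Analysis stays on the student's machine; no video or audio is uploaded.
        if self.consent.allows("gaze_tracking") and gaze_offscreen:
            self.flags.append(LocalFlag(timestamp, "gaze off-screen"))
        if self.consent.allows("motion_detection") and motion_score > 0.5:
            self.flags.append(LocalFlag(timestamp, "movement in frame"))

    def export_summary(self) -> List[dict]:
        # Only timestamps and reasons leave the device, never recordings.
        return [{"t": f.timestamp, "reason": f.reason} for f in self.flags]

    def purge_expired_evidence(self) -> None:
        # Locally stored (encrypted) clips past the retention window are removed.
        now = time.time()
        for name in os.listdir(self.storage_dir):
            path = os.path.join(self.storage_dir, name)
            if now - os.path.getmtime(path) > self.RETENTION_SECONDS:
                os.remove(path)
```

The design choice illustrated here is that the institution receives only a summary of flags, while the footage that caused the privacy harms in this incident stays on the student's own device and expires automatically.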
Litigation Outcome
Multiple class action settlements reached with proctoring companies, with some universities discontinuing use of invasive proctoring software
Lessons Learned
The incident demonstrates how emergency technology adoption without privacy impact assessments can lead to significant violations of student rights. It highlights the need for educational institutions to balance academic integrity with privacy protection, and the importance of involving stakeholders in technology decisions that affect personal privacy.
Sources
Students revolt against exam-monitoring software that they say 'promotes ableism and violates privacy'
The Washington Post · Nov 1, 2020 · news
Students Are Pushing Back Against Proctoring Surveillance Apps
Electronic Frontier Foundation · Sep 25, 2020 · blog
Students Sue Proctorio After Online Proctoring Experience
Inside Higher Ed · Feb 11, 2021 · news