AI Proctoring Software Disproportionately Flagged Black Students as Cheating
High
AI proctoring software from companies like Proctorio flagged Black students as cheating at disproportionate rates due to facial recognition bias. Thousands of students faced false accusations during the rapid expansion of remote learning in 2020.
Category
Bias
Industry
Education
Status
Resolved
Date Occurred
Mar 1, 2020
Date Reported
Aug 1, 2020
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
embedded
Harm Type
reputational
People Affected
10,000
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
settled
facial_recognition, bias, education, proctoring, racial_discrimination, remote_learning, covid19
Full Description
During the COVID-19 pandemic in 2020, educational institutions rapidly adopted AI-powered remote proctoring software to maintain academic integrity during online exams. Companies like Proctorio, ExamSoft, and Respondus Monitor deployed facial recognition and behavioral monitoring systems to detect potential cheating. However, these systems exhibited significant racial bias in their operation.
The proctoring software used computer vision algorithms to track student eye movements, facial expressions, and body positioning during exams. Students flagged by the AI faced academic investigations, grade penalties, and in some cases formal academic dishonesty charges. Reports emerged that Black and dark-skinned students were being flagged at substantially higher rates than their white counterparts, often for the same behaviors.
The technical root cause stemmed from facial recognition algorithms trained predominantly on lighter-skinned faces, a common problem in computer vision known as algorithmic bias. The systems had difficulty detecting facial landmarks, tracking eye movements, and interpreting expressions on darker skin tones. This led to false interpretations of normal behavior as suspicious, such as looking away from the camera or making facial expressions that the AI couldn't properly categorize.
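The failure mode described above can be sketched with toy numbers: if a proctoring system flags a student whenever face-tracking confidence drops below a fixed threshold, a model that is systematically less confident on darker skin tones will flag identical behavior at very different rates. The function name, threshold, and per-frame confidence values below are all hypothetical, not any vendor's actual algorithm.

```python
# Hypothetical threshold: flag a frame if face-tracking confidence falls below it.
SUSPICION_THRESHOLD = 0.8

def flag_frames(confidences, threshold=SUSPICION_THRESHOLD):
    """Return the fraction of frames flagged as 'face not detected / looked away'."""
    flagged = [c < threshold for c in confidences]
    return sum(flagged) / len(flagged)

# Fabricated per-frame confidences for two students exhibiting identical
# behavior; the model is simply less confident on the second student's face.
student_a = [0.95, 0.92, 0.90, 0.93, 0.91]   # higher model confidence
student_b = [0.82, 0.78, 0.75, 0.84, 0.79]   # lower confidence, same behavior

print(flag_frames(student_a))  # 0.0 -> never flagged
print(flag_frames(student_b))  # 0.6 -> flagged on most frames
```

The point of the sketch is that the disparity requires no difference in student behavior at all: a uniform threshold applied to non-uniform model confidence is enough to produce it.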
Students and advocacy groups began documenting cases where Black students received academic penalties based on AI flags while white students exhibiting similar behaviors faced no consequences. The University of Illinois, the University of California system, and other major institutions faced criticism and legal challenges over their use of biased proctoring technology. Some students reported psychological distress from being falsely accused of cheating and having to prove their innocence.
The incident highlighted broader issues with AI bias in high-stakes educational settings and the lack of adequate testing for algorithmic fairness before deployment. Multiple lawsuits were filed against both universities and proctoring companies, leading to policy changes, settlements, and increased scrutiny of AI-powered educational tools. Some institutions suspended use of certain proctoring software or implemented additional human oversight processes.
Root Cause
Facial recognition algorithms in proctoring software were trained primarily on lighter-skinned faces, resulting in poor performance on darker skin tones. The systems struggled to detect facial landmarks and eye movements on Black students, leading to false positives for suspicious behavior.
Mitigation Analysis
Diverse training datasets representing all skin tones could have prevented this bias. Human review of flagged cases before academic sanctions would have caught false positives. Regular bias testing across demographic groups and algorithmic auditing would have identified the disparity earlier.
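One of the mitigations above, regular bias testing across demographic groups, can be sketched as a simple flag-rate audit: compute each group's flag rate and compare the ratio of the lowest to the highest rate against the four-fifths rule of thumb used in disparate-impact analysis. The function names and data below are illustrative assumptions, not any vendor's actual audit procedure.

```python
from collections import defaultdict

def flag_rates(records):
    """records: (group, was_flagged) pairs -> {group: flag rate}."""
    flags = defaultdict(int)
    totals = defaultdict(int)
    for group, was_flagged in records:
        totals[group] += 1
        flags[group] += int(was_flagged)
    return {g: flags[g] / totals[g] for g in totals}

def disparity_ratio(rates):
    """Lowest group rate divided by highest; below 0.8 suggests disparate impact."""
    return min(rates.values()) / max(rates.values())

# Fabricated audit data: group A flagged 3 times in 100 exams, group B 12 times.
records = ([("A", True)] * 3 + [("A", False)] * 97 +
           [("B", True)] * 12 + [("B", False)] * 88)
rates = flag_rates(records)
print(rates)                    # group A: 3%, group B: 12%
print(disparity_ratio(rates))   # ratio well below the 0.8 rule-of-thumb threshold
```

Run before deployment and at regular intervals afterward, a check like this would have surfaced the disparity described in this incident from flag logs alone, without needing access to the model internals.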
Litigation Outcome
Multiple lawsuits were filed against universities and proctoring companies; some resulted in policy changes and discrimination settlements.
Lessons Learned
This incident demonstrates the critical importance of bias testing in AI systems used in high-stakes environments like education. It also highlights the need for human oversight in automated decision-making that can significantly impact individuals' academic and professional futures.
Sources
Students of color are getting flagged to their professors as 'suspicious' by proctoring software
The Verge · Aug 31, 2020 · news
Students Are Rebelling Against Eye-Tracking Exam Surveillance Tools
VICE · Aug 19, 2020 · news