AI Emotion Detection Technology Used in Hiring Despite Lack of Scientific Validity
Severity
High
The AI Now Institute found that AI emotion detection technology lacked scientific validity yet was being used by companies for hiring decisions, potentially creating unfair bias against job candidates.
Category
Bias
Industry
HR / Recruiting
Status
Resolved
Date Occurred
Jan 1, 2019
Date Reported
Dec 11, 2019
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
Other
Harm Type
Bias / Discrimination
People Affected
10,000
Human Review in Place
No
Litigation Filed
No
Tags: emotion_detection, hiring_bias, scientific_validity, employment_discrimination, algorithmic_bias, facial_recognition
Full Description
In December 2019, the AI Now Institute released a comprehensive report in collaboration with the Association for Psychological Science examining the scientific foundations of AI emotion recognition technology. The research revealed that despite widespread commercial adoption, these systems lacked reliable scientific basis for accurately detecting emotions from facial expressions, voice patterns, or other behavioral indicators.
The report found that numerous companies were deploying emotion detection AI in high-stakes contexts including job interviews, security screenings, and educational assessments. These systems claimed to analyze facial micro-expressions, vocal tone, and body language to determine candidates' emotional states, personality traits, and suitability for roles. The underlying science, however, showed that cultural, individual, and contextual variation in emotional expression is substantial enough to make such determinations unreliable.
Companies like HireVue had integrated emotion detection capabilities into their video interviewing platforms, which were used by major corporations including Hilton, Goldman Sachs, and Unilever to screen thousands of job applicants. The technology purported to assess candidates' emotional intelligence, stress responses, and cultural fit based on algorithmic analysis of their recorded interview responses.
The AI Now Institute's findings highlighted that emotion detection systems often reflected cultural biases and failed to account for neurodivergent individuals, people with disabilities, or those from different cultural backgrounds who might express emotions differently. This created potential for discriminatory outcomes in hiring processes, with certain groups being systematically disadvantaged by flawed algorithmic assessments.
Following the report's publication, several companies began reconsidering their use of emotion detection technology. HireVue announced in January 2021 that it would discontinue the use of facial analysis in its hiring algorithms, though it maintained other algorithmic assessment features. The incident sparked broader discussions about the need for scientific validation of AI systems used in employment contexts and led to increased scrutiny from civil rights organizations and lawmakers.
Root Cause
AI emotion detection systems were deployed in hiring contexts despite fundamental scientific flaws in the underlying premise that emotions can be reliably detected from facial expressions across different cultural backgrounds and contexts.
Mitigation Analysis
Requiring scientific validation before deployment, mandating bias testing across diverse populations, and keeping human oversight of algorithmic hiring decisions could have prevented these systems from being used in high-stakes contexts. Regular auditing by independent researchers and transparency requirements around algorithmic decision-making would have exposed these scientific limitations earlier.
Lessons Learned
The incident demonstrates the critical importance of requiring scientific validation before deploying AI systems in consequential decision-making contexts, particularly where bias and discrimination risks are high.
Sources
AI Now 2019 Report
AI Now Institute · Dec 11, 2019 · academic paper
An AI hiring firm says it can predict job performance from facial expressions. Critics say that's 'pseudoscience.'
Washington Post · Oct 22, 2019 · news