Amazon's AI Recruitment Tool Systematically Discriminated Against Female Candidates
Severity
High
Amazon's AI recruiting tool, trained on historical hiring data, systematically downgraded female candidates, penalizing resumes that mentioned women's colleges or organizations or that used female-associated language patterns.
Category
Bias
Industry
Technology
Status
Resolved
Date Occurred
Jan 1, 2014
Date Reported
Oct 10, 2018
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
API integration
Harm Type
Reputational
People Affected
10,000
Human Review in Place
Yes
Litigation Filed
No
Regulatory Body
Equal Employment Opportunity Commission
hiring bias · gender discrimination · word embeddings · Amazon · algorithmic bias · recruitment AI · EEOC · employment law
Full Description
Between 2014 and 2017, Amazon developed an AI-powered recruiting tool intended to automate the resume screening process for technical positions. The system was designed to review resumes and rank candidates on a scale of one to five stars, with the goal of identifying the most promising applicants for software engineering and other technical roles. However, the tool exhibited systematic bias against female candidates due to its training methodology.
The AI system was trained on resume data submitted to Amazon over a 10-year period, during which the company's technical workforce was overwhelmingly male. This historical data taught the algorithm to favor patterns associated with male candidates. Specifically, the system penalized resumes that included the word 'women's,' such as 'women's chess club captain' or graduates of women's colleges. The tool also downgraded candidates who listed participation in women's professional organizations or used language patterns statistically associated with female communication styles.
Amazon's machine learning specialists discovered these biases during testing in 2015 and attempted to correct the system by editing the algorithm to treat certain gender-specific terms neutrally. However, engineers could not guarantee that the system wouldn't continue to discriminate in more subtle ways, as the underlying word embeddings and pattern recognition remained based on biased historical data. The company's attempts to debias the system proved insufficient to eliminate all forms of gender discrimination.
Reuters first reported on Amazon's recruiting tool and its problems in October 2018, revealing that the company had effectively scrapped the system by 2017 because it could not resolve the bias issues. The revelation sparked broader scrutiny of AI bias in hiring practices across the technology industry. Amazon stated that the tool's recommendations were never used as the sole basis for hiring decisions and that human recruiters maintained oversight of the process.
The incident highlighted fundamental problems with training AI systems on historical data that reflects past discrimination. Research subsequent to the Amazon case demonstrated that word embeddings commonly used in natural language processing systematically encode gender, racial, and other biases present in their training corpora. This has led to increased regulatory attention and the development of algorithmic auditing requirements in jurisdictions including New York City and the European Union.
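The bias mechanism described above can be made concrete with a small sketch. The hand-picked 3-dimensional vectors below are not real embeddings; they only illustrate how cosine similarity over a gender-skewed vector space makes a job-related term score closer to one gendered term than another, the pattern researchers have documented in embeddings trained on real corpora.

```python
import numpy as np

# Toy 3-d vectors chosen by hand to mimic the gender skew that studies have
# found in embeddings trained on large text corpora. These are NOT real
# embeddings -- they exist only to show the mechanics of the measurement.
vectors = {
    "he":       np.array([1.0, 0.1, 0.0]),
    "she":      np.array([-1.0, 0.1, 0.0]),
    "engineer": np.array([0.8, 0.9, 0.1]),  # deliberately skewed toward "he"
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A positive gap means "engineer" sits closer to "he" than to "she" in this
# toy space -- the kind of association a resume ranker can silently exploit.
bias_gap = cosine(vectors["engineer"], vectors["he"]) - cosine(vectors["engineer"], vectors["she"])
```

In a ranking system built on such vectors, resumes containing female-associated terms inherit lower similarity to "successful candidate" patterns even when no gender field is present, which is why removing explicit terms like "women's" was not enough to debias Amazon's tool.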
Root Cause
The AI system was trained on historical resume data from a male-dominated tech industry, causing it to learn that male-associated patterns correlated with hiring success and systematically penalize female-associated language and experiences.
Mitigation Analysis
Bias testing during development could have identified discriminatory patterns before deployment. Diverse training data that corrects for historical biases, demographic parity constraints, and ongoing bias monitoring would have prevented systematic discrimination. Regular audits of algorithmic outputs across protected classes should be mandatory for hiring tools.
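One concrete form such an audit can take is comparing selection rates across protected groups. The sketch below is a minimal, hypothetical illustration (the function names and data shape are assumptions, not from any cited tool); it applies the EEOC's "four-fifths" rule of thumb, under which a group whose selection rate falls below 80% of the highest group's rate is flagged for adverse-impact review.

```python
from collections import Counter

def selection_rates(records):
    """Compute the selection rate per group.

    records: iterable of (group, selected) pairs, where selected is a bool,
    e.g. ("F", True) for a female candidate who advanced.
    """
    totals, picked = Counter(), Counter()
    for group, selected in records:
        totals[group] += 1
        if selected:
            picked[group] += 1
    return {g: picked[g] / totals[g] for g in totals}

def four_fifths_check(rates):
    """Flag groups failing the EEOC four-fifths rule of thumb.

    Returns {group: True} if the group's selection rate is at least 80%
    of the highest group's rate, False otherwise (potential adverse impact).
    """
    top = max(rates.values())
    return {g: (r / top) >= 0.8 for g, r in rates.items()}
```

Run periodically over a screening tool's outputs, a check like this would have surfaced Amazon's skew long before external reporting did; it measures outcomes directly, so it catches subtle proxies that term-level debiasing misses.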
Lessons Learned
Historical training data perpetuates and amplifies existing societal biases in AI systems. Technical debiasing methods alone are insufficient without addressing underlying data representation issues and implementing comprehensive bias testing protocols.
Sources
Amazon scraps secret AI recruiting tool that showed bias against women
Reuters · Oct 10, 2018 · news
Amazon built an AI tool for hiring that was biased against women
Washington Post · Oct 10, 2018 · news