
Workday AI Hiring System Sued for Age and Disability Discrimination

Severity
High

A 2023 class action lawsuit alleged that Workday's AI-powered hiring screening tools systematically discriminated against older workers and disabled applicants, marking a significant case targeting the HR technology vendor rather than just employers.

Category
Bias
Industry
HR / Recruiting
Status
Litigation Pending
Date Occurred
Jan 1, 2020
Date Reported
Jan 31, 2023
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
agent
Harm Type
legal
People Affected
100,000
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
pending
hiring, discrimination, bias, employment, age_discrimination, disability, class_action, hr_tech, workday, algorithmic_bias

Full Description

In January 2023, a class action lawsuit was filed against Workday Inc., alleging that the company's artificial intelligence-powered hiring and recruitment tools systematically discriminated against job applicants based on age and disability status. The lawsuit, filed in federal court, represents a landmark case because it directly targets the HR software vendor rather than the employers who use the technology, establishing potential liability for AI companies whose tools enable discriminatory practices.

The plaintiffs alleged that Workday's AI screening algorithms, used in its applicant tracking and hiring systems, created barriers for workers over 40 and individuals with disabilities, violating the Age Discrimination in Employment Act (ADEA) and the Americans with Disabilities Act (ADA). The lawsuit claimed that these algorithms analyzed factors that served as proxies for age and disability, such as employment gaps, career progression patterns, educational timelines, and other biographical data that indirectly revealed protected characteristics.

The case affects potentially hundreds of thousands of job applicants, as Workday's hiring platform is used by numerous Fortune 500 companies and large employers across various industries. The lawsuit seeks to represent a nationwide class of individuals who were allegedly screened out by Workday's AI tools between 2020 and the present. Plaintiffs argued that the company failed to test its algorithms for discriminatory impact and did not implement adequate safeguards to prevent bias against protected groups.

Workday denied the allegations and stated that its products are designed to promote fair and equitable hiring practices. Nevertheless, the case highlights broader concerns about algorithmic bias in employment decisions and the responsibility of AI vendors to ensure their tools comply with civil rights laws.
The lawsuit seeks damages, injunctive relief requiring Workday to modify its algorithms, and ongoing monitoring of the company's hiring tools for discriminatory effects. This litigation represents part of a growing trend of legal challenges to AI-powered employment tools, following similar concerns raised about automated resume screening, personality assessments, and video interviewing platforms. The case could establish important precedents regarding vendor liability for discriminatory AI tools and the standards required for algorithmic fairness testing in employment contexts.

Root Cause

Workday's AI algorithms allegedly used proxies for age and disability status in screening decisions, potentially through analysis of employment gaps, career progression patterns, or other indirect indicators that correlate with protected characteristics.

Mitigation Analysis

This incident could have been prevented through algorithmic bias testing specifically for protected characteristics, regular auditing of hiring outcomes by demographic groups, and implementation of human review checkpoints for screening decisions. Workday should have conducted disparate impact analysis during development and deployment, and maintained ongoing monitoring of demographic patterns in hiring recommendations.
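The disparate impact analysis mentioned above is commonly operationalized with the EEOC's "four-fifths rule": if a protected group's selection rate is below 80% of the most-favored group's rate, adverse impact is indicated. The following sketch uses hypothetical screening outcomes, not real Workday figures:

```python
# Four-fifths (80%) rule check on hypothetical screening outcomes.
def selection_rate(selected: int, total: int) -> float:
    """Fraction of applicants in a group who passed the screener."""
    return selected / total

def adverse_impact_ratio(protected_rate: float, reference_rate: float) -> float:
    """Ratio of the protected group's selection rate to the reference
    group's. Values below 0.8 indicate adverse impact under the
    four-fifths guideline."""
    return protected_rate / reference_rate

# Hypothetical outcomes: applicants over 40 vs. under 40.
over_40_rate = selection_rate(30, 200)    # 15% advanced by the screener
under_40_rate = selection_rate(90, 300)   # 30% advanced

ratio = adverse_impact_ratio(over_40_rate, under_40_rate)
print(f"adverse impact ratio: {ratio:.2f}")  # 0.50, well below the 0.8 threshold
```

Running this kind of check routinely, per demographic group and per model version, is the sort of ongoing monitoring the plaintiffs alleged was missing.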

Lessons Learned

This case demonstrates that AI vendors can face direct liability for discriminatory algorithms, not just the companies that deploy them. It underscores the critical importance of comprehensive bias testing and ongoing monitoring of AI tools used in high-stakes decisions affecting protected groups.