Facebook Housing Ads Enabled Racial Discrimination Through Targeting System
Severity
High
Facebook's ad targeting system allowed advertisers to exclude users by race and ethnicity from housing ads, violating civil rights laws and resulting in a $115 million DOJ settlement.
Category
Bias
Industry
Technology
Status
Resolved
Date Occurred
Jan 1, 2016
Date Reported
Oct 28, 2016
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
Embedded
Harm Type
Legal
Estimated Cost
$115,000,000
People Affected
1,000,000
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
Settled
Regulatory Body
Department of Justice and HUD
Fine Amount
$115,000,000
Tags
advertising, discrimination, civil_rights, algorithmic_bias, housing, fair_housing_act, facebook, meta
Full Description
In October 2016, ProPublica revealed that Facebook's advertising platform allowed advertisers to exclude users from seeing housing and employment ads based on race, ethnicity, religion, and other protected characteristics. The investigation demonstrated how advertisers could use Facebook's "Ethnic Affinities" targeting feature to exclude groups like "African American," "Asian American," and "Hispanic" from seeing housing advertisements, directly violating the Fair Housing Act of 1968.
The discrimination was not limited to explicit exclusions. Facebook's machine learning algorithms amplified these biases through lookalike audience features, where the platform would automatically find users similar to those in targeted or excluded groups. This meant that even when advertisers did not explicitly exclude protected groups, the algorithm would learn discriminatory patterns and perpetuate them across broader audiences, effectively creating digital redlining.
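The proxy effect described above can be illustrated with a minimal sketch. This is a hypothetical toy model, not Facebook's actual system: a lookalike scorer that never sees a protected attribute can still reproduce exclusion when it weights a correlated feature such as zip code.

```python
# Hypothetical sketch: a lookalike model reproducing exclusion through a
# proxy feature (zip code), without any explicit protected attribute.
from collections import Counter

# Toy user records: no race field, but zip code correlates with demographics.
users = [
    {"id": 1, "zip": "60629"},
    {"id": 2, "zip": "60614"},
    {"id": 3, "zip": "60614"},
    {"id": 4, "zip": "60629"},
    {"id": 5, "zip": "60614"},
    {"id": 6, "zip": "60614"},
]

# Seed audience chosen by the advertiser -- all in one zip code.
seed_ids = {2, 3, 5}

def lookalike_scores(users, seed_ids):
    """Score each non-seed user by zip-code overlap with the seed audience."""
    seed_zips = Counter(u["zip"] for u in users if u["id"] in seed_ids)
    return {
        u["id"]: seed_zips.get(u["zip"], 0)
        for u in users
        if u["id"] not in seed_ids
    }

scores = lookalike_scores(users, seed_ids)
# Users outside the seed zip code score zero and are effectively excluded
# from the expanded audience, even though race was never an input.
```

Because the similarity signal is dominated by a demographically correlated feature, the "expanded" audience inherits the seed audience's demographic skew, which is the digital-redlining dynamic the HUD complaint described.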
Following the ProPublica investigation, the Department of Housing and Urban Development (HUD) filed a complaint against Facebook in March 2019, alleging systematic violations of the Fair Housing Act. The complaint detailed how Facebook's advertising tools enabled discrimination not just through explicit exclusions, but through zip code targeting that correlated with racial demographics and through algorithmic optimization that learned to exclude protected groups even without explicit instruction.
In June 2022, the Department of Justice announced a landmark $115 million settlement with Facebook (now Meta) to resolve the discrimination claims. Under the settlement terms, Facebook agreed to stop using an algorithm that the DOJ said discriminated against users based on protected characteristics. The company also committed to developing a new system for advertising that would prevent discrimination in housing, employment, and credit advertisements. Additionally, Facebook agreed to pay the maximum civil penalty allowed under the Fair Housing Act and fund programs to increase access to housing opportunities.
The settlement required Facebook to implement significant technical changes. The company agreed to discontinue its "Special Ad Audiences" tool (a restricted lookalike feature it had introduced for housing, employment, and credit ads after the 2019 HUD charge) and to build a replacement system, the Variance Reduction System, for ads in those categories. The new system limits targeting options and uses machine learning to reduce disparities between the audiences eligible to see an ad and the audiences that actually see it, regardless of advertiser preferences. Facebook also agreed to conduct regular civil rights audits and submit to ongoing monitoring by an independent third party to ensure compliance with fair housing laws.
Root Cause
Facebook's ad targeting algorithm allowed advertisers to explicitly exclude users from seeing ads based on protected characteristics including race, ethnicity, religion, and gender, with machine learning systems that amplified these exclusions by finding lookalike audiences.
Mitigation Analysis
Pre-deployment bias testing could have identified discriminatory targeting capabilities before launch. Human review of ad targeting options and automated screening for protected class exclusions would have prevented explicit discrimination. Regular algorithmic auditing and fairness testing could have detected the amplification effects in lookalike audience generation.
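One of the mitigations above, automated screening for protected-class exclusions, can be sketched as a pre-flight check on targeting configurations. The category names and function below are illustrative assumptions, not Facebook's actual API:

```python
# Hypothetical pre-flight screen: flag targeting configs that exclude
# protected classes from regulated ad categories. Segment names are
# illustrative, not real platform identifiers.
PROTECTED_EXCLUSIONS = {
    "ethnic_affinity", "race", "religion", "national_origin",
    "gender", "disability", "familial_status",
}

# Ad categories covered by fair housing / lending / employment law.
REGULATED_AD_TYPES = {"housing", "employment", "credit"}

def screen_targeting(ad_type, excluded_segments):
    """Return the exclusions that should block a regulated ad from running."""
    if ad_type not in REGULATED_AD_TYPES:
        return set()
    return set(excluded_segments) & PROTECTED_EXCLUSIONS

# A housing ad excluding an ethnic-affinity segment is flagged for review;
# the same exclusion on an unregulated ad type passes through.
violations = screen_targeting("housing", {"ethnic_affinity", "golf_fans"})
```

A check like this addresses only explicit exclusions; the amplification effects in lookalike generation would additionally require outcome-level fairness audits comparing who actually saw each ad against who was eligible to see it.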
Litigation Outcome
Facebook agreed to pay a $115 million settlement and implement comprehensive changes to its ad targeting system to prevent discriminatory advertising.
Lessons Learned
This incident demonstrates how algorithmic bias in advertising platforms can systematically violate civil rights laws even when discrimination is not the explicit intent. It highlights the need for proactive bias testing and ongoing monitoring of AI systems that affect access to essential services like housing and employment.
Sources
Facebook Lets Advertisers Exclude Users by Race
ProPublica · Oct 28, 2016 · news
Justice Department Secures Landmark Settlement Agreement with Meta
U.S. Department of Justice · Jun 21, 2022 · regulatory action
HUD Charges Facebook with Housing Discrimination Over Company's Targeted Advertising Practices
U.S. Department of Housing and Urban Development · Mar 28, 2019 · regulatory action