
Amazon Same-Day Delivery Algorithm Systematically Excluded Predominantly Black Neighborhoods

Severity
High

Bloomberg investigation revealed Amazon's same-day delivery algorithm systematically excluded predominantly Black neighborhoods in major US cities, creating patterns nearly identical to historical redlining maps.

Category
Bias
Industry
Technology
Status
Resolved
Date Occurred
Jan 1, 2015
Date Reported
Apr 21, 2016
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
Other
Harm Type
Operational
People Affected
1,000,000
Human Review in Place
No
Litigation Filed
No
Tags
algorithmic_bias, racial_discrimination, delivery_services, redlining, geographic_bias, amazon, same_day_delivery, optimization_algorithms

Full Description

In April 2016, Bloomberg published an investigation revealing that Amazon's same-day delivery service systematically excluded predominantly Black neighborhoods across major US cities, including Atlanta, Boston, Chicago, Dallas, and Washington, DC. The investigation analyzed ZIP-code-level delivery availability data and found stark racial disparities in service coverage that closely mirrored historical redlining practices from the 1930s.

The Bloomberg analysis examined same-day delivery availability for Amazon Prime members across metropolitan areas and found that neighborhoods with higher percentages of Black residents were significantly less likely to receive same-day delivery service. In Atlanta, the service was unavailable in neighborhoods that averaged 65% Black residents, while remaining available in neighborhoods that averaged 35%. Similar patterns emerged in other cities, with delivery zones consistently excluding areas with higher concentrations of Black residents.

The delivery zone boundaries appeared to follow algorithmic decisions based on factors such as delivery density, logistics efficiency, and potential profitability. However, these optimization criteria inadvertently recreated discriminatory patterns that had been formally outlawed decades earlier. The investigation noted that the excluded areas often corresponded almost exactly to neighborhoods that had been redlined by the Federal Housing Administration in the 1930s, suggesting that historical discrimination had created persistent geographic patterns that algorithms could perpetuate.

Amazon initially defended its delivery zones as based purely on logistical factors, stating that same-day delivery was available only in areas where the company could efficiently fulfill orders. The company emphasized that the service was relatively new and that it planned to expand coverage over time. Critics, however, argued that algorithmic optimization without consideration of demographic impact amounted to digital redlining, in which modern technology recreated historical patterns of exclusion.

Following the Bloomberg investigation and subsequent public pressure, Amazon began expanding same-day delivery coverage to previously excluded areas. The company modified its delivery zone algorithms and increased infrastructure investment in underserved neighborhoods. By 2017, Amazon had significantly expanded coverage in many of the areas highlighted in the original investigation, though gaps in service availability persisted in some regions.

The incident highlighted broader concerns about algorithmic bias in automated decision-making systems, particularly when optimization algorithms interact with geographic data that reflects historical discrimination. The case became a prominent example of how seemingly neutral algorithmic criteria can perpetuate systemic inequalities, leading to increased scrutiny of algorithmic decision-making across industries and calls for proactive bias testing in automated systems.
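The kind of disparity comparison Bloomberg performed can be illustrated with a minimal sketch. The dataset and column names below are hypothetical (a file `zip_coverage.csv` pairing each ZIP code with its metro area, its Census Black population share, and a 0/1 service flag); this is not Bloomberg's actual code, just the shape of the analysis:

```python
import pandas as pd

# Hypothetical input: one row per ZIP code, with the metro area, the ZIP's
# Black population share from Census data, and a 0/1 flag for whether
# same-day delivery was offered there.
zips = pd.read_csv("zip_coverage.csv")  # columns: zip, metro, pct_black, has_same_day

# Average Black population share of served vs. unserved ZIP codes, by metro.
disparity = zips.groupby(["metro", "has_same_day"])["pct_black"].mean().unstack()
print(disparity)

# Coverage rate (share of ZIP codes served) for majority-Black ZIPs vs. others.
zips["majority_black"] = zips["pct_black"] > 50
print(zips.groupby("majority_black")["has_same_day"].mean())
```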

Root Cause

Amazon's delivery zone algorithms, optimized for efficiency and profitability, inadvertently recreated historical redlining patterns by excluding areas with lower delivery density or lower expected profitability, characteristics that correlated strongly with predominantly Black neighborhoods.
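One way to see how a facially neutral input becomes a demographic proxy is to check its correlation with demographics directly. This is a minimal sketch under assumed data, not Amazon's method; the feature and file names are hypothetical:

```python
import pandas as pd

# Hypothetical ZIP-level features: the density signal the optimizer sees,
# alongside Census demographics it never consumes directly.
zips = pd.read_csv("zip_features.csv")  # columns: zip, delivery_density, pct_black

corr = zips["delivery_density"].corr(zips["pct_black"])
print(f"density vs. %Black correlation: {corr:+.2f}")
# A strongly negative value means density is acting as a racial proxy:
# ranking ZIP codes by density alone will push predominantly Black
# neighborhoods to the bottom of the expansion list.
```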

Mitigation Analysis

Regular algorithmic auditing for demographic bias could have identified the discriminatory patterns before public exposure. Geographic equity testing during algorithm development, combined with bias-detection monitoring after deployment, would have flagged the correlation with historical redlining maps. Human oversight of delivery zone boundaries, with explicit consideration of demographic impact, could have prevented the systematic exclusion.
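As a concrete illustration of the auditing described above, here is a minimal sketch of a pre-deployment equity gate, assuming the same hypothetical ZIP-level data as before. It applies a four-fifths-style disparate-impact test: if majority-Black ZIP codes are served at less than 80% of the rate of other ZIP codes, the proposed zone map is held for human review.

```python
import pandas as pd

DISPARATE_IMPACT_THRESHOLD = 0.8  # the conventional "four-fifths" cutoff

def audit_delivery_zones(zips: pd.DataFrame) -> bool:
    """Pass/fail equity check on a proposed zone map (True = pass)."""
    majority_black = zips["pct_black"] > 50
    rate_black = zips.loc[majority_black, "has_same_day"].mean()
    rate_other = zips.loc[~majority_black, "has_same_day"].mean()
    if rate_other == 0:
        return rate_black == 0  # degenerate case: nothing to compare against
    ratio = rate_black / rate_other
    print(f"coverage ratio (majority-Black / other ZIPs): {ratio:.2f}")
    return ratio >= DISPARATE_IMPACT_THRESHOLD

zips = pd.read_csv("proposed_zones.csv")  # columns: zip, pct_black, has_same_day
if not audit_delivery_zones(zips):
    print("Flagged for human review: expand proposed zones before launch.")
```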

Lessons Learned

The incident demonstrated how algorithmic optimization without demographic consideration can perpetuate historical discrimination patterns, highlighting the need for proactive bias testing in automated decision-making systems that affect service delivery and access.