Tesla Autopilot Phantom Braking Investigation by NHTSA
Severity
High
NHTSA investigated over 750 complaints of Tesla Autopilot phantom braking affecting approximately 416,000 vehicles. The issue stemmed from Tesla's transition to a vision-only perception system, which produced false-positive emergency braking events.
Category
Safety Failure
Industry
Technology
Status
Resolved
Date Occurred
Jan 1, 2021
Date Reported
Feb 16, 2022
Jurisdiction
US
AI Provider
Other/Unknown
Model
Tesla Autopilot/Full Self-Driving
Application Type
embedded
Harm Type
physical
Estimated Cost
$50,000,000
People Affected
416,000
Human Review in Place
No
Litigation Filed
Yes
Litigation Status
pending
Regulatory Body
National Highway Traffic Safety Administration (NHTSA)
autopilot · phantom_braking · tesla · nhtsa · highway_safety · vision_ai · emergency_braking · automotive
Full Description
In February 2022, the National Highway Traffic Safety Administration (NHTSA) opened formal investigation PE 22-004 into reports of sudden, unexpected braking in Tesla Model 3 and Model Y vehicles equipped with Autopilot. The investigation covered approximately 416,000 vehicles from model years 2021-2022, making it one of the largest automotive safety investigations in recent years.
The phantom braking incidents began increasing significantly in late 2021, coinciding with Tesla's decision to remove radar sensors from its vehicles and rely solely on camera-based vision systems for Autopilot functionality. NHTSA received over 750 complaints from drivers reporting that their vehicles would suddenly decelerate from highway speeds without warning, often dropping 10-20 mph instantly. These incidents typically occurred on highways at speeds above 55 mph, with drivers reporting the braking happened when passing under overpasses, encountering shadows, or in areas with no visible obstacles.
The sudden braking events created significant safety hazards, as following vehicles were often unable to react quickly enough to avoid rear-end collisions. NHTSA documented multiple crashes and injuries directly attributed to the phantom braking phenomenon. Drivers reported being rear-ended, having to swerve into other lanes, and experiencing close calls that could have resulted in multi-vehicle accidents. The incidents were particularly dangerous because they were unpredictable and occurred during high-speed highway driving.
Tesla's internal data showed that the company was aware of the increased phantom braking incidents following the removal of radar sensors. The vision-only system, while theoretically more advanced, had difficulty accurately interpreting certain visual scenarios. Shadows from overpasses, bridges, and large vehicles were being misidentified as solid objects requiring emergency braking. The neural networks had not been sufficiently trained to distinguish between actual obstacles and visual artifacts that posed no collision risk.
Throughout 2022, Tesla released multiple over-the-air software updates attempting to address the phantom braking issue. These updates refined the vision system's object detection algorithms and adjusted the sensitivity of emergency braking scenarios. However, complaints continued to be filed with NHTSA even after several software iterations. The company eventually acknowledged the issue and worked to improve the system's accuracy, though many affected customers had already experienced dangerous incidents.
In June 2023, NHTSA closed its investigation after determining that Tesla's software updates had substantially reduced the frequency of phantom braking events. However, the agency noted that the issue highlighted significant challenges with deploying advanced driver assistance systems without adequate real-world validation. Multiple class action lawsuits remain pending against Tesla, with plaintiffs alleging that the company knowingly deployed defective software that endangered public safety.
Root Cause
Tesla's transition from radar-based to vision-only (camera-based) perception system in 2021 created false positive object detection, causing the Autopilot system to apply emergency braking when no obstacles were present. The neural networks misidentified shadows, overpasses, and other visual elements as imminent collision threats.
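One standard defense against the failure mode described above is temporal-consistency filtering: a real obstacle persists across many consecutive camera frames, while a shadow or overpass artifact often flickers in and out. The sketch below is purely illustrative (the class name, frame count, and logic are assumptions, not Tesla's actual implementation) and shows how requiring N consecutive positive frames suppresses transient false detections before they can trigger emergency braking.

```python
from collections import deque


class BrakeDecisionFilter:
    """Illustrative sketch: only permit emergency braking when an
    obstacle detection persists for N consecutive camera frames.
    A shadow that appears in a single frame is suppressed."""

    def __init__(self, required_frames: int = 3):
        self.required_frames = required_frames
        # Rolling window of the most recent per-frame detections.
        self.history = deque(maxlen=required_frames)

    def update(self, obstacle_detected: bool) -> bool:
        """Feed one frame's detection; return True only if the
        window is full and every frame in it saw the obstacle."""
        self.history.append(obstacle_detected)
        return (len(self.history) == self.required_frames
                and all(self.history))


f = BrakeDecisionFilter(required_frames=3)
# A transient shadow detected in one frame does not trigger braking.
assert f.update(True) is False
assert f.update(False) is False
# A persistent obstacle does, once it has been seen 3 frames in a row.
assert f.update(True) is False
assert f.update(True) is False
assert f.update(True) is True
```

The trade-off is added reaction latency (N frames), which is why such filters are tuned carefully in safety-critical systems rather than simply set high.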
Mitigation Analysis
Tesla should have implemented more rigorous testing protocols before deploying vision-only systems to production vehicles. A hybrid approach maintaining radar as a validation sensor could have prevented false positives. Real-time anomaly detection and immediate over-the-air update mechanisms could have reduced exposure time. More conservative braking algorithms during the transition period would have minimized phantom events.
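The hybrid approach suggested above can be sketched as a simple cross-validation rule: full emergency deceleration only when vision and radar agree, and a bounded, conservative response when vision alone fires. Everything here is a hypothetical illustration — the function name and deceleration values are assumptions for the sketch, not values from any Tesla system.

```python
def plan_braking(vision_obstacle: bool, radar_obstacle: bool,
                 max_decel_mps2: float = 9.0,
                 cautious_decel_mps2: float = 2.0) -> float:
    """Illustrative sketch of radar-as-validation-sensor logic.

    Returns a commanded deceleration in m/s^2. Hard braking is
    reserved for detections confirmed by both sensors; a vision-only
    detection (possibly a shadow or overpass artifact) gets a gentle,
    bounded response instead of a sudden speed drop.
    """
    if vision_obstacle and radar_obstacle:
        return max_decel_mps2       # confirmed obstacle: brake hard
    if vision_obstacle:
        return cautious_decel_mps2  # unconfirmed: respond conservatively
    return 0.0                      # no detection: no braking


# A shadow seen only by the camera yields mild braking, not 9 m/s^2.
assert plan_braking(vision_obstacle=True, radar_obstacle=False) == 2.0
# A radar-confirmed obstacle permits full emergency braking.
assert plan_braking(vision_obstacle=True, radar_obstacle=True) == 9.0
```

The design choice is the same one NHTSA's finding implies: during a sensor transition, the new modality's unconfirmed detections should degrade gracefully rather than command the most aggressive possible action.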
Litigation Outcome
Multiple class action lawsuits filed against Tesla alleging defective Autopilot system, with cases ongoing as of 2023
Lessons Learned
The incident demonstrates the critical importance of maintaining redundant safety systems during technology transitions in safety-critical applications. Vision-only AI systems require extensive real-world validation before replacing proven sensor fusion approaches, particularly in scenarios involving life-safety decisions.
Sources
PE 22-004 Tesla Phantom Braking Investigation
National Highway Traffic Safety Administration · Feb 16, 2022 · regulatory action
U.S. opens Tesla probe over 'phantom braking' complaints
Reuters · Feb 17, 2022 · news
Tesla's 'phantom braking' problem
Washington Post · May 10, 2022 · news
NHTSA closes Tesla phantom braking investigation
TechCrunch · Jun 22, 2023 · news