Tesla Recalls 2 Million Vehicles Over Autopilot Safety System Defects
Severity
Critical
Tesla recalled 2.03 million vehicles in December 2023 after NHTSA found Autopilot's driver monitoring system was inadequate, allowing dangerous misuse that contributed to crashes and fatalities.
Category
Safety Failure
Industry
Automotive
Status
Resolved
Date Occurred
Jan 1, 2016
Date Reported
Dec 13, 2023
Jurisdiction
US
AI Provider
Other/Unknown
Model
Autopilot
Application Type
Embedded
Harm Type
Physical
Estimated Cost
$1,000,000,000
People Affected
2,031,220
Human Review in Place
No
Litigation Filed
No
Regulatory Body
National Highway Traffic Safety Administration (NHTSA)
Tags
autonomous_vehicles, driver_assistance, automotive_safety, regulatory_recall, NHTSA, Tesla, Autopilot
Full Description
In December 2023, Tesla issued a massive recall affecting 2.03 million vehicles equipped with Autopilot after a comprehensive two-year investigation by the National Highway Traffic Safety Administration (NHTSA). The recall encompassed nearly every Tesla vehicle sold in the United States from 2012 through 2023, including Model S, Model 3, Model X, and Model Y vehicles.
NHTSA's investigation, which began in August 2021, examined 956 crashes involving Tesla vehicles where Autopilot was suspected of being in use. The agency found that Tesla's Autopilot system contained fundamental safety defects that made it prone to misuse. Specifically, NHTSA determined that the system's driver monitoring controls were insufficient to prevent drivers from becoming over-reliant on the technology or using it in inappropriate driving conditions.
The investigation revealed that Autopilot could be activated on roads where it was not designed to operate safely, such as city streets with cross-traffic, pedestrians, and complex intersections. The system's driver attention monitoring relied primarily on detecting torque on the steering wheel rather than monitoring driver eye gaze or head position, allowing drivers to defeat the system by simply resting a hand on the wheel while not actually paying attention to the road.
NHTSA found evidence that some drivers were using Autopilot as a fully autonomous system despite Tesla's warnings that it required active driver supervision. The investigation documented cases where drivers engaged in activities like reading, sleeping, or using mobile devices while Autopilot was active. In several fatal crashes, investigators found that drivers had removed their hands from the steering wheel for extended periods before impact, suggesting over-reliance on the system.
The recall required Tesla to deploy an over-the-air software update to address the safety defects. The update made visual and audible alerts more prominent when driver attention is not detected, increased the sensitivity of the attention checks, and added restrictions on where Autopilot can be activated. Tesla also agreed to implement additional safeguards, including automatically disabling Autopilot for drivers who repeatedly fail to demonstrate attention.
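The remediation logic described above (escalating alerts, forced disengagement, and a repeated-misuse lockout) can be sketched as a simple state machine. This is an illustrative simplification only, not Tesla's actual implementation; the thresholds (10 s to first alert, 25 s to disengagement) and the five-strike lockout are hypothetical values chosen for demonstration.

```python
# Illustrative sketch of a driver-attention escalation loop.
# NOT Tesla's implementation: class name, thresholds, and the
# strike-based lockout are hypothetical.
from dataclasses import dataclass


@dataclass
class AttentionMonitor:
    alert_after_s: float = 10.0      # hypothetical: seconds without input before alerting
    disengage_after_s: float = 25.0  # hypothetical: seconds before forced disengagement
    max_strikes: int = 5             # hypothetical: disengagements before feature lockout
    inattentive_s: float = 0.0       # running time with no attention detected
    strikes: int = 0                 # count of forced disengagements
    engaged: bool = True             # whether the assistance feature is active

    def tick(self, dt: float, attention_detected: bool) -> str:
        """Advance the monitor by dt seconds and return the action to take."""
        if not self.engaged:
            return "locked_out" if self.strikes >= self.max_strikes else "disengaged"
        if attention_detected:
            self.inattentive_s = 0.0
            return "ok"
        self.inattentive_s += dt
        if self.inattentive_s >= self.disengage_after_s:
            # Force disengagement and record a strike toward lockout.
            self.engaged = False
            self.strikes += 1
            return "locked_out" if self.strikes >= self.max_strikes else "disengaged"
        if self.inattentive_s >= self.alert_after_s:
            return "alert"   # escalate with visual/audible warnings
        return "ok"
```

In this sketch, sustained inattention first triggers alerts, then disengages the feature entirely; accumulating strikes eventually locks the feature out, mirroring the recall's goal of deterring repeated misuse rather than merely warning in the moment.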
The recall marked a significant regulatory action against autonomous vehicle technology and highlighted the challenges of deploying advanced driver assistance systems that operate in the gray area between manual driving and full autonomy. The incident underscored the critical importance of robust driver monitoring systems and appropriate operational design domains for semi-autonomous vehicle technologies.
Root Cause
Tesla's Autopilot system had inadequate driver monitoring controls that failed to ensure drivers maintained attention and control. The system allowed activation in conditions where it was not designed to operate safely, and provided insufficient feedback when drivers were not properly supervising the vehicle.
Mitigation Analysis
Enhanced driver monitoring systems with more sensitive attention detection, stricter operational domain restrictions preventing activation on unsuitable roads, and mandatory driver training programs could have prevented misuse. Real-time monitoring of driver engagement and automatic system disabling when attention lapses are detected would reduce over-reliance incidents.
Lessons Learned
The incident demonstrates that advanced driver assistance systems must include robust safeguards against misuse and clear operational boundaries. Regulatory oversight of semi-autonomous vehicle technologies requires ongoing monitoring beyond initial approval, particularly as real-world usage patterns emerge that may differ from intended design parameters.
Sources
Tesla Recalls Over 2 Million Vehicles to Update Autopilot Software
National Highway Traffic Safety Administration · Dec 13, 2023 · regulatory action
Tesla recalls more than 2 million vehicles in US over Autopilot safety concerns
Reuters · Dec 13, 2023 · news