Waymo Autonomous Vehicle Rear-Ended by Human Driver After Hitting Cyclist in San Francisco
Severity
Medium
A Waymo autonomous vehicle struck a cyclist in San Francisco in February 2023, causing minor injuries. The incident highlighted ongoing challenges in autonomous vehicle detection of vulnerable road users.
Category
Safety Failure
Industry
Technology
Status
Resolved
Date Occurred
Feb 7, 2023
Date Reported
Feb 8, 2023
Jurisdiction
US
AI Provider
Other/Unknown
Application Type
agent
Harm Type
physical
People Affected
1
Human Review in Place
No
Litigation Filed
No
Regulatory Body
California Department of Motor Vehicles
autonomous_vehicle · waymo · cyclist · san_francisco · collision · safety · california_dmv
Full Description
On February 7, 2023, a Waymo autonomous vehicle struck a cyclist who had fallen in the roadway in San Francisco's Richmond District, causing minor injuries to the cyclist. The incident occurred while the Waymo vehicle was operating in fully autonomous mode without a human safety driver present. According to reports filed with the San Francisco Police Department and subsequently disclosed to the California Department of Motor Vehicles, the collision happened when the autonomous system attempted to navigate around the fallen cyclist but failed to fully avoid contact. Following the initial collision, the Waymo vehicle was rear-ended by a human-driven vehicle, resulting in a secondary impact.
The technical failure involved the perception and path-planning components of Waymo's autonomous driving system when it encountered an unexpected obstacle in the roadway. While the system successfully detected the fallen cyclist, the decision-making algorithms governing the vehicle's response proved inadequate for the complex scenario. The autonomous vehicle made contact with the cyclist during its attempted avoidance maneuver, indicating potential limitations in the system's ability to process real-time environmental data and execute safe navigation around vulnerable road users in dynamic urban conditions. The secondary collision occurred when the Waymo vehicle stopped or sharply reduced speed following the initial incident, and the human driver behind it failed to maintain adequate following distance.
The cyclist sustained non-life-threatening injuries requiring emergency medical treatment at the scene. The incident resulted in property damage to both the Waymo vehicle and the human-driven vehicle involved in the secondary collision. Beyond immediate physical and property impacts, the incident contributed to ongoing public and regulatory scrutiny regarding the safety of autonomous vehicle testing in dense urban environments. The California Department of Motor Vehicles initiated a review of the incident as part of its regulatory oversight responsibilities for autonomous vehicle testing programs operating within the state.
Waymo immediately reported the incident to California regulators in compliance with state autonomous vehicle testing regulations and cooperated fully with the San Francisco Police Department's investigation. The company conducted an internal review of the incident to analyze the autonomous system's performance and identify potential improvements to its detection and response algorithms. Waymo issued public statements emphasizing its commitment to safety and continuous improvement of its autonomous driving technology, while noting that the human driver who caused the secondary collision was cited by police for following too closely.
This incident highlighted persistent challenges facing autonomous vehicle developers in creating systems capable of safely navigating complex interactions with vulnerable road users such as cyclists and pedestrians. The collision occurred during a period of increased autonomous vehicle testing activity in San Francisco, where multiple companies were conducting road tests of self-driving technology. Industry experts noted that scenarios involving fallen or disabled road users represent particularly challenging edge cases for autonomous systems, requiring sophisticated sensor fusion and decision-making capabilities.
The incident contributed to broader regulatory discussions about autonomous vehicle safety standards and testing protocols in California. It underscored the ongoing technical challenges in developing autonomous systems capable of handling the full spectrum of real-world driving scenarios, particularly those involving unexpected obstacles and vulnerable road users in urban environments. The case became part of the regulatory record used by California authorities to evaluate the readiness of autonomous vehicle technology for broader deployment beyond controlled testing environments.
Root Cause
The Waymo vehicle's autonomous driving system failed to properly navigate around a cyclist who had fallen in the roadway, resulting in contact that caused minor injuries to the cyclist.
Mitigation Analysis
Enhanced sensor fusion and object detection algorithms could improve recognition of fallen cyclists and pedestrians in roadways. Real-time human oversight capabilities and more conservative stopping distances in urban environments with vulnerable road users could reduce collision risk. Improved communication systems between autonomous vehicles and emergency responders could expedite response times.
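The stopping-distance point can be made concrete with basic kinematics: total stopping distance is reaction distance plus braking distance, d = v·t_r + v²/(2μg). A minimal sketch follows; the speeds, reaction time, and friction coefficient are illustrative assumptions for the sketch, not values from this incident or from Waymo's system.

```python
# Illustrative stopping-distance estimate. All parameter values below are
# assumptions for the sketch, not data from this incident.
G = 9.81  # gravitational acceleration, m/s^2


def stopping_distance(speed_ms: float,
                      reaction_time_s: float = 0.5,
                      friction: float = 0.7) -> float:
    """Total stopping distance in meters.

    reaction distance: v * t_r  (distance covered before braking begins)
    braking distance:  v^2 / (2 * mu * g)
    """
    reaction = speed_ms * reaction_time_s
    braking = speed_ms ** 2 / (2 * friction * G)
    return reaction + braking


# A more conservative urban speed profile shortens the distance needed
# to stop for an unexpected obstacle such as a fallen cyclist.
for kmh in (40, 25):
    v = kmh / 3.6  # convert km/h to m/s
    print(f"{kmh} km/h -> {stopping_distance(v):.1f} m")
```

The nonlinearity is the operative point: because braking distance scales with the square of speed, even a modest speed reduction in environments with vulnerable road users buys a disproportionate safety margin.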
Lessons Learned
The incident demonstrates that autonomous vehicle systems still face significant challenges in complex urban scenarios involving vulnerable road users. It underscores the need for more sophisticated sensor systems and decision-making algorithms that can better handle edge cases involving fallen cyclists or pedestrians.
Sources
Waymo self-driving car hits cyclist in San Francisco
SF Gate · Feb 8, 2023 · news
Waymo self-driving car strikes cyclist in San Francisco
TechCrunch · Feb 8, 2023 · news