Human Error Causes 99% of Autonomous Vehicle Accidents: Study
October 20, 2021
Of 187 reports of AV accidents, just two could be attributed to poor systems performance.
A whopping 99 percent of autonomous vehicle accidents were caused by human error, a new report from IDTechEx shows.
Data from Autonomous Cars, Robotaxis & Sensors 2022–2042 reveals that, of the 83 recorded incidents involving vehicles operating in autonomous mode, 81 were caused by a human – either a driver in another vehicle or a misbehaving pedestrian.
Of 187 reports of autonomous vehicle accidents, just two could be attributed to poor performance of the systems.
AV drivers ‘don’t mistake forward and reverse gears’
Every 24 seconds someone is killed on a road, according to road safety charity Brake.
Around 90 percent of motor vehicle crashes are caused at least in part by human error.
“We are fallible, distracted, and potentially dangerous,” the report suggests.
“Our liability behind the wheel is not just the problem of other human drivers but can also be difficult for autonomous vehicles to contend with.”
“Autonomous drivers don’t mistake forward and reverse gears, they don’t take unnecessary risks such as overtakes and running stop signs, they always pay 100 percent attention, and finally, they won’t flee from the police.”
The report suggests that some of the accidents could stem from assumed behavior – where the human driver anticipates that the autonomous vehicle is about to go and starts moving, only to hit the still-stationary autonomous vehicle.
“Like being caught out by a learner driver not moving at a roundabout or other junction; sometimes there looks to be a sufficient gap and the driver begins to move, expecting that the learner has gone for it, but the overly cautious learner is still there.”
Among other data, IDTechEx researchers studied disengagement reports from Waymo. They found that around 20 percent of disengagements were caused by the test driver reacting to a nearby human driver’s poor behavior.
“This could be failing to abide by the rules and etiquette of the road, or other drivers behaving aggressively towards the autonomous vehicle,” the research team suggested.
Other, somewhat bizarre, incidents were recorded in which pedestrians attacked two Cruise vehicles with wooden sticks in a completely unprovoked display of aggression.
The report highlighted four notable crash cases – two involving Cruise vehicles, one from Pony.ai, and one from Zoox.
Pony.ai’s incident occurred in July 2019. The test driver held the vehicle back at a traffic light-controlled junction after noticing that the car in front had its reverse lights on. When the light turned green, the car in front, instead of going forward, accelerated in reverse – eventually hitting the Pony.ai vehicle.
Cruise’s incident occurred in March 2021. While operating in autonomous mode, the vehicle was turning left at a four-way junction when a vehicle behind attempted to overtake and continue straight on, colliding with the front left of the autonomous Cruise.
The second Cruise incident occurred just a month earlier. Again operating in autonomous mode, the vehicle turned right at a junction and had right of way. A vehicle approaching from the left failed to stop for a stop sign and collided with the autonomous car.
Zoox’s mishap occurred in November 2019. The AV was being driven by the human test driver in manual mode. The driver entered a junction under a green light and was hit by another vehicle illegally entering the junction while fleeing from local police.
The majority of the 81 incidents where a human driver collided with an autonomously driven vehicle were typical crashes. The most common form of collision was simply being rear-ended while in traffic or stopped.
“These kinds of crashes are likely caused by either human inattention or distraction,” IDTechEx’s researchers suggested.
Uh oh, Zoox in trouble
What of the two cases where the autonomous vehicle was at fault?
Those instances involved vehicles made by Zoox, a startup acquired by Amazon in 2020. In both cases, the car appeared to misjudge its clearance to parked vehicles it was attempting to navigate around.
Both cars made contact, causing some minor damage.
IDTechEx’s researchers suggested a possible weakness or blind spot in the sensor suite, or a fault in the autonomous driving system.
“Either way, as R&D activities are still underway, autonomous vehicles are not yet 100 percent infallible, and likely never will be.”
Runaway Waymo car and Uber AV death
Unlike humans, AVs enjoy permanent 360° perception and can communicate their intentions to each other in advance.
But that doesn’t mean they’re perfect.
A Waymo car appeared to be confused back in May and opted to block traffic. When handlers were sent to recover the rogue ride, the vehicle evaded its masters and drove away.
To further exacerbate the situation, the remote handler failed to regain control of the vehicle as it erratically lurched away from those sent to retrieve it.
But it is Uber that is responsible for the unfortunate record of the world’s first death by a self-driving vehicle.
In March 2018, one of its autonomous vehicles struck and killed a pedestrian after the car’s safety driver was distracted by streaming television.
Uber has since opted to mostly abandon its self-driving car development plans, selling its autonomous division to Aurora.
And it’s not just the drivers that are problematic, but the roads themselves, at least, according to a recent United Nations initiative.
The AI for Road Safety project, launched in early October, hopes to encourage states to improve their road safety by applying AI in areas like post-crash response and road infrastructure.
– Originally published in IoT World Today’s sister publication AI Business.