An Insurance Institute for Highway Safety study of more than 5,000 crashes found that only about 34 percent of collisions could be assumed to vanish with a completely self-driving fleet.
The researchers also concluded riders might need to accept a more leisurely and cautious pace from their self-driving cars to eliminate the most frequent source of crashes.
The data offers auto body shops hope that increased autonomy wouldn’t eradicate the majority of their business.
“Conventional thinking has it that self-driving vehicles could one day make crashes a thing of the past,” the IIHS wrote Thursday, noting the statistic that human error produces more than 90 percent of crashes.
But the IIHS said its researchers found crashes tied to human “sensing and perceiving errors” and “incapacitation” accounted for only 24 percent and 10 percent of collisions, respectively. Together, those two categories make up the roughly 34 percent of crashes the study deemed avoidable.
Incapacitation referred to situations like sleepy or intoxicated drivers. “Sensing and perceiving errors” included overlooking hazards, distracted driving and situations with reduced visibility.
Authors Alexandra Mueller, Jessica Cicchino and David Zuby argued that only accidents tied to those two causes could theoretically be assumed to disappear under self-driving cars. (Though the IIHS noted this “would require sensors that worked perfectly and systems that never malfunctioned.”)
“It is reasonable to expect, although not certain, that AVs will do a better job of perceiving the environment than humans; however, AVs will still need to be programmed to act safely in response to what they perceive,” they wrote. “Likewise, it is also reasonable to assume that, as self-driving vehicles, AVs will not be vulnerable to incapacitation or alcohol impairment issues. Accordingly, we assumed that AVs would prevent crashes that had sensing and perceiving factors only, as well as those with incapacitation factors. Crashes with sensing and perceiving factors were only considered to be addressed by AVs if they did not have other identified driver-related factors, as these additional factors could have contributed to the crash even if the AV had flawless perception. Crashes with incapacitated drivers were considered to be preventable by AVs regardless of the presence of any other driver-related factors, as it was assumed that incapacitation undermines all of the driving roles the operator must perform to safely navigate the road, and the object of the analysis was to determine the percent of crashes that could remain beyond these implicit assumptions.”
Other crash scenarios could remain under self-driving cars, according to the IIHS:
• “Predicting” errors occurred when drivers misjudged a gap in traffic, incorrectly estimated how fast another vehicle was going or made an incorrect assumption about what another road user was going to do.
• “Planning and deciding” errors included driving too fast or too slow for the road conditions, driving aggressively or leaving too little following distance from the vehicle ahead.
• “Execution and performance” errors included inadequate or incorrect evasive maneuvers, overcompensation and other mistakes in controlling the vehicle. (Minor formatting edits.)
“It’s likely that fully self-driving cars will eventually identify hazards better than people, but we found that this alone would not prevent the bulk of crashes,” IIHS Vice President of Research Jessica Cicchino said in a statement.
Thirty-nine percent of crashes were tied to intentional “(p)lanning and deciding” behaviors the IIHS said riders might demand in self-driving cars as well. (To put it another way: How irritated do you get when someone dares to drive at or below the speed limit instead of the typical 5-10 mph over?) The researchers noted that the planning and deciding crashes “often involved speeding (23%) or illegal maneuvers (15%).”
“The fact that deliberate decisions made by drivers can lead to crashes indicates that rider preferences might sometimes conflict with the safety priorities of autonomous vehicles,” the IIHS wrote. “For self-driving vehicles to live up to their promise of eliminating most crashes, they will have to be designed to focus on safety rather than rider preference when those two are at odds.
“Self-driving vehicles will need not only to obey traffic laws but also to adapt to road conditions and implement driving strategies that account for uncertainty about what other road users will do, such as driving more slowly than a human driver would in areas with high pedestrian traffic or in low-visibility conditions.”
The researchers linked 17 percent of collisions to “predicting” mistakes and 23 percent to “execution and performance” issues and suggested Uber’s self-driving Volvo XC90 that struck and killed pedestrian Elaine Herzberg demonstrated both errors.
Despite the IIHS’ findings, body shops might still have cause for concern:
• Losing 34 percent of your volume would still be a serious blow to business.
• The researchers’ data set involved severe crashes from 2005 to 2007: collisions in which emergency medical services were dispatched and at least one vehicle was towed away. Milder crashes, a significant source of volume for repairers too, might be more easily mitigated by an AI, even one behaving like a human.
“Speed in particular is a key contributing factor to crashes today, as it makes a crash more likely by decreasing the time available to react,” the IIHS authors argued. “While AVs may be able to detect and thus react to hazards more quickly than human drivers, they will not be able to respond instantaneously.”
But a real-world crash in which the driver did 25 in a 20 mph zone is probably more preventable in an all-AI world than a real-world crash where the human did 40 in a 35 mph zone.
• The Self-Driving Coalition for Safer Streets, which includes companies like Google/Waymo, Uber and Ford, argued Thursday that the IIHS results actually proved a far larger reduction in crashes would be possible.
“Last year, an estimated 36,120 people died in motor vehicle crashes on American roads across the country with millions more injured,” coalition general counsel Ariel Wolf said in a statement. “The Self-Driving Coalition’s members are dedicated to fully self-driving technology as a transformative opportunity to significantly reduce those numbers. In fact, the IIHS study finds that self-driving vehicles could prevent 72% of crashes.”
• Americans might not mind a slower, more cautious ride if the AI frees them to do whatever they want during the commute. Think about it: Would you rather get to work in 15 minutes, during which you could do nothing but pay attention to traffic? Or would you rather get to work in 20 minutes but be able to sleep, catch up on email or play on your phone the entire time? We’ve heard a Google autonomy expert make a similar point, and it’s a valid one.
• Forbes contributor Brad Templeton, who has experience with Google’s self-driving car project, offered an interesting criticism Tuesday.
“Their analysis is strange and quite flawed,” Templeton wrote in Forbes. “I think if you asked most self-driving car developers where the hard problems are, they would say that (perception) is the hard one, and (planning and deciding) and (execution and performance) are the easiest to get right.”
Insurance Institute for Highway Safety, June 4, 2020
Alexandra Mueller, Jessica Cicchino, David Zuby, Insurance Institute for Highway Safety, May 2020
Self-Driving Coalition, June 4, 2020
Brad Templeton in Forbes, June 9, 2020
A Nuro self-driving car is seen in Mountain View, Calif., on Sept. 17, 2019. (Andrei Stanescu/iStock)
A Waymo/Google autonomous vehicle drives in Mountain View, Calif., on July 10, 2018. (Andrei Stanescu/iStock)
The Aptiv-Lyft vehicle with autonomous technology drives on the strip Thursday, November 30, 2017 in Las Vegas, Nevada. (Photo by John F. Martin for Aptiv)