Repairer Driven News

Research: People more critical if AI culprit in crashes of ‘semi-automated’ vehicles


New research has found that people tend to blame and judge a manufacturer disproportionately, compared with the supervising human, when a vehicle running a Level 2 advanced driver assistance system crashes.

The study, published in a January issue of the journal Risk Analysis and featured in a March PropertyCasualty360 article, examined what authors Peng Liu and Yong Du called “blame attribution asymmetry.”

This involves “a tendency that people will judge the automation-caused crash more harshly, ascribe more blame and responsibility to automation and its creators, and think the victim in this crash should be compensated more,” Liu and Du wrote in the January Risk Analysis paper.

Liu and Du studied people’s responses to two scenarios involving what appear to be the SAE Level 2 advanced driver assistance systems available from automakers today.

Level 2 systems, such as Tesla’s Autopilot, Nissan’s ProPILOT Assist and GM’s Super Cruise, can steer, accelerate and brake on their own but demand the driver supervise at all times and be ready to take over when the vehicle requests it. It looks like autonomy, but it isn’t: the driver always remains responsible, even in situations where they’re permitted to take their hands off the wheel. Not until SAE Level 3 is the machine truly considered the driver, and even then the human must be ready to take over if the vehicle encounters conditions it can’t handle.

“In the eyes of lay people, autonomous agents can be blamed or held responsible for unexpected outcomes,” Liu and Du wrote. “As the final consumers of AVs, their opinions and preferences in these issues should be taken seriously. Given the growing presence of semi-AVs on public roads, the study of how people judge traffic accidents caused by semi-AVs and subsequently attribute blame and responsibility is of great societal importance. Our research aimed to elucidate how people judge traffic accidents caused by human and automation that result in equivalent consequences in semi-AVs and whether and why people ascribe different blame and responsibility to them.”

Liu and Du studied hypothetical scenarios in which a “semi-AV” crashed through the fault of either the vehicle or the driver. However, under current Level 2 setups, and arguably under the definition of “semi-automated” Liu and Du gave test subjects, the human is supposed to be supervising at all times. (“The driver should continuously monitor the vehicle and roads.”)

In the first study, one group of test subjects was told about a “semi-automated vehicle” that makes an error and gets into a crash while also failing to issue a “take-over request.” Another group was told about an incident in which a “sudden event” arises and the system issues the take-over request. “However, owing to errors made by the human driver, the driver did not successfully take over the vehicle,” the experiment told the participants.

In both hypothetical instances, a passenger was injured.

A further batch of subjects was presented with the same two scenarios, except these participants were told the passenger died.

“In Study 1, a hypothetical automation-caused crash was judged to be more severe and less acceptable than an equivalent hypothetical human-caused crash, regardless of crash severity (injury or fatality),” the authors wrote.

The study participants also were more likely to blame the vehicle when the human was actually the culprit than to blame the human when the vehicle was at fault, even though the experiment explicitly told them who had erred.

In a second study, test subjects heard about a situation where a forward collision warning failed to activate while a human was in control of a semi-automated vehicle.

“If the driver had been able to step on the brake and avoid the car ahead immediately, then the driver could have avoided an accident,” the experiment told them. “However, this driver was inattentive at that time and did not brake and avoid the car ahead in a timely manner.”

A second group of test subjects was told about a similar setup, except the semi-automated vehicle was controlling itself. “If the automated driving system had been able to engage the brake and avoid the car ahead immediately, it could have avoided an accident,” the experiment stated. “However, the automated driving system malfunctioned at that time and did not engage the brake and avoid the car ahead nor send a take-over request in a timely manner.”

In both cases, the passenger died.

“In Study 2, the participants ascribed more blame and responsibility to automation and its creators than the human driver who caused an equivalent crash,” Liu and Du wrote. They also found evidence that test subjects felt “a victim in an automation-caused crash should be awarded more compensation than that in a human-caused one.”

The results are notable in that Liu and Du arguably also told all the participants, indirectly, that the human was really the culprit, even though the experiment presented the automation as responsible for the crash at various points in the study. As noted above, the study’s definition of the “semi-automated” vehicle declared, “The driver should continuously monitor the vehicle and roads.”

This human responsibility would also apply in the real world, a point the researchers make as well.

“Arguably, semi-AVs are not fully AVs and their autonomy is limited; meanwhile, a human driver in a semi-AV is supervising the driving and would take over control of the vehicle when requested, and thus, the human should be regarded as the responsible party for unexpected outcomes during the drive,” they wrote. “However, subjective autonomy is more important than objective autonomy, and autonomy and moral responsibility are more matters of perception. Our participants tended to attribute responsibility to automation and blame it and its creators more when it caused a crash.”

Where would a body shop fit in?

Liu and Du’s research and the concept of “blame attribution asymmetry” are interesting when one considers the potential for a body shop’s work to affect the performance of a Level 2 vehicle as the technology spreads throughout the fleet. Will the OEM still be disproportionately blamed when a human third party, the shop, bears culpability? Or will the body shop be? When judges, juries and liability insurers attempt to apportion blame and compensation, these will be important questions.

Liu and Du’s two test scenarios didn’t study situations introducing this additional variable, and Liu didn’t have a hypothesis of what would happen in such a case.

“There is one possibility that a third human party (e.g., Repairer) could be blamed for the unwanted outcome,” Liu wrote in a March 11 email when contacted by Repairer Driven News about the topic. “I do not have any data or sense about it. But I do think it is an (important) topic that is completely ignored in current research about blame and responsibility attribution related to automated vehicles.”

Liu and Du’s work on “blame attribution asymmetry” harming the AI’s manufacturer is also interesting in light of another collision industry issue: Vehicle owners blame the automaker and switch brands when they get a bad auto body repair, studies have found.

In 2015, then-Ford collision marketing manager Mark Mandl and FCA collision marketing manager Erica Schaefer cited data holding that a bad repair at a body shop could cost an auto brand a customer’s loyalty.

FCA has found that 60 percent of customers who received an incomplete or inadequate repair sold or traded that car within a year, Sean Carey of SCG Management Consultants said in 2016. The OEM found 63 percent of those dissatisfied customers switched brands, he said. Rob Johnston of Ford global collision marketing told a 2019 VeriFacts Guild 21 call that 27 percent of customers who got rid of their vehicles within a year and a half of an accident did so because of damage from the repair. Another 21 percent did so because of the quality of the repair, he said. He said “much deeper” research indicates those customers ditching their vehicles don’t necessarily buy Fords.

Toyota in April 2020 reported that research with IHS Markit found collision work represented the No. 1 driver of brand loyalty. Toyota national manager of service and collision operations George Irving Jr. said this effect “far exceeded” that of a service repair order, which also had a positive effect on brand loyalty. However, a warranty repair order “doesn’t work” for brand loyalty, he said.

Combine this OEM research on brand retention with Liu and Du’s work on backlash over the crash itself, and OEMs might have an even greater incentive to promote and manage certified facilities.

More information:

“Blame Attribution Asymmetry in Human–Automation Cooperation”

Peng Liu and Yong Du in Risk Analysis, Jan. 13, 2021

“The road ahead for semi-autonomous vehicle crash liability”

Peng Liu in PropertyCasualty360, March 10, 2021

SAE J3016 standard on levels of autonomy

SAE, June 15, 2018

Images:

In a January 2021 Risk Analysis paper, researchers Peng Liu and Yong Du describe “blame attribution asymmetry” encountered among test subjects when evaluating crashes caused by an artificial intelligence and similar crashes caused by a human. (petovarga/iStock)

SAE Level 2 and Level 3 autonomy involve human-machine handoffs, and the human is always considered the driver in Level 2. (baona/iStock)

Various sensors are necessary for advanced driver assistance systems and autonomous vehicles to work properly. (NatalyaBurova/iStock)
