As more automakers move toward partially autonomous driving systems, the Insurance Institute for Highway Safety (IIHS) announced Thursday it will push the industry for adequate safeguards to make certain that the drivers of these vehicles are still paying attention to the road.
IIHS announced that it is developing a new ratings program to evaluate these safeguards, and expects to issue its first set of ratings in 2022. It also warned that no OEM yet meets all of its pending criteria.
“Partial automation systems may make long drives seem like less of a burden, but there is no evidence that they make driving safer,” IIHS President David Harkey said in a statement. “In fact, the opposite may be the case if systems lack adequate safeguards.”
The new ratings program is based on “anecdotal observation of how some users are intentionally misusing these systems, and several high-profile crashes that have been linked to driver inattention,” Joe Young, director of media relations for IIHS, told Repairer Driven News.
The safeguards will be rated good, acceptable, marginal or poor. In general, to earn a good rating, IIHS said, systems will need to make certain that the driver’s eyes are on the road and their hands either on the wheel or ready to grab it at all times.
Semi-autonomous systems will also have to employ an escalating series of alerts, and be prepared to initiate appropriate emergency procedures, including bringing the vehicle to a stop on the side of the road, if the driver does not react.
‘Misleading messaging’ on capabilities
The institute noted that, in spite of what it called “misleading messaging” from some OEMs, truly autonomous vehicles are not yet available to American consumers. Instead, the systems on the market offer “partial automation,” able to assist the driver by steering, accelerating and braking on their own, but never fully in control of the vehicle.
To date, most systems, like Tesla’s Autopilot, Mercedes-Benz’s Driving Assistance Package and General Motors’ Super Cruise and upcoming Ultra Cruise, have been semi-autonomous, requiring the driver to be watching the road at all times and ready to take the wheel or hit the brakes.
GM’s Super Cruise, for instance, employs a “Driver Attention Camera” and a display directly in the driver’s line of vision that’s meant to help drivers stay focused on the road.
Advanced Driver Assistance System (ADAS) capabilities, employing radar, cameras and lidar for lane-departure warning and lane keeping, adaptive cruise control (ACC) and automatic emergency braking (AEB), have been presented as both safety features and necessary components for autonomous driving.
Volvo has announced that its new Ride Pilot system, to be included in the flagship EV the OEM plans to launch in 2022, will at least approach Level 3 capability, in which the car, rather than the human, is doing the driving. “The name ‘Ride Pilot’ implies what the driver can expect: when the car is driving on its own, Volvo Cars takes responsibility for the driving, offering the driver comfort and peace of mind,” the OEM has said.
Ride Pilot will not be made available for use, Volvo said, until it has received all necessary regulatory approval and passed the OEM’s safety tests. It has said that the feature will be launched first in California.
Misuse by drivers
IIHS said it is encouraging the adoption of systems to prevent both the intentional and unintentional misuse of self-driving capabilities. Existing technology can monitor a person’s gaze, head posture or hand position to ensure they are consistent with someone who is actively engaged in driving, it said.
Other safety advocates have also begun paying attention to the need for driver monitoring, IIHS said. For instance, it noted that Consumer Reports has announced it will begin awarding points for partially automated driving systems, but only if they have adequate driver monitoring systems, and will factor in IIHS safeguard ratings once they become available.
IIHS said it cannot provide precise timing for its testing program, because ongoing supply chain issues have made it harder to obtain vehicles for testing.
The Institute took some OEMs to task for “over[selling] the capabilities of their systems, prompting drivers to treat the systems as if they can drive the car on their own.” In some extreme cases, the Institute noted, drivers have been seen watching videos, playing games on their cellphones or even taking naps while speeding down the highway.
It cited a high-profile example from 2018, in which a Tesla Model X driver was killed when the Autopilot-engaged vehicle accelerated into a collapsed safety barrier. The National Transportation Safety Board found that the driver was most likely distracted by a cellphone video game at the time.
Unintentional misuse is also an issue, said IIHS Research Scientist Alexandra Mueller, who is in charge of the new ratings program.
“The way many of these systems operate gives people the impression that they’re capable of doing more than they really are,” Mueller said in a statement. “But even when drivers understand the limitations of partial automation, their minds can still wander. As humans, it’s harder for us to remain vigilant when we’re watching and waiting for a problem to occur than it is when we’re doing all the driving ourselves.”
In October, Los Angeles County prosecutors filed two counts of vehicular manslaughter against the driver of a Tesla on Autopilot that ran a red light and struck another car, killing its two occupants in 2019. It is believed to be the first time a driver has been charged with a felony for a fatal crash involving a semi-autonomous driving system.
Drivers taking risks
In a recent interview about the top 5 trends impacting P&C insurance in 2022, CCC industry analyst Susanna Gotsch told Repairer Driven News that early data has shown that self-driving technology may be encouraging drivers to take on more risk.
“There needs to be more work around the human machine interface,” Gotsch said. She said some drivers may not be aware that, even when the self-driving systems are operating, they still need to be ready to take over at a moment’s notice.
“That appears to be the biggest challenge right now,” she said. “The liability always lies with the driver, because the technology is structured in a way that says you, as the driver, are always in control, you are responsible for the operation of this vehicle. Even if you’re using the system, you have to be able to take over from that technology very quickly, in a matter of seconds.”
Gotsch noted one recent example of the technology’s limitations, when a vehicle with ACC and lane-keeping assist was being tested in Virginia. When the lane markings disappeared, “the car started to follow it off the road,” she said. “The driver in that particular case was able to take over very quickly because they were in a test mode, but your average driver might not be able to do so.”
IIHS said its ratings would not address how well individual ADAS systems’ cameras or radar sensors identify obstacles, which it noted are also factors that could contribute to crashes.
IIHS ratings criteria
In its announcement, the Institute sketched out its ratings criteria. To earn a good rating, a system will have to use multiple alerts to remind the driver to return their eyes to the road and their hands to the wheel, after they’ve neglected these duties for “too long.”
“Evidence shows that the more types of alerts a driver receives, the more likely they will notice them and respond. These alerts must begin and escalate quickly,” IIHS said. They could include chimes, vibrations, brake pulses or tugs on the driver’s seat belt.
If the driver doesn’t respond to the alerts, the system should slow the vehicle to a crawl, or a full stop, and send a signal to a manufacturer “concierge” who can call emergency services.
“Once this escalation occurs, the driver should be locked out of the system for the remainder of the drive, until the engine is switched off and started again,” IIHS said.
Requirements for lane-keeping and lane-changing technologies are also included. “All automated lane changes should be initiated or confirmed by the driver, for instance. When traffic ahead causes ACC to bring the vehicle to a complete stop, it should not automatically resume if the driver is not looking at the road or the vehicle has been stopped for too long. And the lane centering feature should encourage the driver to share in the steering rather than switching off automatically whenever the driver adjusts the wheel, which effectively discourages them from participating in the driving,” IIHS said.
Finally, the Institute said, systems should not let the driver use partial automation features when their seat belt is unfastened, or when AEB or lane-departure prevention is disabled.
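The escalation sequence IIHS describes — alerts that intensify the longer the driver is inattentive, then an emergency stop, then a lockout until the engine is restarted — can be pictured as a simple state machine. The sketch below is purely illustrative; the class names, alert stages and timing thresholds are hypothetical assumptions, not IIHS specifications or any automaker’s implementation:

```python
from enum import Enum, auto

class Stage(Enum):
    """Hypothetical escalation stages loosely modeled on the IIHS criteria."""
    ATTENTIVE = auto()
    VISUAL_ALERT = auto()     # e.g., a dashboard warning
    AUDIBLE_ALERT = auto()    # e.g., a chime
    HAPTIC_ALERT = auto()     # e.g., a seat belt tug or brake pulse
    EMERGENCY_STOP = auto()   # slow the vehicle, notify a concierge
    LOCKED_OUT = auto()       # partial automation disabled until restart

class AttentionMonitor:
    """Illustrative sketch: alerts escalate with continuous inattention."""

    # Seconds of continuous inattention before each stage (made-up values,
    # checked from the longest threshold down so the highest stage wins)
    THRESHOLDS = [
        (15.0, Stage.EMERGENCY_STOP),
        (10.0, Stage.HAPTIC_ALERT),
        (5.0, Stage.AUDIBLE_ALERT),
        (2.0, Stage.VISUAL_ALERT),
    ]

    def __init__(self):
        self.inattentive_seconds = 0.0
        self.stage = Stage.ATTENTIVE

    def update(self, driver_attentive: bool, dt: float) -> Stage:
        """Advance the monitor by dt seconds of (in)attention."""
        if self.stage is Stage.LOCKED_OUT:
            return self.stage  # lockout persists until the engine restarts
        if driver_attentive:
            self.inattentive_seconds = 0.0
            self.stage = Stage.ATTENTIVE
            return self.stage
        self.inattentive_seconds += dt
        for threshold, stage in self.THRESHOLDS:
            if self.inattentive_seconds >= threshold:
                self.stage = stage
                break
        return self.stage

    def vehicle_stopped(self):
        """Once the emergency stop completes, lock out partial automation."""
        if self.stage is Stage.EMERGENCY_STOP:
            self.stage = Stage.LOCKED_OUT
```

Because the lockout state ignores further attention updates, regaining focus after an emergency stop does not re-enable the system — mirroring the IIHS requirement that the driver stay locked out for the remainder of the drive.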
The NTSB, which has investigated several crashes of production and prototype vehicles equipped with automated driving systems, has previously recommended that federal agencies “develop a performance standard for driver monitoring systems and mandate their implementation.”
IIHS had signaled its concerns in June 2020, when Harkey warned that driver-assist technologies “have the potential to create new risks.”
IIHS: “IIHS creates safeguard ratings for partial automation,” Jan. 20, 2022
IIHS: “New studies highlight driver confusion about automated systems,” June 20, 2019
NTSB: “Tesla crash investigation yields 9 NTSB safety recommendations,” Feb. 25, 2020
Lead image: A Tesla equipped with Autopilot. (Provided by IIHS)
A Cadillac equipped with GM’s Super Cruise driver assistance package. (Provided by IIHS)
A representation of an ADAS-equipped vehicle on a highway. (Provided by NHTSA)