Repairer Driven News

Researchers find low-cost malicious attacks can affect AV operations, most U.S. drivers won’t ride driverless

Market Trends | Technology

Researchers at the University of California, Irvine say they’ve demonstrated that multicolored stickers applied to stop or speed limit signs can confuse self-driving vehicles and cause unpredictable, possibly hazardous, operations.

According to a UC Irvine press release, the researchers discovered real-world implications of a previous theory that “low-cost and highly deployable malicious attacks can make traffic signs undetectable to artificial intelligence algorithms in some autonomous vehicles while making nonexistent signs appear out of nowhere to others.”

They found that both types of attacks can cause vehicles to ignore road commands, trigger unintended emergency braking, or commit speeding and other traffic violations.

In February, researchers from UC Irvine’s Donald Bren School of Information & Computer Sciences described their discovery in a presentation at the Network and Distributed System Security Symposium in San Diego.

The study included three artificial intelligence attack designs and was the first large-scale evaluation of traffic sign recognition systems in top-selling consumer vehicle brands, according to UC Irvine.

“Waymo has been delivering more than 150,000 autonomous rides per week, and there are millions of Autopilot-equipped Tesla vehicles on the road, which demonstrates that autonomous vehicle technology is becoming an integral part of daily life in America and around the world,” said co-author Alfred Chen, UC Irvine assistant professor of computer science, in the release. “This fact spotlights the importance of security since vulnerabilities in these systems, once exploited, can lead to safety hazards that become a matter of life and death.”

The lead author of the study, Ningfei Wang, now a research scientist at Meta, performed the work as a doctoral student in computer science at UC Irvine. He said his team’s attack vectors of choice were stickers with swirling, multicolored designs meant to confuse the AI algorithms used for traffic sign recognition in driverless vehicles.

“These stickers can be cheaply and easily produced by anyone with access to an open-source programming language such as Python and image processing libraries,” Wang said, in the release. “Those tools combined with a computer with a graphics card and a color printer are all someone would need to foil TSR [traffic-sign recognition] systems in autonomous vehicles.”

He added that the research found that spatial memorization, a design common to many of today’s commercial TSR systems, makes removing a sign from the vehicle’s view more difficult but also makes spoofing a fake stop sign “much easier than we expected.”

Chen said that by focusing on a small subset of existing research, his group uncovered various broken assumptions, inaccuracies, and false claims.

“We believe this work should only be the beginning, and we hope that it inspires more researchers in both academia and industry to systematically revisit the actual impacts and meaningfulness of such types of security threats against real-world autonomous vehicles,” Chen said, in the release. “This would be the necessary first step before we can actually know if, at the society level, action is needed to ensure safety on our streets and highways.”

The majority of Americans also still aren’t comfortable with riding in driverless vehicles.

According to AAA’s latest survey on autonomous vehicles, the percentage of U.S. drivers who are comfortable with autonomous vehicles has risen from 9% last year to 13%, but six in 10 remain afraid of them.

Drivers continue to prioritize enhancing vehicle safety systems over the development of self-driving technology, with interest in self-driving decreasing from 18% in 2022 to 13% this year, AAA said.

Seventy-eight percent of respondents said they prioritize advancements in safety systems as a top vehicle technology initiative, according to an AAA press release about the survey results. Excitement surrounding new vehicle styles ranked low, with 24% of drivers viewing it as important, and enthusiasm for the development of self-driving vehicles was lower still at 13%.

“Most drivers want automakers to focus on advanced safety technology,” said Greg Brannon, AAA’s automotive engineering director, in the release. “Though opinions on fully self-driving cars vary widely, it’s evident that today’s drivers value features that enhance their safety.”

AAA found that 74% of drivers were aware of robotaxis, yet 53% said they would not choose to ride in one, with opinions varying by demographic. Millennial and Generation X drivers are more likely to ride in one than Baby Boomer drivers, but even among younger drivers, most say they wouldn’t ride in a robotaxi.

“Collectively, interest in advanced driver assistance features (ADAS) continues to remain high,” the release states. “The survey found that 64% of U.S. drivers would ‘definitely’ or ‘probably’ want Automatic Emergency Braking (AEB) on their next vehicle, 62% would want Reverse Automatic Emergency Braking, and 59% Lane Keeping Assistance. AAA believes that to maintain a growing interest in these features, the performance and naming of these systems must adequately reflect the intended benefits and capabilities of the systems.”

Images

Featured stock image credit: Scharfsinn86/iStock

Researchers in the Donald Bren School of Information & Computer Sciences at UC Irvine have demonstrated that traffic sign recognition systems in autonomously driven vehicles can be tricked into either seeing nonexistent roadside commands or not seeing actual ones, leading to aberrant and potentially dangerous driving behavior. (Ningfei Wang/UC Irvine)
