Repairer Driven News

Researchers predict more autonomous vehicles by 2030, better ADAS tech needed


ABI Research has reviewed the current state of autonomous driving technology within the automotive industry in conjunction with artificial intelligence (AI), high-performance computing, mapping, and location intelligence to find that, by 2030, the majority of vehicles on the road will offer SAE Level 2+ or higher features.

ABI’s latest research paper, “A Scalable Approach to ADAS and Autonomous Driving,” states that 69.3% of vehicles will be so equipped.

“Different autonomous applications vary in features and the level of driver involvement,” said James Hodgson, ABI Research smart mobility and automotive research director and author of the whitepaper, in a news release. “Some demand constant supervision, while others permit manual, visual, or cognitive disengagement.

“Active safety systems offer limited support, keeping the driver fully in control. In contrast, driverless vehicles eliminate the need for human operators by handling all driving tasks autonomously. Therefore, the automotive industry should adopt a scalable approach to their active safety, semi-autonomous, and fully driverless applications. Maximizing the re-use of components between different feature/disengagement combinations will yield many benefits to the market.”

The whitepaper provides an overview of each SAE Level and explores the technology implications of ADAS and active safety, 360-degree perception, high-performance compute, and the redundancy in perception, processing, and software. It also discusses the core role of safety rating agencies in making cars safer and driving the adoption of active safety.

To consumers, autonomous vehicle (AV) features and driver supervision combinations seem radically different in terms of their value, cost, and overall impact on their personal mobility experience, Hodgson wrote.

“However, from an architecture perspective, these applications share a common set of enabling technologies, with additional components added to enable more features and greater redundancy in the more comprehensive autonomous vehicle implementations.

“Therefore, the automotive industry should adopt a scalable approach to their active safety, semi-autonomous, and fully driverless applications.”

Maximizing the re-use of components between different feature and disengagement combinations will result in the following benefits for the market, he added:

    • Cost reduction via “a common set of enabling technologies powering active safety, supervised autonomous driving, and unsupervised autonomous driving;” and
    • A ramped-up consumer experience, as deploying common components across advanced driver assistance system (ADAS) technologies, supervised autonomous driving, and unsupervised autonomous driving builds familiarity with, and prepares consumers for, future unsupervised autonomous driving.

The ADAS approach most widely adopted by automakers is camera and radar sensor fusion for more robust perception, according to Hodgson’s research, but he noted that camera sensors struggle in extreme lighting and weather conditions.

“In contrast, radar sensors have relatively poor resolution, but continue to perform in the same circumstances that compromise camera performance,” he wrote. “Radar sensors also deliver useful inputs such as range and relative velocity.”

Alternatively, stereovision systems that deliver high-performance ADAS, such as Subaru’s EyeSight, build a 3D model of the environment around the vehicle, much as human vision determines depth and range, according to Hodgson.

Increasingly, OEMs are bringing Level 2 systems to market that use compute platforms and software originally conceived for Level 3 and Level 4 systems, Hodgson noted. These enable features such as automatic lane changes, highway exits, and highway speed targeting with or without human input, as well as hands-free city driving.

Higher levels of unsupervised automated driving include lidar, imaging/HD radar, and duplicate AV systems-on-a-chip (SoCs), according to Hodgson’s research.

Over time, some additional sensor technologies, particularly imaging/HD radar, are expected to be incorporated into Level 2+ systems to further improve their safety. Unlike active safety systems, which tend to be shaped by safety rating agency testing protocols, the success of Level 2+ systems will depend on their real-world performance, creating an opportunity for imaging/HD radar in the future.

“Overall, a Level 2+ strategy takes advantage of the relatively lower costs, lower risk, and broader regulatory accommodation of supervised automation to kick-start the autonomous vehicle revolution.”

Hodgson added that while Level 4 vehicles incorporate more technologies, they aren’t expected to be available for consumers to buy in the short or medium term.

“[F]ully driverless vehicles will be deployed in a robotaxi context, with fleet operators employing as few vehicles as possible to fulfill the mobility demand,” he wrote. “Driverless vehicle deployments in support of people transit on public roads are still highly limited, and expected to remain so until legislation evolves to accommodate the introduction of driverless vehicles at scale.”

Robotaxi company Cruise has been under fire since 2022, when the National Highway Traffic Safety Administration (NHTSA) launched an investigation of its vehicles.

Several incidents occurred last year in which Cruise robotaxis allegedly caused collisions and narrowly missed hitting pedestrians.

One pedestrian-involved incident that occurred in August and was recently reported on by the NBC Bay Area Investigative Unit involved a 7-year-old boy. He was nearly hit in a crosswalk, according to California DMV records obtained by NBC.

The boy’s father, Sascha Retailleau, told NBC he was walking with his wife and son in their Mission District neighborhood when they crossed 20th Street near the intersection of York Street on the evening of Aug. 14.

“[The car] was fully stopped, and then it started when he had gotten maybe a third of the way or halfway across the intersection,” he said, according to the NBC article. “It started to accelerate towards us like we weren’t there.”

Retailleau added that the AV swerved as it approached his son, who had to rush ahead to avoid being hit. Retailleau later reported the near miss to the DMV, which told NBC it couldn’t disclose any details from its ongoing investigation, according to the article.

A Cruise vehicle also allegedly nearly hit two women and two children in a crosswalk in the San Francisco Pacific Heights neighborhood.

NBC said, “Cruise acknowledged its vehicle was involved, but declined to comment further, citing an ongoing federal investigation looking into the incident.

“As for the similar near-miss reported by Retailleau, Cruise said its records show none of its driverless cars traveled through the specific intersection around the time Retailleau and his family were crossing the street.”


Hodgson and ABI have found that despite its wide use in automotive ADAS and “robust performance” in poor lighting and weather, conventional radar configurations have several weaknesses, including poor resolution, weak pedestrian detection, and false detection of objects or pedestrians adjacent to vehicles.

The solution, Hodgson wrote, is new-generation radar transceivers that produce higher data volumes and significantly increase resolution.

Another option is using lidar to complement sensor sets rather than relying on camera-only strategies, an approach increasingly seen in Greater China’s EV market, according to ABI.

“Overall, the only feasible approach to delivering on feature-rich and unsupervised automation is to construct today’s supervised autonomous applications on an architecture that has the potential to scale by adding technologies that will replace the supervisory role that human drivers play today,” Hodgson said.

Lidar is also a solution, he wrote.

“Overall, lidar delivers a blend of range detection, velocity measurement, and relatively robust performance in poor lighting and weather conditions, while also delivering the necessary resolution for object classification, semantic segmentation, and free space detection,” Hodgson wrote. “Therefore, lidar is regarded as an essential tertiary sensor modality or ‘third opinion’ in unsupervised automation.”

Safety is an often-deliberated topic when it comes to ADAS, EVs, and AVs as the automotive, insurance, and aftermarket industries as well as consumers and regulators ease into emerging technologies.

In a Feb. 2 article on the subject of AV safety, Wards Auto noted another Cruise incident, from October 2023, in which a robotaxi dragged a pedestrian, who had been struck by another, human-driven vehicle, 20 feet down a San Francisco street.

“Accidents like this, and the well-publicized Tesla Autopilot crashes, undermine public confidence in connected and autonomous vehicles,” the article states. “Allegations, whether true or not, of covering up how an accident occurred won’t help the connected autonomous vehicle (CAV) cause.”

Greg Brannon, American Automobile Association (AAA) automotive engineering director of research, told Wards, “People are interested in advanced driver assistance systems (ADAS) for their vehicles but less interested in fully autonomous technologies. Consumers are fearful of the unknown, and with the number of high-profile crashes that have occurred from overreliance on current vehicle technologies, this isn’t entirely surprising.”

To overcome consumer CAV safety concerns, AAA told Wards it wants to partner with automakers for greater consistency in vehicle and public safety.

“Automakers and mobility providers that push autonomous driving technology without addressing what people want will fail,” the Wards article states. “…So, rather than pushing, automakers need to pull them toward adopting autonomous driving tech.”


Featured image: Stock illustration of ADAS and AV technology. (Credit: gremlin/iStock)
