Hyundai is getting closer to deploying Level 3 autonomous driving technology, an executive said at a recent industry event, signaling the automaker plans to join other OEMs in further advancing self-driving features.
“We’re very close, and we’re looking to make sure we do it correctly,” Brian Latouf, the OEM’s global chief safety officer, said during last week’s Automotive News Canada Congress.
Current Level 2 technology, sometimes called semi-autonomous, can drive the vehicle through adaptive cruise control and lane centering but requires an attentive driver who is always ready to take control.
With Level 3 technology, the vehicle takes responsibility for driving on specific stretches of roadway and under certain conditions.
Latouf said Hyundai is looking to roll out the next level of automation in South Korea first before introducing it to the North American market.
“In the Korean market, we’re looking at introducing a Level 3 that’s kind of a highway drive pilot type of system that is on just highways alone and limiting certain speeds,” he said, adding the technology is not yet in production.
Before introducing the technology to market, Latouf said Hyundai will work to enhance vehicle safety to ensure the autonomous features are ready to hit roadways responsibly.
“There’s a lot of testing that’s happening and the team is looking carefully at that,” he said. “We have a very structured process to look across our different data streams to say: ‘Hey, are we having some power steering failures that could create lateral risk and perhaps crashes?’ and then we act upon it. So, it’s a very good, technically based data analytics investigation recall decision process.”
Hyundai isn’t the first to announce further advancements in vehicle automation. Mercedes-Benz became the first to offer Level 3 autonomy in the U.S. when its self-driving system, Drive Pilot, was approved for use in Nevada last month.
Mercedes said during the Consumer Electronics Show (CES) in Las Vegas that the state’s Department of Motor Vehicles approved its application and was preparing the certificate of compliance. The DMV will allow Mercedes to self-certify that the Level 3 autonomous system is safe for use on public roads. Mercedes is also seeking approval for Drive Pilot use in California.
Meanwhile, General Motors and Ford want to manufacture up to 2,500 Level 4 autonomous vehicles a year under temporary federal motor vehicle safety standards (FMVSS) exemptions both are seeking from the National Highway Traffic Safety Administration (NHTSA).
The exemptions would be for two years. Reuters reports that GM first sought approval in 2018 for a planned 2019 release of the vehicles; that target wasn’t met, and production is now slated to begin in spring 2023.
The road toward greater automation hasn’t been a smooth one, with NHTSA warning last week that Tesla’s Full Self-Driving software may cause crashes when it issued a recall on 362,758 vehicles spanning the 2016-2023 model years.
The FSD Beta software allows equipped Teslas “to exceed speed limits or travel through intersections in an unlawful or unpredictable manner,” increasing the risk of a crash, NHTSA said.
The recall has to do with steering and electrical system issues in 2016-2023 Model S and Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with FSD Beta software or pending installation.
“The FSD Beta system may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution,” the recall notice states. “In addition, the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver’s adjustment of the vehicle’s speed to exceed posted speed limits.”
NHTSA said in December that it was investigating two more fatalities allegedly related to Tesla’s driver assistance features. According to Reuters and other news outlets, one crash happened on Nov. 24 and involved eight vehicles in San Francisco including a 2021 Tesla Model S, the driver of which said the FSD feature malfunctioned. The other recent crash involved a 2020 Model 3 in Ohio and resulted in one minor injury.
CNBC and other media outlets reported at the time that NHTSA is investigating at least 41 crashes involving Tesla vehicles where automated features including Automatic Emergency Braking (AEB), Autopilot, FSD, and FSD Beta were involved. Since 2016, 19 crash fatalities have been reported as part of Tesla-related investigations, according to Reuters.
Setbacks in automation
Tesla’s latest recall highlights a bigger issue with the ways partial automation systems are designed and advertised, said the Insurance Institute for Highway Safety (IIHS).
“The partial automation systems on vehicles today require the driver to be fully engaged in the driving task at all times and retake control when necessary,” said David Harkey, IIHS president. “Institute research shows that drivers who use partial automation on a regular basis often treat their vehicles as fully self-driving despite widespread warnings and numerous high-profile crash reports. However, none of the current systems is designed to replace a human driver or to make it safe for a driver to perform other activities that take their focus from the road.”
He said the IIHS has been working on a safeguard rating system to address how effectively partially automated vehicles can keep drivers engaged.
“Fully attentive drivers could prevent their vehicles from doing the things cited in the recall,” Harkey said. “The main problems for Tesla’s system include the misleading names of ‘Full Self Driving’ and ‘Autopilot’ and the fact that it does not have adequate safeguards to ensure drivers will pay full attention to the road.”
Ford’s BlueCruise hands-free highway driving system, in operation in a Mustang Mach-E. (Provided by Ford)