Repairer Driven News

Tesla agrees to recall 54,000 vehicles over ‘rolling stop’ feature in self-driving mode


Tesla is recalling 53,822 vehicles equipped with a test version of its Full Self-Driving software that can allow the vehicle to roll through four-way stop signs, the National Highway Traffic Safety Administration (NHTSA) announced Tuesday.

NHTSA said the recall covers some 2016-2022 Model S and Model X, 2017-2022 Model 3, and 2020-2022 Model Y vehicles.

The agency said Tesla would disable the setting that allows rolling stops through an over-the-air software update.

Tesla makes the Full Self-Driving, or FSD, feature available by monthly subscription as part of its Autopilot or Enhanced Autopilot self-driving systems. Autopilot allows hands-off driving under certain circumstances, but still requires the driver to be paying attention and ready to take control of the vehicle.

Tesla, which disbanded its media relations department, did not comment.

FSD allows drivers to choose among three “profiles,” labeled “Chill,” “Average” and “Assertive.” In both Average and Assertive modes, the vehicle “may perform rolling stops,” Tesla documentation says. In Assertive mode, the vehicle will also “have a smaller follow distance, perform more frequent speed lane changes, [and] will not exit passing lanes….”

According to NHTSA, the feature allows Tesla vehicles to roll through four-way stop signs at speeds up to 5.6 miles per hour, without first coming to a full stop, under certain circumstances.

The idea of a semiautonomous driving system that could deliberately violate traffic laws had generated much discussion in automotive circles in recent weeks. Tesla and NHTSA met on Jan. 10 and Jan. 19 to discuss the issue, the agency said.

On Jan. 20, “a recall determination was voluntarily made to disable the functionality,” NHTSA said.

“The Vehicle Safety Act prohibits manufacturers from selling vehicles with defects posing unreasonable risks to safety, including intentional design choices that are unsafe,” the agency said. “If the information shows that a safety risk may exist, NHTSA will act immediately.”

Regulators and safety advocates have recently focused attention on the potential for misuse of advanced driver assistance system (ADAS) technology, both through deliberate recklessness and misunderstandings about the systems’ capabilities.

The Insurance Institute for Highway Safety (IIHS) announced on Jan. 20 that it will push the industry to adopt adequate safeguards to ensure that drivers of these vehicles keep paying attention to the road.

“Partial automation systems may make long drives seem like less of a burden, but there is no evidence that they make driving safer,” IIHS President David Harkey said in a statement. “In fact, the opposite may be the case if systems lack adequate safeguards.”

The Tesla FSD situation appears to be different, in that the vehicle can behave in a way NHTSA considers unsafe even when the driver is paying attention and using the self-driving system as the OEM intended.

More information

Recall notice (RCAK-22V037-9109.pdf), NHTSA, Feb. 1, 2022

Images

Featured image: The interior of a 2018 Tesla Model S, one of the models subject to Tesla’s recall announced Feb. 1. (Provided by Tesla)
