The FSD Beta software allows equipped Teslas “to exceed speed limits or travel through intersections in an unlawful or unpredictable manner,” increasing the risk of a crash, NHTSA said.
The recall concerns steering and electrical system issues in 2016-2023 Model S and Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles that have FSD Beta software installed or awaiting installation.
“The FSD Beta system may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution,” the recall notice states. “In addition, the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver’s adjustment of the vehicle’s speed to exceed posted speed limits.”
NHTSA said in December that it was investigating two more fatalities allegedly related to Tesla’s driver assistance features. According to Reuters and other news outlets, one crash happened on Nov. 24 and involved eight vehicles in San Francisco, including a 2021 Tesla Model S whose driver said the FSD feature malfunctioned. The other recent crash, involving a 2020 Model 3 in Ohio, resulted in one minor injury.
CNBC and Reuters reported at the time that NHTSA is investigating at least 41 crashes involving Tesla vehicles where automated features including Automatic Emergency Braking (AEB), Autopilot, FSD, and FSD Beta were involved. Since 2016, 19 crash fatalities have been reported as part of Tesla-related investigations, according to Reuters.
In June, NHTSA upgraded its Autopilot probe of 830,000 Tesla vehicles involving crashes with parked emergency vehicles. The upgrade meant the administration would “extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and … explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision,” NHTSA said at the time.
The engineering analysis is still open. NHTSA said on Feb. 15 that it “plans to continue its assessment of vehicle control authority, driver engagement technologies, and related human factors considerations.”
Tesla’s website states, “Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”
That description coincides with how NHTSA describes FSD Beta in the recall notice, as an SAE Level 2 driver support feature. NHTSA says the feature “can provide steering and braking/acceleration support to the driver under certain operating limitations.”
“With FSD Beta, as with all SAE Level 2 driver support features, the driver is responsible for operation of the vehicle whenever the feature is engaged and must constantly supervise the feature and intervene (e.g., steer, brake or accelerate) as needed to maintain safe operation of the vehicle,” the notice states.
To fix the issues, Tesla says it will release an over-the-air (OTA) software update, free of charge. Owner notification letters are expected to be mailed by April 15. Owners or repairers who have questions or concerns about the recall can contact Tesla’s customer service at 1-877-798-3752 with recall number SB-23-00-001 or NHTSA’s Vehicle Safety Hotline at 1-888-327-4236.
On Thursday, CEO Elon Musk responded to the recall on Twitter: “The word ‘recall’ for an over-the-air software update is anachronistic and just flat wrong!”
During Tesla’s Q4 2022 earnings call, held on Jan. 25, he said FSD had been deployed for city streets to roughly 400,000 customers in North America.
“This is a huge milestone for autonomy as FSD Beta is the only way any consumer can actually test the latest AI-powered autonomy,” he said. “And we’re currently at about 100 million miles of FSD outside of highways. And our published data shows that improvement in safety statistics, it’s very clear. So we would not have released the FSD Beta if the safety statistics were not excellent.”
New York Times technology reporter Cade Metz recently spoke to former NHTSA employee Missy Cummings, who said that last fall she sent a telling two-page data analysis of nearly 400 crashes involving advanced driver assistance systems (ADAS) in Tesla, General Motors, and Ford vehicles, specifically Tesla’s Autopilot and GM’s Super Cruise. The Times article did not identify which Ford ADAS was analyzed.
Cummings said the report “indicated that drivers were becoming too confident in the systems’ abilities and that automakers and regulators should restrict when and how the technology was used.”
“Car companies — meaning Tesla and others — are marketing this as a hands-free technology,” she said. “That is a nightmare.”
The analysis Cummings sent showed that when vehicles using the systems were involved in fatal crashes, they were traveling over the speed limit 50% of the time, and in crashes that resulted in serious injuries, they were speeding 42% of the time, according to the Times. In crashes that didn’t involve ADAS, the corresponding figures were 29% for fatal crashes and 13% for crashes causing serious injuries.
Featured image: Tesla Model S, model year not specified. (Credit: Tesla)