Tesla Full Self-Driving feature under NHTSA investigation following fatality, could affect 2.4M vehicles
The National Highway Traffic Safety Administration’s (NHTSA) Office of Defects Investigation (ODI) has launched a new investigation into Tesla’s “Full Self-Driving” (FSD) feature that could affect an estimated 2.41 million vehicles.
ODI says it has identified four Standing General Order (SGO) reports in which a Tesla vehicle crashed after entering an area of reduced roadway visibility with FSD-Beta or FSD-Supervised (also called “Autosteer on City Streets”) engaged.
“In these crashes, the reduced roadway visibility arose from conditions such as sun glare, fog, or airborne dust,” the ODI wrote. “In one of the crashes, the Tesla vehicle fatally struck a pedestrian. One additional crash in these conditions involved a reported injury. The four SGO crash reports are listed at the end of this summary by SGO number.”
ODI opened a “Preliminary Evaluation” of FSD, which is an optional feature in 2016-2024 Models S and X, 2017-2024 Model 3, 2020-2024 Model Y, and the 2023-2024 Cybertruck.
ODI plans to assess:
- “The ability of FSD’s engineering controls to detect and respond appropriately to reduced roadway visibility conditions;
- “Whether any other similar FSD crashes have occurred in reduced roadway visibility conditions and, if so, the contributing circumstances for those crashes; and
- “Any updates or modifications from Tesla to the FSD system that may affect the performance of FSD in reduced roadway visibility conditions. In particular, this review will assess the timing, purpose, and capabilities of any such updates, as well as Tesla’s assessment of their safety impact.”
According to Reuters, there have been at least two fatal accidents involving FSD, including in April when a Tesla Model S hit and killed a 28-year-old motorcyclist in the Seattle area.
In October 2023, Reuters reported that Tesla had disclosed receiving U.S. Justice Department subpoenas related to its FSD and Autopilot systems. Reuters had previously reported, in October 2022, that Tesla was under criminal investigation.
In December 2023, more than 2 million Teslas were recalled over the OEM’s “Autosteer” and “Autopilot” features. The recall stemmed from an NHTSA investigation that began in August 2021 following crashes between Tesla vehicles operating with Autosteer engaged and stationary first-responder vehicles.
Earlier this year, NHTSA announced it had opened an investigation into the recall remedy after identifying 13 fatal crashes, in which “foreseeable driver misuse of the system played an apparent role.”
Tesla’s owner’s manuals provide information about FSD Supervised and caution that it and its associated functions may not operate as intended.
“There are numerous situations in which driver intervention may be needed,” the manual says. “Visibility is critical for Full Self-Driving (Supervised) to operate.”
Examples given include (but are not limited to):
- “Low visibility, such as low light or poor weather conditions (rain, snow, direct sun, fog, etc.) can significantly degrade performance;
- “Interactions with pedestrians, bicyclists, and other road users;
- “Unprotected turns with high-speed cross-traffic;
- “Multi-lane turns;
- “Simultaneous lane changes;
- “Narrow roads with oncoming cars or double-parked vehicles;
- “Rare objects such as trailers, ramps, cargo, open doors, etc. protruding from vehicles;
- “Merges onto high-traffic, high-speed roads;
- “Debris in the road;
- “Construction zones; and
- “High curvature roads, particularly at fast driving speeds.”
“The list above represents only a fraction of the possible scenarios that can cause Full Self-Driving (Supervised) to make sudden maneuvers and behave unexpectedly,” the manual says. “In fact, Model S can suddenly swerve even when driving conditions appear normal and straightforward. Stay alert and always pay attention to the roadway so you can anticipate the need to take corrective action as early as possible. Remember that this is an early access feature that must be used with extra caution.”
The manual also explains how Autosteer works and cautions drivers that there are situations in which the feature is “particularly unlikely to operate as intended,” including:
- “Autosteer is unable to accurately determine lane markings. For example, lane markings are excessively worn, have visible previous markings, have been adjusted due to road construction, are changing quickly (lanes branching off, crossing over, or merging), objects or landscape features are casting strong shadows on the lane markings, or the road surface contains pavement seams or other high-contrast lines.
- “Visibility is poor (heavy rain, snow, fog, etc.) or weather conditions are interfering with sensor operation.
- “A camera(s) or sensor(s) is obstructed, covered, or damaged.
- “Driving on hills.
- “Approaching a toll booth.
- “Driving on a road that has sharp curves or is excessively rough.
- “Bright light (such as direct sunlight) is interfering with the view of the camera(s).
- “The sensors (if equipped) are affected by other electrical equipment or devices that generate ultrasonic waves.
- “A vehicle is detected in your blind spot when you engage the turn signal.
- “Model S is being driven very close to a vehicle in front of it, which is blocking the view of the camera(s).”
The manual notes as well that cameras must be properly calibrated and cleaned regularly.
For example, self-calibration can take up to 100 miles of driving, depending on the road type and condition. Until calibration is completed, Autopilot cannot be engaged, according to the manual.
Featured image: Tesla Cybertruck (Provided by Tesla)