Jury: Tesla’s Autopilot did not fail in California crash
Legal | Technology
A California jury has found that Tesla's Autopilot was not the cause of a crash that injured a Los Angeles-area woman in 2019.
The verdict could be viewed as a win for autonomous driving proponents at a divisive time for self-driving technologies, and it could serve as a precedent for determining who is responsible in the event of a crash involving semi-autonomous technology.
Another key takeaway from the verdict is the pivotal role that following instructions plays in liability, as Tesla argued the driver in this case went against its user manual by deploying Autopilot on city streets.
For collision repairers, the verdict is a reminder of the importance of carefully following OEM instructions when testing, calibrating, and reinitializing systems after a collision.
Justine Hsu, who lives in El Monte, sued the OEM, claiming her 2016 Model S's Autopilot failed to recognize a center median and caused the car to crash on July 6, 2019.
The incident happened while Hsu was driving in Arcadia on a flat road “with no sharp curves on the path” in stop-and-go traffic, according to the complaint.
“The Model S was driving between approximately 25 to 30 miles per hour in the far left lane when the Autopilot failed to recognize the center median,” the legal filing said. “Suddenly and without warning, the Autopilot malfunctioned and the Model S swerved into the center median. The driver’s side tire hit the curb of the median, causing the airbags to deploy. The collision happened so suddenly that HSU had no time to react, but she attempted to shield her face from the airbags by releasing her hands from the steering wheel and positioning them in front of her face.”
Her lawyers said the airbag should never have deployed, given that her car hit the median at an angle from the left, and that when it did go off it "ripped out in a slingshot-like fashion, rather than a plume."
Hsu suffered multiple jaw fractures and lost "multiple teeth" in the crash, her lawyers said, adding that she has undergone three surgeries so far.
The lawsuit said Tesla’s Autopilot was to blame for the crash.
“The defects in the design, manufacture, configuration and [assembly] of the subject vehicle were a substantial factor in causing the vehicle to lose control on the subject roadway and in causing the vehicle to crash,” it said. “The Model S was defective because its design was a substantial factor in causing Hsu’s injuries in the [incident], and because it did not perform as safely as an ordinary consumer would have expected it to perform when used or misused in an intended or reasonably foreseeable way.”
Tesla countered that it was not responsible for the crash, alleging in a court document that Hsu went against its user manual guidance by deploying Autopilot on city streets.
The jury trial, which included testimony from Tesla engineers, concluded last Friday when Hsu was awarded $0 in damages. She had sought $3 million.
Tesla is facing a number of lawsuits related to its advanced driver assistance systems.
In February, Tesla shareholders sued the OEM and its chief executive, Elon Musk, claiming that by overselling autonomous capabilities, the company caused its stock to lose value when those features were linked to crashes and federal investigations.
The federal securities class action lawsuit, filed in San Francisco federal court, lays out how Tesla has been touting its "suite of purportedly advanced driver assistance system (ADAS) features" since 2016.
It claimed that throughout the following years, Tesla made "materially false and misleading statements and/or failed to disclose that:"
- “Significantly overstated the efficacy, viability, and safety of the company’s autopilot and [full self-driving (FSD)] technologies;”
- “Contrary to defendants’ representations, Tesla’s autopilot and FSD technologies created a serious risk of accident and injury associated with the operation of Tesla vehicles;”
- “All the foregoing subjected Tesla to an increased risk of regulatory and governmental scrutiny and enforcement action, as well as reputational harm; and”
- “As a result, the company’s public statements were materially false and misleading at all relevant times.”
Tesla is also the subject of a National Highway Traffic Safety Administration (NHTSA) investigation into fatalities allegedly linked to the OEM's driver assistance features.
According to Reuters and other news outlets, one crash happened on Nov. 24 and involved eight vehicles in San Francisco, including a 2021 Tesla Model S whose driver said the Full Self-Driving feature malfunctioned. The other recent crash involved a 2020 Model 3 in Ohio and resulted in one minor injury.
CNBC and Reuters report that NHTSA is investigating at least 41 crashes involving Tesla vehicles in which automated features, including automatic emergency braking, Autopilot, Full Self-Driving (FSD), and FSD Beta, were in use. Since 2016, 19 crash fatalities have been reported as part of Tesla-related investigations, according to Reuters.
Last June, NHTSA upgraded its Autopilot probe, which covers 830,000 Tesla vehicles and relates to crashes with parked emergency vehicles. The upgrade meant the administration would "extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and … explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver's supervision," NHTSA said at the time.
Models affected by that investigation include 2018-2021 Model 3s, 2014-2021 Model Ss, 2015-2021 Model Xs, and 2020-2021 Model Ys.
Meanwhile, Tesla is under federal and state legal scrutiny, including the California DMV's accusation that the automaker has falsely advertised its Autopilot and FSD features. The accusation is part of complaints the DMV filed with the state's Office of Administrative Hearings.
In a class action lawsuit filed last September in the U.S. District Court for the Northern District of California, Tesla is accused of “making misleading and deceptive statements regarding the company’s advanced driver assistance systems (“ADAS”) technology” for years.
“Tesla has deceived and misled consumers regarding the current abilities of its ADAS technology and by representing that it was perpetually on the cusp of perfecting that technology and finally fulfilling its promise of producing a fully self-driving car,” the complaint states. “Although these promises have proven false time and time again, Tesla and Musk have continued making them to generate media attention, to deceive consumers into believing it has unrivaled cutting-edge technology, and to establish itself as a leading player in the fast-growing electric vehicle market.”
IMAGES
Featured image: Jan. 15, 2022: Close-up of a white Tesla Model 3 driving down a road on Autopilot. (Credit: Aranga87/iStock)