Tesla has been accused of falsely advertising its Autopilot and Full Self-Driving advanced driver assistance system (ADAS) features in California.
Complaints filed with the state’s Office of Administrative Hearings by the Department of Motor Vehicles, and obtained by The Los Angeles Times, state that Tesla “made or disseminated statements that are untrue or misleading, and not based on facts.”
The LA Times reports that the complaints specifically reference the following, which is listed on Tesla’s website: “All you will need to do is get in and tell your car where to go. If you don’t say anything, your car will look at your calendar and take you there as the assumed destination. Your Tesla will figure out the optimal route, navigating urban streets, complex intersections and freeways.”
A DMV spokesperson told the publication on Friday, “the DMV will ask that Tesla will be required to advertise to consumers and better educate Tesla drivers about the capabilities of its ‘Autopilot’ and ‘Full Self-Driving’ features, including cautionary warnings regarding the limitations of the features, and for other actions as appropriate given the violations.”
This isn’t the first time Tesla has come under fire for misleading the public about its ADAS features, nor the first time the features themselves have been reported to malfunction.
In June, the National Highway Traffic Safety Administration (NHTSA) upgraded its preliminary evaluation of Autopilot, which began in August 2021, to an engineering analysis. The evaluation was “motivated by an accumulation of crashes in which Tesla vehicles, operating with Autopilot engaged, struck stationary in-road or roadside first responder vehicles tending to pre-existing collision scenes.”
The upgrade means the administration will “extend the existing crash analysis, evaluate additional data sets, perform vehicle evaluations, and to explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks by undermining the effectiveness of the driver’s supervision.”
The Associated Press and several other news outlets reported in May that NHTSA is also investigating whether a partially automated driving system was involved in a Tesla crash that killed three people in California. Autopilot has also come under scrutiny by NHTSA for crashing into emergency vehicles parked along roadways and for braking for no apparent reason.
For Tesla vehicles to qualify as automated driving systems rather than driver assistance, they would need to reach at least SAE Level 3, at which the person in the driver’s seat isn’t required to supervise the system continuously, though at Level 3 they must still take over when the vehicle requests it, according to SAE International. NHTSA considers Teslas to be SAE Level 2, which means the person in the driver’s seat must supervise steering, braking, and acceleration at all times and be prepared to take over.
Tesla Chief Executive Elon Musk claimed on Twitter in January that Full Self-Driving hasn’t caused any injuries or deaths in any Tesla crashes, but The LA Times found that at least eight crash reports submitted to federal safety regulators by Tesla owners state otherwise.
In February, Tesla recalled 53,822 model year 2016-2022 Model S and Model X, 2017-2022 Model 3, and 2020-2022 Model Y vehicles equipped with a test version of its Full Self-Driving software because the software can allow the vehicle to roll through four-way stop signs. The agency said Tesla would disable the setting that allows rolling stops through an over-the-air (OTA) software update.
To clear up confusion about the functionality of ADAS features, a coalition made up of AAA, Consumer Reports, J.D. Power, the National Safety Council, PAVE, and SAE International announced in July its desire for OEMs to adopt universal ADAS terms.
The coalition’s set of updated and expanded terms includes six major categories — collision warning, collision intervention, driving control assistance, parking assistance, other driver assistance systems, and driver monitoring.
Featured image: Lombardia, Italy – January 17, 2022: Close-up of a Tesla Model 3 driving down a highway on Autopilot. (Credit: Aranga87/iStock)