U.S. regulators have opened the biggest review of Tesla’s Autopilot since the driver-assistance feature was introduced, centered on a string of collisions with emergency vehicles that resulted in numerous injuries and at least one fatality.
The National Highway Traffic Safety Administration says its Office of Defects Investigation opened the review on August 13 to look into 11 crashes involving Teslas hitting vehicles at “first responder scenes.” An estimated 765,000 vehicles produced between 2014 and 2021, including every model in Tesla’s lineup—the Model S, X, 3 and Y—are covered in the investigation.
It will “assess the technologies and methods used to monitor, assist and enforce the driver’s engagement with the dynamic driving task during Autopilot operation,” NHTSA said. The investigation will assess the effectiveness of the system’s “Object and Event Detection and Response” and circumstances under which Autopilot is designed to be functional.
Tesla didn’t respond to a request for comment. Neither it nor CEO Elon Musk commented on the investigation via social media channels.
The investigation is the biggest in company history and comes as Tesla expands availability of its so-called Full Self-Driving feature, which the company has told California regulators isn’t actually autonomous driving technology. NHTSA in January 2017 concluded an investigation into a Model S crash in Florida that killed the vehicle’s owner, who was using Autopilot at the time, without finding the company at fault. The family of a Model X driver who was killed in a Silicon Valley crash while using the system sued Tesla in May 2019, saying the technology is defective.
Teslas involved in each of the 11 crashes under review were using either Autopilot or Traffic-Aware Cruise Control when the accidents happened. NHTSA early this year said it was reviewing 23 accidents in which Autopilot may have been involved.
Since the system was introduced seven years ago, safety advocates have raised concerns that the Autopilot name overstates the feature’s capabilities and can lead many users to place too much confidence in it. That’s been borne out by numerous Tesla owners posting videos over the years treating Autopilot as an autonomous system, with some sleeping at the wheel or even sitting in the back seat while traveling down the highway. Autopilot is an Advanced Driver Assistance System, or ADAS, and the company warns users to remain ready to take control of the vehicle at all times.
In 2020, a German court determined that calling the system Autopilot was misleading to consumers, and banned Tesla from using both that term and Full Self-Driving for vehicles sold in that market.
“Tesla has treated its customers like guinea pigs and deployed a faulty technology that can kill people with the false promise it is an Autopilot,” Jamie Court, president of Consumer Watchdog, a Los Angeles-based safety advocacy group, said in a statement on Monday.
Musk frequently touts Tesla as a leader in both artificial intelligence and autonomous driving technology, though the company hasn’t provided the same type of safety data that developers of AV tech, including Waymo, Cruise and Argo AI, regularly share with entities including California’s Department of Motor Vehicles.
Additionally, the NHTSA probe was announced just days ahead of Tesla’s “AI Day,” set for August 19. Musk typically uses such events to promote Tesla technology.
Shares of Tesla fell 4.1%, to $687.68, at 2:13 p.m. EDT in Nasdaq trading. The decline caused Musk’s fortune to drop $8.2 billion to $178.8 billion as of 11:15 a.m. EDT.