The National Highway Traffic Safety Administration has closed a long-running investigation into Tesla’s Autopilot driver-assistance system after looking into hundreds of crashes involving its misuse, including 13 that were fatal and “many others with serious injuries.”
At the same time, NHTSA is opening a new investigation to assess whether the Autopilot recall fix Tesla implemented in December is effective enough.
NHTSA’s Office of Defects Investigation said in documents released Friday that it completed “an extensive body of work” that turned up evidence that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.”
“This mismatch has led to a critical safety gap between driver expectations of [Autopilot’s] operational capabilities and the actual capabilities of the system,” the agency wrote. “This gap led to predictable misuse and avoidable accidents.”
The closure of the original investigation, which began in 2021, marks the end of one of the government’s most visible efforts to control Tesla’s Autopilot software. The Justice Department is also looking into the company’s claims about the technology, and the California Department of Motor Vehicles has accused Tesla of falsely advertising the capabilities of Autopilot and its more advanced Full Self-Driving beta software. Tesla, meanwhile, is now “balls to the wall for autonomy,” according to CEO Elon Musk.
NHTSA said its investigation looked at 953 reported crashes through Aug. 30, 2023. In about half (489) of those, the agency said there was either “insufficient data to make an assessment,” the other vehicle was at fault, Autopilot was found not to be in use or the crash was otherwise unrelated to the probe.
NHTSA said the remaining 467 crashes fell into three buckets. In 211 accidents, “Tesla’s frontal plane struck another vehicle or obstacle with sufficient time for a careful driver to react to avoid or mitigate the collision.” It said 145 accidents involved “road departures in low-grip conditions, such as wet roads.” And it said 111 of the crashes involved “road departures where Autosteer was inadvertently disengaged from driver inputs.”
Tesla tells drivers to keep their eyes on the road and their hands on the wheel while using Autopilot, which it measures through a torque sensor and, in its newer cars, the in-cabin camera. But NHTSA and other safety groups have said those warnings and checks don’t go far enough. In December, NHTSA said those measures were “inadequate to prevent misuse.”
Tesla agreed to issue a recall through a software update that would theoretically increase driver monitoring. But that update didn’t appear to change Autopilot much, a sentiment NHTSA seems to share.
Parts of this recall fix require the “owner to opt in,” and Tesla allows the driver to “easily reverse” some of the safeguards, according to NHTSA.
This story is evolving…