Tesla Autopilot faces US investigation over first responder crashes

The US government has opened an investigation into Tesla's Autopilot after multiple crashes in which the electric cars drove into first responder vehicles. The NHTSA has recorded 11 such incidents since January 2018, the agency confirmed, and as a result it's now looking at Autopilot on all four Tesla models from the past seven years.

That covers model years 2014-2021, the National Highway Traffic Safety Administration confirms, across Tesla's Model S, Model X, Model 3, and Model Y. Altogether it's an estimated 765,000 vehicles.

"Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones," the NHTSA explains. "The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes."

Of the 11 crashes, seven resulted in injuries: 17 in total, including one fatality. The earliest recorded crash was in January 2018, in California, while the most recent took place in July of this year. They're spread across multiple states, including Florida, Michigan, Arizona, Indiana, and Connecticut.

However, the NHTSA is leaving the door open to more crashes being added, as it investigates how Autopilot behaved and whether other, similar incidents fit the same pattern.

"The investigation will assess the technologies and methods used to monitor, assist, and enforce the driver's engagement with the dynamic driving task during Autopilot operation," the agency says. "The investigation will additionally assess the OEDR by vehicles when engaged in Autopilot mode, and ODD in which the Autopilot mode is functional. The investigation will also include examination of the contributing circumstances for the confirmed crashes listed below and other similar crashes." OEDR refers to a vehicle's object and event detection and response; ODD is the operational design domain, the set of conditions under which a driving automation system is designed to function.

Autopilot is a Level 2 system under the SAE's generally accepted levels of driving automation. That means it's driver assistance, not self-driving: the person at the wheel remains responsible for the vehicle's operation. Tesla uses sensors in the steering wheel to track that driver engagement, though how long you can keep your hands off the wheel varies by situation.

Tesla has recently been pushing out its so-called "Full Self-Driving" beta to some EV owners, but despite the name it's still only a Level 2 system. At Level 3 and above, by definition, the vehicle does not require constant human monitoring while operating in circumstances where it is considered "autonomous." Tesla's guidance to those with the latest beta software is that they remain responsible for the vehicle's operation.

That proviso, together with the boasts and promises of Tesla and company CEO Elon Musk specifically, has led to ongoing confusion over the years about just what Autopilot is, and isn't, capable of. There have been multiple reports of Tesla owners driving in an unsafe manner: leaving the driver's seat and moving elsewhere in the vehicle, sleeping at the wheel, or otherwise not paying sufficient attention. The fallibility of Autopilot itself has also been demonstrated in several crashes.

Tesla has generally responded with a reminder that – regardless of what Musk might imply, or the presence of the expensive Full Self-Driving package option on new cars – Autopilot is still intended as an assistance feature, not a replacement for a human driver. The automaker is arguably the most aggressive in the industry at pushing out over-the-air updates to its vehicles, many of which add tweaks or new abilities to Autopilot.

Neither the company nor Elon Musk has commented on the NHTSA investigation. With Tesla's PR team disbanded in recent years, its public announcements typically come via the outspoken chief executive. Earlier this month, as part of its 2020 Impact Report, Tesla claimed that vehicles "with Autopilot engaged experienced 0.2 accidents per million miles driven."

Back in June, the NHTSA announced new incident-reporting rules for vehicles with Level 2 systems. Automakers must now report serious crashes or incidents in which adaptive cruise control and lane-keeping assistance were active, the agency said, as it looks at whether such systems could "present safety risks to occupants of those vehicles and other roadway users, in part due to the unconventional division of responsibility between the vehicle and its human driver." The rules follow criticism of the agency early in 2020 by the NTSB, which argued that investigations into crashes where Autopilot had been active were insufficiently thorough.