Tesla Could Face Criminal Charges Under Reported Self-Driving Investigation

Reuters reports that the U.S. Department of Justice is investigating Tesla's claims about its Autopilot self-driving technology in connection with a series of crashes that have cast doubt on the program's safety. According to data compiled from the crash reports automakers are legally required to file with the NHTSA, there have been 273 known crashes of Teslas operating under Autopilot, three of which resulted in serious passenger injury and five of which resulted in at least one death. In light of those figures, the legality of Tesla's public statements about the reliability and safety of Autopilot is now in question.

Reuters' sources are currently anonymous, identified only as "three people familiar with the matter." They describe a Department of Justice investigation specifically, one addressing potential violations of federal law. As yet, the DOJ has made no official announcement of any investigation. Such an inquiry would presumably examine whether Tesla's messaging over the past year amounted to an illegal coverup of Autopilot's alleged flaws.

Legal questions over Autopilot messaging

Tesla has been relentlessly optimistic about autonomous cars and their role in the company's future. As Reuters reports, as far back as 2016 Elon Musk was touting autonomous driving software as "probably better" than humans at driving his company's cars. Last week, during a call to discuss quarterly results, Musk said he expected an upgrade to Tesla's Full Self-Driving software package by the end of 2022 that would "be able to take you from your home to your work, your friend's house, the grocery store without you touching the wheel," according to Reuters. On the same call, Musk described government regulation as the primary obstacle to autonomous driving.

While no particulars have yet emerged from the Department of Justice, the central question of any criminal investigation into these claims would likely be whether, given the widely reported failures of Autopilot, Tesla's statements constitute fraud and/or endanger Tesla drivers by falsely characterizing what the company's autonomous offerings can do. Much of Tesla's own language stops short of Musk's characterization: the company notes on its website that its onboard software "does not turn a Tesla into a self-driving car nor does it make a car autonomous."

Again, there has been no official announcement of an investigation into Autopilot by the Department of Justice. Should such an investigation take place, it would have to address whether Tesla's optimistic characterization of Autopilot falls into the category of legally protected marketing or crosses the line into fraud.