Tesla: Autopilot was on – and ignored – in fatal Model X crash

Chris Davies - Mar 30, 2018
Tesla has confirmed that Autopilot was enabled on the Model X that crashed, fatally, in California last week, though says the driver ignored multiple warnings before the incident. The crash saw driver Wei Huang collide with a concrete highway lane divider last Friday, March 23. He later died in hospital from his injuries.

At the time, Tesla blamed the severity of the crash for delaying its access to the logs that the Model X records during use. However, it did have some statistics on the stretch of road in question. Since the start of the year, the automaker said, cars with Autopilot enabled have driven that portion of the highway roughly 20,000 times. Over 200 successful Autopilot trips are carried out daily on the stretch.

In a new blog post published this evening, Tesla has revealed more details now that it has access to the logs. Most notably, the data confirms that Autopilot was, indeed, active at the time of the incident. There are, though, indications that the driver was not sufficiently engaged with the system.

“In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23rd, Autopilot was engaged with the adaptive cruise control follow-distance set to minimum,” Tesla said today. “The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.”

Tesla’s system, like other adaptive cruise control technologies in a variety of cars, uses a range of alerts and warnings to bring the driver’s attention back to the road. In the case of Autopilot specifically, sensors in the steering wheel monitor whether – as recommended – the driver has at least one hand in contact with it. If they remove their hands for an extended period, the length of which depends on the nature of the road and the car’s speed, they get a visual warning on the dashboard display. That’s followed by audio alerts, and finally the car is designed to bring itself to a complete stop automatically.

Huang, though, apparently ignored both the warnings and the approaching hazard. “The driver had about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator,” Tesla writes, “but the vehicle logs show that no action was taken.”

Tesla continues to blame the severity of the incident on the crash attenuator. A safety barrier built into the concrete highway divider, it’s designed to absorb forces during a collision, crumpling before the car reaches the concrete itself. However, Tesla points out, in this case the attenuator had been destroyed in an earlier, unconnected incident and not yet been replaced.

“We have never seen this level of damage to a Model X in any other crash,” Tesla points out.

It’s the second fatal crash to have taken place involving a Tesla in Autopilot mode. Back in May 2016, a Model S sedan collided with a truck in Florida, after neither the driver, Joshua Brown, nor the car’s systems spotted it crossing the highway ahead. After an investigation, Tesla concluded that the Model S’ camera had not been able to sufficiently differentiate between the white truck and the bright sky behind it.

Tesla pushed out a firmware update to modify Autopilot’s behavior. In the meantime, the National Highway Traffic Safety Administration (NHTSA) undertook its own investigation, concluding that Autopilot was not to blame. It also made the now oft-repeated comment that it believed Autosteer in Tesla cars contributed to a 40-percent reduction in crash rates.

The National Transportation Safety Board has confirmed it is investigating this latest crash. Earlier this week, it said it was unclear whether Autopilot was active. It also cited a fire in the car, which Tesla says was slow-burning and only became an issue after all occupants were away from the vehicle. That, the automaker points out, is how it’s designed to behave.

The fatality comes at a precarious time for autonomous driving technologies. Though Autopilot is only intended as a driver-assistance aid – and indeed there are warnings both in the car’s handbook and on its touchscreen display cautioning that attention is still required – it’s likely to be compared to the death caused by an Uber driverless car in Tempe, Arizona earlier this month. Investigations there are still underway to understand what happened when a woman walked out in front of the autonomous Volvo SUV running Uber’s hardware and software suite, and the car failed to stop in time to avoid a collision.
