Tesla driver ignored repeat Autopilot warnings before fatal crash

Chris Davies - Jun 20, 2017, 10:15am CDT

The driver killed in a high-profile Tesla crash last year repeatedly ignored warnings from the semi-autonomous Autopilot system to take control, US investigators have concluded. The National Transportation Safety Board (NTSB) has been digging into the May 2016 incident in which Ohio resident Joshua Brown’s Tesla Model S collided with a tractor-semitrailer truck near Williston, Florida, and has now released its accident docket. More than 500 pages of information about the fatal crash have been made public, including technological reports, interview transcripts, and photographs.

Brown’s crash threw Autopilot and semi-autonomous driver assistance aids into the headlines, and their efficacy and safety back into question. Autopilot, though arguably the most advanced such technology available in production cars today, still falls short of full autonomy. Classed as a Level 2 system under the SAE definition, it can match the pace of surrounding traffic – braking and accelerating accordingly – as well as track the lines of the road to stay in lane, and even change lanes automatically with a tap of the indicator stalk.

While Tesla is clear that drivers of its electric vehicles can take their hands off the wheel for extended periods when Autopilot is engaged, they are still required to monitor road conditions. As a result, Autopilot switches between multiple states, including “hands not required” and “hands required”, prompting the driver to apply torque to the steering wheel to register their awareness. Everything Autopilot does is logged by Tesla’s computers, and it’s that data which proved essential to the NTSB in figuring out what happened.

“For the vast majority of the trip, the AUTOPILOT HANDS ON STATE remained at HANDS REQUIRED NOT DETECTED,” the NTSB writes. “Seven times during the course of the trip, the AUTOPILOT HANDS ON STATE transitioned to VISUAL WARNING. During six of these times, the AUTOPILOT HANDS ON STATE transitioned further to CHIME 1 before briefly transitioning to HANDS REQUIRED DETECTED for 1 to 3 seconds.”
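
The log excerpt describes a simple escalating state machine: time without detected steering-wheel torque triggers first a visual warning, then an audible chime, until torque registers the driver’s hands and the cycle resets. As a rough illustrative sketch only – the state names mirror the NTSB log, but the timing thresholds and the torque-detection interface are assumptions, not Tesla’s actual implementation – that escalation logic might look something like this:

```python
from enum import Enum, auto

# Illustrative sketch of an escalating hands-on-wheel warning loop.
# State names echo the NTSB log; the thresholds and sensor interface
# are invented for illustration, not taken from Tesla's software.

class HandsOnState(Enum):
    HANDS_REQUIRED_NOT_DETECTED = auto()
    VISUAL_WARNING = auto()
    CHIME_1 = auto()
    HANDS_REQUIRED_DETECTED = auto()

VISUAL_WARNING_AFTER_S = 60.0  # assumed: visual warning after 60 s without torque
CHIME_AFTER_S = 75.0           # assumed: chime 15 s after the visual warning

def update_state(seconds_without_torque: float, torque_detected: bool) -> HandsOnState:
    """Map time since steering torque was last detected to a warning state."""
    if torque_detected:
        return HandsOnState.HANDS_REQUIRED_DETECTED
    if seconds_without_torque >= CHIME_AFTER_S:
        return HandsOnState.CHIME_1
    if seconds_without_torque >= VISUAL_WARNING_AFTER_S:
        return HandsOnState.VISUAL_WARNING
    return HandsOnState.HANDS_REQUIRED_NOT_DETECTED

if __name__ == "__main__":
    # Replay the pattern the NTSB describes: escalation until a brief
    # 1-to-3-second touch of the wheel registers and resets the timer.
    for elapsed, touched in [(30.0, False), (65.0, False), (80.0, False), (0.0, True)]:
        print(f"{elapsed:5.1f}s without torque, touched={touched}: "
              f"{update_state(elapsed, touched).name}")
```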

In short, Brown ignored repeated – and escalating – warnings, both visual and auditory, to take the wheel and prove his involvement in the driving process. In the moments before his car struck the side of the truck turning across the lane, shearing off the roof in the process, Brown did not touch the brakes. Nor were the headlights active at that point.

What also seems to have been confirmed is that, despite reports at the time, Brown was not watching a movie on a portable DVD player. First responders at the crash scene had suggested that the driver was distracted by watching “Harry Potter” at the time of the crash. It’s still unclear what happened in the seven-second period before the incident during which, the NHTSA has suggested, the impending collision would have been recognizable to Brown.

However, the NTSB is clear that, while the evidence suggests Brown ignored the Autopilot warnings, this isn’t the time for conclusions. Right now the docket is merely a summary of the factual information gathered, and Tesla’s technology has yet to be absolved of blame in the crash.

“The docket contains only factual information collected by NTSB investigators; it does not provide analysis, findings, recommendations, or probable cause determinations,” the agency warns. “No conclusions about how or why the crash occurred should be drawn from the docket. Analysis, findings, recommendations, and probable cause determinations related to the crash will be issued by the Board at a later date.”


8 Responses to Tesla driver ignored repeat Autopilot warnings before fatal crash

    • If I had written the software, that’s exactly something I would put in. The worst-case scenario is that, despite all warnings, there’s no discernible action on the part of the driver, and that requires slowing the car and/or shutting it down and applying the brakes.

      • In this case, it would be a matter of only slowing down enough to leave space between you and the moving object ahead. A complete stop would be dangerous, as you’d have some speeding idiot behind you plowing into the back of your car, thinking there’s no way you’d be slowing to a stop on a highway.

        I don’t doubt that well-written software will likely make an autopilot very safe, but it’s still going to be a matter of faith to trust in its capabilities. I’m not even all that comfortable riding in cars with another person driving, although they could certainly be a better driver than me.

        • I can agree with that.

          I’m the same way as a passenger. I need my own steering wheel and pedals because the imaginary ones don’t work too well.

  1. I’m sad to see this person was killed, but he was simply too much of a Tesla fanboi to listen to reason. For what other reason would he ignore such warnings? Only he can be blamed for his foolish belief that the autopilot system would somehow make the proper decision to avoid a collision. He would easily have been able to take over control and avoid a collision. Maybe his situation is unique, but it really frightens me to imagine riding in a car that has full control to make decisions. I have no doubt my thinking is antiquated, but I can’t help that. It would probably take me a long time to get used to an autopilot after driving for over 50 years using my own judgment. My last thoughts would be, “Why isn’t the car stopping when it’s so obvious there’s a huge object directly in front of it.”
