Waymo recreated fatal crashes putting its software at the wheel – Here's how it did

Waymo is tackling the question of autonomous vehicle safety head-on, using simulations to replay fatal crashes with the Alphabet company's software in place of the human driver involved, to show what the Waymo Driver would've done differently. The research looked at every fatal accident recorded between 2008 and 2017 in Chandler, Arizona – where the Waymo One driverless ride-hailing service currently operates.

"We excluded crashes that didn't match situations that the Waymo Driver would face in the real world today, such as when crashes occurred outside of our current operating domain," Trent Victor, Director of Safety Research and Best Practices at Waymo, explains. "Then, the data was used to carefully reconstruct each crash using best-practice methods. Once we had the reconstructions, we simulated how the Waymo Driver might have performed in each scenario."

In total, there were 72 crashes that qualified for simulation. For crashes that involved two vehicles, Waymo modeled each one in two ways: first with the Waymo Driver in control of the "initiator" vehicle, the one that triggered the crash, and then again as the "responder" vehicle, the one reacting to the initiator's actions. That took the total to 91 simulations.

The Waymo Driver avoided every crash as initiator – a total of 52 simulations – Waymo says. That was mainly down to the computer following rules of the road that the human drivers in the actual crashes did not: observing speed limits, maintaining a safe gap to other traffic, stopping for red lights, and yielding appropriately.
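Those counts fit together neatly, and the arithmetic is worth making explicit: 91 simulations from 72 crashes implies 19 crashes were run in both roles, and the 52 initiator-role runs leave 39 responder-role simulations. Every input figure below comes straight from the article.

```python
# Counts quoted in the article.
total_crashes = 72
total_simulations = 91
initiator_sims = 52  # every one of these avoided the crash, per Waymo

# Two-vehicle crashes were simulated twice (initiator + responder), so the
# excess of simulations over crashes is the number of crashes run in both roles.
dual_role_crashes = total_simulations - total_crashes  # 19

# Whatever isn't an initiator-role run is a responder-role run.
responder_sims = total_simulations - initiator_sims    # 39

print(dual_role_crashes, responder_sims)  # -> 19 39
```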

On the flip side, where the Waymo Driver was the responder, it managed to avoid 82 percent of the crashes in the simulations. According to Waymo's Victor, "in the vast majority of events, it did so with smooth, consistent driving – without the need to brake hard or make an urgent evasive response."

In a further 10 percent of the simulations, the Waymo Driver was able to take action that mitigated the crash's severity. In those cases, the driver was between 1.3 and 15 times less likely to sustain a serious injury, Waymo calculates.

Finally, in the remaining 8 percent of simulated crashes, the Waymo Driver was unable to avoid or mitigate the impact. These were all situations where a human-operated vehicle struck the back of a Waymo vehicle that was stationary or moving at a constant speed, "giving the Waymo Driver little opportunity to respond," Victor explains.
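Those rounded percentages are consistent with the 39 responder-role simulations derived above: roughly 32 avoided, 4 mitigated, and 3 unmitigated. The short tally below reproduces the article's figures; note that the absolute counts are inferred here, not quoted by Waymo.

```python
# Inferred responder-role outcome counts (consistent with Waymo's rounded
# percentages, but not stated explicitly in the article).
responder_outcomes = {"avoided": 32, "mitigated": 4, "unmitigated": 3}
total = sum(responder_outcomes.values())  # 39 responder-role simulations

for outcome, count in responder_outcomes.items():
    print(f"{outcome}: {count}/{total} = {count / total:.0%}")
# avoided: 32/39 = 82%
# mitigated: 4/39 = 10%
# unmitigated: 3/39 = 8%
```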

That is equally important, Waymo argues, because when autonomous vehicles finally launch in any significant number, they're going to have to coexist with human drivers on the road for some time to come. Those human drivers can't be counted on to follow the rules of the road as stringently as Waymo's software does.

Waymo has released a paper detailing its findings. Part of the challenge in assessing autonomous vehicles, it argues, is that high-severity collisions are thankfully relatively rare in the real world. As such, "evaluating effectiveness in these scenarios through public road driving alone is not practical given the gradual nature of ADS deployments."
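A back-of-envelope calculation shows the scale of the problem. Assuming a rate on the order of one fatal crash per 100 million vehicle miles – an illustrative figure roughly in line with US averages, not one taken from Waymo's paper – encountering a sample the size of this study would require billions of miles of driving.

```python
# Assumed, for illustration only: roughly one fatal crash per 100M miles,
# in line with recent US averages (the real rate varies by year and road type).
FATAL_CRASH_RATE = 1 / 100_000_000

def expected_miles(n_events: int, rate: float = FATAL_CRASH_RATE) -> float:
    """Expected driving miles to encounter n fatal-crash-level events."""
    return n_events / rate

# Observing as many fatal crashes as this study reconstructed (72) would
# take on the order of 7 billion miles of public-road driving.
print(f"{expected_miles(72):,.0f} miles")  # -> 7,200,000,000 miles
```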