Fatal accident shows we're not ready for self-driving cars yet

They say that it sometimes takes a tragedy to shock our eyes open. Unfortunately, such tragedies often cost human lives. Uber, which hasn't actually been in the autonomous vehicle scene that long, became the catalyst for such a rude awakening. While the investigation, finger-pointing, and media frenzy are still ongoing, the tragic accident revealed a truth we may not have realized amidst the excitement for the future: none of us are prepared for what's coming next.

Technology isn't ready

Recent tech and car shows have been full of spiels and self-praise about how manufacturers have achieved this or that level of autonomy, paving the way for the self-driving cars, and even driverless taxis, of the future. While we shouldn't downplay their achievements in pushing autonomous driving technology and hardware forward, it's quite a leap to go from "now" to "the future".

It's still not clear what went wrong in the fatal Uber accident, but it's clear that something did. A self-driving car should be equipped with sensors that can identify objects, especially pedestrians, even in the worst possible conditions. That includes conditions where humans would normally fail, because these systems are supposed to be better than us. More importantly, such cars should also be able to make split-second decisions and halt or swerve in an instant. Even two seconds can be the difference between life and death.

Governments aren't ready

The self-driving craze caught governments unprepared. Many have scrambled to put laws in place to regulate the development and testing of self-driving cars. Now they will be scrambling again to ensure that such accidents won't happen again and that the relevant parties will be held liable when they do.

To be fair, the majority of laws and regulations are made in reaction to incidents. Very few have the foresight to act on what has yet to happen. Technology, however, rarely waits for the law to catch up and, just like with drones and UAVs, lawmakers and authorities are being challenged to work faster before tragedy strikes again.

Drivers aren't ready

Until the results of the investigation are released, we might not know what the safety driver was doing when she wasn't looking at the road. There will undoubtedly be some blame put on her for not being attentive or not stopping the car in time. It was, after all, her job.

But here's the thing: the self-driving cars of the future won't have safety drivers. Some might not even have anyone inside capable of driving at all, not to mention a steering wheel. Many car makers paint a picture of a future where a driver or passenger can simply sit back and relax, maybe chat and play, with nary a worry. This accident flies in the face of that promise. Of course, they are painting the ideal future, not the present. But in between now and then, drivers, or anyone in a position to control the car, will be in a no man's land, unsure whether to be on even higher alert than usual or to put their full trust in the automation. The Uber safety driver might have been guilty of the latter.

Pedestrians aren't ready

Pedestrians, unfortunately, might be at the losing end of this development in autonomous vehicles. Safety is always of utmost concern, and part of that involves not just drivers but also the humans and other living beings outside the vehicle.

The victim in such accidents is often blamed, especially when, at first blush, they seem to have failed to follow traffic regulations. That seemed to be the case here, until a newly released video called that preliminary judgment into question. Regardless, she may have assumed that a driver would react swiftly to avoid her or hit the brakes in time, whether or not she had the right of way. A self-driving car may not make that distinction. Until the day self-driving cars outnumber human-driven ones, pedestrians will always presume there's a driver behind the wheel with the instinct to make drastic maneuvers when the situation calls for it. A flawed assumption, but a general one nonetheless.


The tragic accident will undoubtedly put self-driving car development under a microscope, and in a not-so-positive light. And the industry might actually need that. As car manufacturers as a whole embrace features with less or even no direct driver control, increased scrutiny, testing, and regulation are exactly what the technology needs.

It shouldn't, however, be used as a reason to put the brakes on self-driving tech. It should, instead, be taken as a challenge to make sure that this first accident will also be the last. If self-driving cars are the future, then we have to make sure that no more lives are lost along the way.