Intel has responded to last week’s fatal accident involving an Uber self-driving vehicle. In a lengthy blog post published this evening, Mobileye CEO Prof. Amnon Shashua talks about autonomous car technology and the wider reality around it, not the least of which is transparency. “I firmly believe the time to have a meaningful discussion on a safety validation framework for fully autonomous vehicles is now,” Shashua said.
The commentary follows last week’s accident in which an Uber self-driving vehicle struck pedestrian Elaine Herzberg as she crossed a street. The vehicle was in autonomous mode at the time, making this the first pedestrian death in the US caused by a self-driving car. The accident raised concern among some segments of the public who fear the technology may not be ready for public roads.
In his post, Shashua shared “a few observations around the meaning of safety with respect to sensing and decision-making.” Among those observations is one related to the difficulties of interpreting sensor information. He points toward advanced driver-assistance systems, also called ADAS, which underpin features like the automatic emergency braking available in a growing number of vehicles.
“It is the high-accuracy sensing systems inside ADAS that are saving lives today, proven over billions of miles driven,” says Shashua, who called the technology a “foundational element” of our self-driving future. He also argued that expertise in this area remains vital:
Recent developments in artificial intelligence, like deep neural networks, have led many to believe that it is now easy to develop a highly accurate object detection system and that the decade-plus experience of incumbent computer vision experts should be discounted. This dynamic has led to many new entrants in the field. While these techniques are helpful, the legacy of identifying and closing hundreds of corner cases, annotating data sets spanning tens of millions of miles, and going through challenging preproduction validation tests on dozens of production ADAS programs cannot be skipped. Experience counts, particularly in safety-critical areas.
Shashua also touched on the topics of transparency and redundancy, pointing towards the Responsibility-Sensitive Safety (RSS) model previously released by Mobileye. “Decision-making must comply with the common sense of human judgement,” he explains.
As far as redundancy goes, Shashua sheds light on what Mobileye is doing in that regard. “To really show that we obtain true redundancy, we build a separate end-to-end camera-only system and a separate LIDAR and radar-only system.”
In this case, true redundancy refers to a system that relies on independent sources of data, namely LIDAR, radar, and cameras, rather than a system where they are all fused together. Such a fused system is “good for comfort of driving but is bad for safety,” said Shashua.
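The distinction can be illustrated with a toy sketch. The function names and detection logic below are purely hypothetical stand-ins, not Mobileye’s actual implementation; the point is only that each channel reaches a complete verdict on its own, so a failure in one channel cannot silently mask a detection made by the other.

```python
# Hypothetical sketch of "true redundancy": two independent perception
# channels each make their own end-to-end detection decision, instead
# of fusing raw sensor data into a single shared pipeline.

def camera_channel_detects(camera_frame) -> bool:
    # Stand-in for a complete camera-only perception stack.
    return "pedestrian" in camera_frame

def lidar_radar_channel_detects(point_cloud) -> bool:
    # Stand-in for a complete LIDAR + radar-only perception stack.
    return "obstacle" in point_cloud

def hazard_detected(camera_frame, point_cloud) -> bool:
    # The vehicle responds if EITHER independent channel reports a
    # hazard. In a fused system, by contrast, a single combined
    # estimate is also a single point of failure.
    return camera_channel_detects(camera_frame) or lidar_radar_channel_detects(point_cloud)

# Example: the camera channel misses the pedestrian, but the
# LIDAR/radar channel still triggers a hazard response.
print(hazard_detected(camera_frame={"road"}, point_cloud={"obstacle"}))  # True
```

In this toy model, the fused alternative would combine both inputs into one estimate before deciding, which is why Shashua describes fusion as better for driving comfort but worse for safety.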
Also included in the write-up is a brief analysis of the video recorded by the Uber vehicle that struck Herzberg. Shashua explains that the company ran the footage through its own software, which was able to identify the pedestrian one second before the collision despite the video’s low quality. It isn’t known whether Uber’s vehicle detected Herzberg before the collision.