The Unexpected Truth About Self-Driving Cars According To General Motors

The future is self-driving, or so say the evangelists betting big on a world where AI-powered machines, vehicles included, perform their intended tasks on their own. Self-driving cars happen to be one of the hottest topics of debate right now, and for multiple reasons. On the positive side, autonomous cars are expected to sharply reduce accidents caused by human negligence, improve fuel economy, smooth out stop-and-go traffic waves, boost the productivity of the people in the cabin, save on driver costs, and more. On the flip side, self-driving cars lack one critical element: human decision-making, the kind that springs into action in "edge case" scenarios.

For a large portion of industry experts, human supervision will remain a necessity for self-driving cars, and Cruise co-founder and CEO Kyle Vogt is among them. Per a Reuters report, when asked whether human oversight of self-driving cars still had a point, Vogt said he saw no reason to get rid of the human touch when it comes to taking back control if things go haywire for an autonomous vehicle. "I can provide my customers peace of mind knowing there is always a human there to help if needed," he added.

Pay heed to the Cruise chief

Vogt's statement carries weight, given his stature as the leader of a company that has pioneered self-driving car technology. Cruise received the official state nod to begin testing self-driving EVs without a human backup driver back in 2020, on the busy streets of California no less. A year later, Cruise became the first company granted a driverless autonomous service permit by the California Public Utilities Commission for testing self-driving passenger cars.

In February 2022, Cruise put its fleet of self-driving rides on the streets of San Francisco. The company has even developed its own self-driving chips, which are set to appear in cars starting in 2025. The journey, though, has been far from smooth. Cruise had to recall 80 of its self-driving cars over a software issue that incorrectly predicted the movement of oncoming vehicles, a flaw that reportedly resulted in a couple of injuries.

The GM-owned company, which aims to put a million autonomous cars on public roads by 2030, has also faced some embarrassing scenarios in the not-too-distant past. In April 2022, cops pulled over a Cruise self-driving car in a bizarre turn of events. A couple of months later, the company drew heat when a swarm of Cruise cars, hit by a server-side glitch, held up traffic for hours in San Francisco. These flubs suggest that self-driving cars need human assistance, at least for now.

What do experts think?

Despite advancements such as enhanced night vision using machine learning, research sides with caution, while major stakeholders such as Vogt and Elon Musk remain divided over the future. Research published in PLOS ONE concluded that human behavioral patterns while driving differ vastly from the decision-making strategies at the heart of self-driving systems. Another study, published in Transportation Research Procedia, notes that despite all the technological progress, autonomous vehicles "would not lead to an efficient autonomous driving."

Moreover, riders might become overconfident in a self-driving car's abilities and skip critical measures like seatbelts, while pedestrians might get more reckless, assuming that, by protocol, an autonomous car won't run them over. University of Leeds researchers likewise surmised that "safe and human-acceptable interaction with pedestrians is a major challenge for developers of automated vehicles."

Notable research from Germany, published in Frontiers in Psychology, offers insight into how human moral decisions bear on the future of self-driving cars. More specifically, the paper highlighted "issues with creating decision-making algorithms that attempt to simultaneously consider intuitions" for self-driving cars in the immediate future.

In a paper presented at the 2018 Probabilistic Safety Assessment and Management conference in Los Angeles, experts warned that "AVs are more likely to be the cause of a crash." Another paper, discussed at the Australasian College of Road Safety Conference in 2015, argued that human intervention is necessary for the first generation of self-driving vehicles.

What the public has to say

Interestingly, public perception of self-driving cars has remained muddled over the past few years. In a 2014 Pew poll, only 50% of surveyed Americans expressed willingness to ride in a self-driving car. In a 2017 poll, 54% of surveyed U.S. adults expressed concern over the adoption of self-driving cars, while 56% said they wouldn't want to hitch a ride in one.

In a public analysis, experts from University College London found that 86% of the U.K. population wants self-driving cars to be labeled, while 70% of those surveyed believed autonomous cars "will need to 'understand' the intentions of people at the side of the road." Covering American, Australian, and British respondents, a University of Michigan survey found that a majority are uneasy about the idea of stepping inside a self-driving car and expressed a "high level of concern" over cars lacking driver controls.

Another Pew survey, published in March 2022, found 44% of Americans perceiving self-driving cars as a bad idea, yet a notably higher 66% of respondents said they wouldn't want to travel in one. A subsequent poll published in August 2022 recorded 45% of surveyed U.S. adults as uncomfortable with the idea of traveling in an autonomous vehicle.