Google employees are quitting the company in protest of Project Maven, the artificial intelligence collaboration with the US Department of Defense. Promising to speed up the analysis of drone footage by automatically identifying people and objects in images, Maven has prompted serious controversy within the search giant among those who object to the potential use of AI in warfare.
Last month, thousands of Google employees signed an internal petition requesting the company cut its contract with the Pentagon. After Maven’s existence was revealed in March of this year, a spokesperson for Google insisted that it would be used “for non-offensive” purposes only. However, in an open letter from disgruntled staff, that promise was questioned.
“While this eliminates a narrow set of direct applications,” the letter countered, “the technology is being built for the military, and once it’s delivered it could easily be used to assist in these tasks. This plan will irreparably damage Google’s brand and its ability to compete for talent.”
Despite the petition, which reportedly reached nearly 4,000 signatures, Google management is said to be unswayed by the negative sentiment. Multiple employees have now resigned, Gizmodo reports, citing either the company's involvement in Maven specifically, or broader concerns about its more political decisions. Some claim management has become less willing to listen to such concerns.
Adding to the frustration, Google argues that Maven is actually based on open-source software. If Google were to step away from the contract, the company says, it wouldn't actually be preventing the Pentagon from using the technology. All it would be doing is forgoing the payment.
Meanwhile, the International Committee for Robot Arms Control (ICRAC) has waded in, publishing an open letter to Alphabet and Google execs in support of the Google employees. It counters Google's claim that, because the code is open source, its involvement with Maven is less pivotal.
“Project Maven is a United States military program aimed at using machine learning to analyze massive amounts of drone surveillance footage and to label objects of interest for human analysts,” ICRAC points out. “Google is supplying not only the open source ‘deep learning’ technology, but also engineering expertise and assistance to the Department of Defense.”
While the system may not currently operate without human supervision, ICRAC's concern is that, over time, that restriction will be eroded and eventually removed completely. That, it argues, is an inevitability as military commanders realize how much more streamlined the process could be if AIs were left to make targeting decisions and other choices.
“If ethical action on the part of tech companies requires consideration of who might benefit from a technology and who might be harmed, then we can say with certainty that no topic deserves more sober reflection – no technology has higher stakes – than algorithms meant to target and kill at a distance and without public accountability,” ICRAC concludes.
Whether any of this can sway Google remains to be seen. Certainly, losing talent is no small issue for a company, though it may well take a particularly high-profile name citing involvement in Maven as a reason for resigning to really bring attention to the topic.