Human rights groups call for ban on automated killer robots

Human rights groups are calling for a universal ban on robotic weapons systems that can decide when to fire without human intervention. Clearly, someone has watched The Terminator too many times. Human Rights Watch and Harvard Law School's International Human Rights Clinic have both called on all states to agree to ban the development, production, and use of fully autonomous weapons.

The US military already fields several armed robotic systems capable of operating autonomously. However, a human operator remains in control when it comes time to attack a target. The same human rights groups also want the designers of such robots to adopt a "code of conduct." That sounds like an effort to prevent Skynet.

The fear is that the numerous automated weapons systems already in use around the world, which identify and track targets while leaving human operators only moments to decide whether to fire, could be converted into fully autonomous systems. Some of those weapons may need only a software upgrade to make the jump. The human rights groups say, "action is needed now, before killer robots cross the line from science fiction to feasibility."

A ban on fully autonomous weapons systems would reportedly require a major new arms treaty among nations. Supporters of the ban argue that fully autonomous weapons wouldn't be able to comply with international humanitarian law. Others counter that such systems could prove more reliable than humans, since people can make bad decisions under stress.

John McGinnis, a Northwestern University law professor, suggests, "artificial-intelligence robots on the battlefield may actually lead to less destruction, becoming a civilizing force in wars."

[via The Globe and Mail]