Killer robots? UN to hold talks on Lethal Autonomous Weapons Systems
The United Nations is set to open official talks on the use of autonomous weapons, but a treaty governing so-called killer robots remains far off, the ambassador leading the discussions said Friday.
Activists and tech leaders including SpaceX CEO Elon Musk have called on the UN to ban fully automated weapons systems that could revolutionise warfare while putting civilians at heightened risk.
The Convention on Certain Conventional Weapons (CCW) Group of Governmental Experts on Lethal Autonomous Weapons Systems will on Monday begin five days of talks on LAWS under the auspices of the UN Office for Disarmament Affairs. Representatives from more than 70 states are expected to attend, along with others from UN agencies, the International Committee of the Red Cross (ICRC), and activist groups.
Indian ambassador Amandeep Gill, who is chairing the meeting and submitted a food-for-thought paper for the conference, said those calling for a ban will not be satisfied.
“It would be very easy to just legislate a ban but I think … rushing ahead in a very complex subject is not wise,” Gill told reporters. “We are just at the starting line.”
He said the discussion, which will also include civil society and technology companies, will be partly focused on understanding the types of weapons in the pipeline.
Computers cannot be held accountable
Proponents of a ban, including the Campaign to Stop Killer Robots pressure group and Human Rights Watch, insist that human beings must ultimately be responsible for the final decision to kill or destroy. They argue that any weapons system that delegates the decision on an individual strike to an algorithm is by definition illegal, because computers cannot be held accountable under international humanitarian law.
More than 100 artificial intelligence and robotics experts in August signed an open letter warning of a “third revolution” in warfare and urging the UN to find the means to “prevent an arms race in these weapons, to protect civilians from their misuse and to avoid destabilizing effects of these technologies.”
The letter warned that lethal autonomous weapons could become “weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”
“As companies building the technologies in artificial intelligence and robotics that may be repurposed to develop autonomous weapons, we feel especially responsible in raising this alarm,” said the letter, which was signed by major industry players including Elon Musk and Mustafa Suleyman, co-founder of Google’s DeepMind and head of its applied AI unit.
Military robots widespread ‘within years’
In October, Col. Oleg Pomazuev told Russian news site Military Review that the Russian military will field a new armed robot that “outperformed” manned platforms in exercises. Pomazuev runs the Department of Innovation Research at the Russian military’s Main Directorate of Research Activities, or GUNID.
The Nerehta robot can carry a 12.7mm or 7.62mm machine gun or an AG-30M grenade launcher.
Samuel Bendett, an associate research analyst with the Center for Naval Analyses’ International Affairs Group, told Defense One that Russian forces tested unmanned ground vehicles during last September’s Zapad-2017 exercises in Belarus.
“They have also been stating for a while that their modernization and state armaments program will include high-tech and unmanned systems,” Bendett said.
General Mark Milley, chief of staff of the U.S. Army, said this week that robotic battle systems are proceeding faster than many realize, Defense One reported.
“We are in a period of time, historically, where you are getting a convergence of a wide variety of technologies. We are at the leading edge of that,” Milley said. “In combination, I guarantee that they are changing and will change the fundamental character of war.”
“You have a lot of changes in mechanical engineering, in robotics. So autonomous systems, or semi-autonomous, they are already here. They have arrived. They have not proliferated in wide use, yet. They will be, very, very shortly. Within a matter of years, you will see widespread use of robots.”
Humans must remain responsible for decisions
Gill said there was agreement among nations that “human beings have to remain responsible for decisions that involve life and death,” but that there are varying opinions on the mechanics through which “human control” must govern deadly weapons.
The International Committee of the Red Cross, which is mandated to safeguard the laws of conflict, has not called for a ban, but has underscored the need to place limits on autonomous weapons.
“Our bottom line is that machines can’t apply the law and you can’t transfer responsibility for legal decisions to machines”, Neil Davison of the ICRC’s arms unit told AFP.
He highlighted the problems posed by weapons systems in which the timing or location of an attack is a major variable – for example, a system deployed for multiple hours and programmed to strike whenever it detects an enemy target.
“Where you have a degree of unpredictability or uncertainty in what’s going to happen when you activate this weapons system then you are going to start to have problems for legal compliance”, he said.
The Convention on Certain Conventional Weapons (CCW) is a treaty that prohibits or restricts certain weapons considered to cause unnecessary or unjustifiable suffering. The 1995 protocol that banned blinding lasers is an example of a weapon being preemptively banned before it was acquired or used. A total of 125 nations are “high contracting” or state parties to the CCW, including all five permanent members of the UN Security Council.
• The UN Office for Disarmament Affairs has collated background reading on LAWS
With reporting from AFP