[Photo: MC3 Bryan Mai / U.S. Navy]
Killer robots, and whether to ban them, will be on the menu for international talks next year at the United Nations. The decision was approved on Friday by a panel that included the US and China.
The weapons under consideration, prosaically known as “lethal autonomous weapons systems,” include robots that can find a target and fire at will without a person making that decision.
Sixty-five non-profit groups, united under the banner of the “Campaign to Stop Killer Robots,” have been calling for a ban on such weapons for over three years.
The UN agreement represents a first-time handoff of the debate to governments, including some countries that have spent billions of dollars developing such weapons.
The decision was made at the UN Convention on Certain Conventional Weapons, the group that bans or regulates the use of very new or very lethal weapons. In the past, the CCW has resolved to ban landmines, incendiaries like napalm, and blinding lasers.
Following protocol, nations agreed to form an expert group that will debate next year how killer robots could be regulated.
“It puts the power into their hand as governments,” Mary Wareham, advocacy director of the Arms Division at Human Rights Watch and coordinator of the Campaign To Stop Killer Robots, told BuzzFeed News. “Until now, it’s been a series of PowerPoints by experts and it’s governments asking them questions. And this is where they go home and go do their homework.”
Earlier this month, nine members of Congress led by Massachusetts Rep. James McGovern addressed a letter to the Secretaries of State and Defense supporting continuing the expert talks, and recommending that the US delegation support a four-week meeting next year.
“As this military technology continues to advance, we need to take a hard look at the risks it poses,” McGovern said in a statement emailed to BuzzFeed News.
[Photo: MC2 Antonio P. Turretto Ramos / U.S. Navy]
Experts agree that the campaign has had swift success in drawing the international community’s attention to a largely futuristic technology.
“If you look at the last review conference in 2011, not a single state party even mentioned autonomous weapons,” Southern Methodist University law professor Chris Jenks, who spent 20 years with the US Army and then worked on international law at the Pentagon, told BuzzFeed News. “Here we are 5 years later, and it is the focus.”
But despite this success, Jenks and others believe that an outright ban on killer robots is unlikely to be the result of next year’s talks.
“I think it’s going to be a really slow process and we’re not likely to see the military powerhouses signing on to any treaty ban,” Rebecca Crootof, a lecturer at Yale Law School, told BuzzFeed News.
So far, 19 countries have called for a ban (three of those spoke at this week’s deliberations), but Crootof pointed out that the list did not include many countries — like the US, China and Russia — that have already spent billions building those weapons. The US has focused its efforts on self-driving trucks and helicopters, and drone swarms that coordinate flight among themselves.
Because this class of weapon is so unlike any other, Crootof also argues that a more reasonable approach would be to regulate how such weapons are used, something the expert discussions could take up.
“Autonomous weapons systems are just fundamentally unlike anything else we’ve regulated before,” she said.
Others are cautiously optimistic that automated decision making, ironically, could lead to more humane warfare.
“If weapon systems can be developed which are more discriminating and more accurate than current ones, not only should they be developed, I think there is a moral imperative to do so,” Jenks said.
“I never encountered someone who felt less bad about being shot because they were shot by a human — there is being shot and there is not being shot.”
[Photo: MCSN Hilkowski / U.S. Navy]