
Killer Robots: Autonomous Weapons Authorized for Lethal Force


Confessions of an American Drone Operator

From the darkness of a box in the Nevada desert, he watched as three men trudged down a dirt road in Afghanistan. The box was kept cold—precisely sixty-eight degrees—and the only light inside came from the glow of monitors. The air smelled spectrally of stale sweat and cigarette smoke. On his console, the image showed the midwinter landscape of eastern Afghanistan’s Kunar Province—a palette of browns and grays, fields cut to stubble, dark forests climbing the rocky foothills of the Hindu Kush. He zoomed the camera in on the suspected insurgents, each dressed in traditional shalwar kameez, long shirts and baggy pants.

He knew nothing else about them: not their names, not their thoughts, not the thousand mundane and profound details of their lives. He was told that they were carrying rifles on their shoulders, but for all he knew, they were shepherd’s staffs. It was quiet in the dark, cold box in the desert, except for the low hum of machines. That was Brandon Bryant’s first shot.

Q&A on Fully Autonomous Weapons

Advances in artificial intelligence (AI) and other technologies will soon make possible the development of fully autonomous weapons, which would revolutionize the way wars are fought. These weapons, unlike the current generation of armed drones, would be able to select and engage targets without human intervention. Military officials in the United States and other technologically advanced countries generally say that they prefer to see humans retain some level of supervision over decisions to use lethal force, and the US Defense Department has issued a policy directive embracing that principle for the time being. But the temptation to acquire fully autonomous weapons, also known as “lethal autonomous robotics” or “killer robots,” will grow. If one nation acquires these weapons, others may feel they have to follow suit to avoid falling behind in a robotic arms race.

Brum27 comments on Why Self-Driving Cars Must Be Programmed to Kill

Is War Becoming More Humane?

The Inevitabilities of Killer Robots

In October 2012, nine civil society organizations met in New York and agreed to work together to create the Campaign to Stop Killer Robots. Since its launch six months later in London, the campaign has seen increased public awareness, strong media coverage, and the remarkably fast—in diplomatic terms—commencement of diplomatic talks to discuss the questions raised by these weapons. These nascent efforts provide a counterbalance to the push for the development, production, and ultimate use of fully autonomous weapons systems, a push that continues unabated. As thousands of noted artificial intelligence and robotics experts recently stated in an open letter opposing killer robots, weapons systems capable of targeting and killing human beings on their own will be on the battlefield in a matter of years, not decades.

Piloted Drones

We should not dismiss the dangers of 'killer robots' so quickly

In an open letter I helped publish on July 28 – which has now been signed by more than 2,700 artificial intelligence (AI) and robotics researchers from around the world – we stated that “starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control”.

A few days later, philosopher Jai Galliott challenged the notion of a ban, recommending instead that we welcome offensive autonomous weapons – often called “killer robots” – rather than ban them. I was pleased to read Jai’s recommendation, even if he calls the open letter I helped instigate “misguided” and “reckless”, and even if I disagree with him profoundly. This is a complex and multi-faceted problem, and it is worth considering his arguments in detail as they bring several important issues into focus.

Robot Soldiers Are Coming!

UN: Hold International Talks on ‘Killer Robots’

(New York) – All governments should support international talks to address the threat posed by fully autonomous robotic weapons, Human Rights Watch said today. Human Rights Watch and the Harvard Law School International Human Rights Clinic on October 21, 2013, issued a question-and-answer document about the legal problems posed by these weapons. Representatives from the Campaign to Stop Killer Robots, including Human Rights Watch, will present their concerns about fully autonomous weapons at a United Nations event in New York on October 21.

“Urgent international action is needed or killer robots may evolve from a science fiction nightmare to a deadly reality,” said Steve Goose, arms director at Human Rights Watch. “The US and every other country should support holding international talks aimed at ensuring that humans will retain control over decisions to target and use force against other humans.”