AI researcher says amoral robots pose a danger to humanity

With robots becoming increasingly powerful, intelligent and autonomous, a scientist at Rensselaer Polytechnic Institute says it's time to start making sure they know the difference between good and evil.
"I'm worried about both whether it's people making machines do evil things or the machines doing evil things on their own," said Selmer Bringsjord, professor of cognitive science, computer science and logic and philosophy at RPI in Troy, N.Y.

Confessions of an American Drone Operator

From the darkness of a box in the Nevada desert, he watched as three men trudged down a dirt road in Afghanistan.
The box was kept cold—precisely sixty-eight degrees—and the only light inside came from the glow of monitors. The air smelled spectrally of stale sweat and cigarette smoke. On his console, the image showed the midwinter landscape of eastern Afghanistan’s Kunar Province—a palette of browns and grays, fields cut to stubble, dark forests climbing the rocky foothills of the Hindu Kush. He zoomed the camera in on the suspected insurgents, each dressed in traditional shalwar kameez, long shirts and baggy pants.
He knew nothing else about them: not their names, not their thoughts, not the thousand mundane and profound details of their lives. He was told that they were carrying rifles on their shoulders, but for all he knew, they were shepherd's staffs.

'Naming the Dead' project will name and number people killed by drone airstrikes to challenge CIA claims of no civilian deaths
By George Chidi, Saturday, September 21, 2013 19:31 EDT

A project launched on Monday aims to properly record the names and numbers of people killed by US drone airstrikes in Pakistan.
The website, "Naming the Dead", is an initiative of the Bureau of Investigative Journalism (TBIJ), a not-for-profit organisation that has won awards for its work exposing the realities of the covert drone wars run by the US and UK militaries in Afghanistan, Pakistan, Yemen and Somalia. It aims to keep as comprehensive a record as possible of the victims of drone airstrikes in Pakistan, after research revealed that only one in five of the victims of the 370 airstrikes that have taken place has been identified outside their own, often remote, communities. The objective, said TBIJ deputy editor Rachel Oldroyd, is to take these deaths out of obscurity and make it easier to test statements about the nature and use of drones.

Guardian.co.uk © Guardian News and Media 2013
Robots Able to Pick Peppers, Test Soil, and Prune Plants Aim To Replace Farm Workers

At the turn of the last century, nearly half of the American workforce was dedicated to agriculture.
Silicone sensors give robot fingers the human touch

Robot arms that can feel as well as grip are on the way, but developing a bionic nervous system is expensive and complicated.
Is there an easier, cheaper way to bring enhanced dexterity to an android's digits? One Hungarian start-up says there is. OptoForce is the Budapest-based firm behind a new kind of sensor which uses a blend of silicone and infrared light to dynamically measure the changing force exerted on a robot's fingers. The results are good enough for a 3-axis claw fitted with the sensors to keep a secure grip while an empty cup is gradually filled with liquid.
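The measurement principle described here can be illustrated with a rough sketch. The code below is purely hypothetical: it is not OptoForce's actual API, and all function names, photodiode geometry, and calibration constants are invented for illustration. It assumes a common optical force-sensor layout in which an IR LED sits under a deformable silicone dome and four surrounding photodiodes report how deformation redistributes the reflected light: shear tilts the dome and makes opposite diodes diverge, while normal pressure raises the summed signal above an unloaded baseline.

```python
# Hypothetical illustration of optical 3-axis force estimation.
# Names and calibration gains are invented, not a real device API.

from dataclasses import dataclass

@dataclass
class ForceReading:
    fx: float  # shear force along x (newtons)
    fy: float  # shear force along y (newtons)
    fz: float  # normal force (newtons)

# Assumed calibration gains (photodiode counts -> newtons); a real
# sensor would determine these in a factory calibration step.
K_SHEAR = 0.02
K_NORMAL = 0.01

def estimate_force(north: float, south: float, east: float, west: float,
                   baseline: float) -> ForceReading:
    """Estimate a 3-axis force from four photodiode counts.

    Shear deformation tilts the dome, so opposite diodes diverge;
    normal pressure flattens it, so the summed signal rises above
    the unloaded baseline.
    """
    fx = K_SHEAR * (east - west)        # differential signal -> x shear
    fy = K_SHEAR * (north - south)      # differential signal -> y shear
    fz = K_NORMAL * ((north + south + east + west) - baseline)
    return ForceReading(fx, fy, fz)

# Unloaded sensor: all diodes at their rest level, so no force is read.
print(estimate_force(100, 100, 100, 100, baseline=400))
```

In this toy model, a gripper controller would poll `estimate_force` in a loop and tighten its grip whenever the shear components grow, which is how a sensor like this could keep a cup from slipping as it is filled.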
When pressure is applied to the surface of one of the sensors, its pliable half-sphere shape deforms slightly, instantly changing the distribution of infrared light from the LED inside.

The Inevitabilities of Killer Robots

In October 2012, nine civil society organizations met in New York and agreed to work together to create the Campaign to Stop Killer Robots.
Since its launch six months later in London, the campaign has seen increased public awareness, strong media coverage, and the remarkably fast (in diplomatic terms) commencement of diplomatic talks to discuss questions raised by these weapons. These nascent efforts provide a counterbalance to the push for the development, production, and ultimate use of fully autonomous weapons systems, which continues unabated.

We should not dismiss the dangers of 'killer robots' so quickly

In an open letter I helped publish on July 28, which has now been signed by more than 2,700 artificial intelligence (AI) and robotics researchers from around the world, we stated that "starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control".
A few days later, philosopher Jai Galliott challenged the notion of a ban, recommending instead that we welcome offensive autonomous weapons, often called "killer robots", rather than ban them. I was pleased to read Jai's recommendation, even if he calls the open letter I helped instigate "misguided" and "reckless", and even if I disagree with him profoundly. This is a complex and multi-faceted problem, and his arguments are worth considering in detail, as they bring several important issues into focus.

Four points

Jai puts forward four arguments for why a ban is not needed. Let's consider the claims in turn. The final argument claims UN bans are virtually useless.