Weapons photo courtesy iRobot

PackBots roam the streets of Iraq defusing bombs. Remote-controlled SWORDS robots shoot rifles and rocket launchers with deadly accuracy. Predator drones piloted by soldiers in Nevada fire missiles at targets in Iraq and Afghanistan. The Wasp robot flies over neighborhoods full of insurgents, recording what’s below with a camera as small as a peanut.

If this sounds like a futuristic science fiction story, it’s not. As of today, over 12,000 robots are working in Iraq, up from zero five years ago.

The military is driving the cutting edge of the robotics industry, so forget about Isaac Asimov’s Three Laws as laid out in his seminal science fiction book I, Robot: a robot may not injure a human being; a robot must obey orders given to it by human beings; a robot must protect its own existence. We’re already living in a strange new world.

P.W. Singer, a senior fellow at the Brookings Institution, has written Wired for War, for which he interviewed a motley crew of people involved with the robotics industry: CEOs of robotics companies, 19-year-old drone pilots, four-star generals, refusenik roboticists who won’t build for the military, people in the Middle East on the other end of the bullets and missiles, and science fiction authors who consult for the Pentagon.

Robots do see better than humans, shoot straighter and faster than us, and never get tired, but is it only a matter of time until, as any reader of Western science fiction knows, the robot “wises up and then rises up,” as Singer says?

I spoke to Singer about what he found out in his journey into the heart of the robotics industry, and what’s to come.

How much did you know about robots before you started on this project?

I could claim I knew a lot in that I grew up playing with Star Wars action figures and sleeping in Battlestar Galactica bedsheets, but the reality is that I’m not an engineer, I’m a social scientist, so the chapter “Robotics for Dummies” was about how I was a dummy, learning everything I could about robotics so I could explain to a layman where the field is and where it’s headed. There’s a wide ignorance — I mean that in the true meaning of the term, not the slur meaning — about the exciting and fascinating and sometimes scary things going on with robotics today. It’s seen as mere science fiction but it’s not. And it’s a lot further ahead than most people have a sense of.

Could you talk about the connection between science fiction and what’s actually being designed?

It was startling how open people were about talking about science fiction and its use in the real world. The Marine colonel talking about, ‘Oh, yeah, I got the idea for building that system from The Empire Strikes Back with my kids.’ The people at the Air Force research lab talking about how, “Oh yes, we decided to call it the Phaser because we knew it was more likely to get funding if it had a sci-fi sounding name.”

[Science Fiction Museum director] Donna Shirley says, “Science fiction isn’t so much about how to build the bomb but what happens if.” If you build the bomb you get Dr. Strangelove, and that’s the part that interests me. That’s really what the book is about. All of the dilemmas that come out of having science fiction come true, having science fiction play out on our modern battlefields.

I went to Human Rights Watch and asked them about accountability coming out of drone strikes in places like Afghanistan and Pakistan, and two of the senior leaders there got into an argument in front of me as to which legal system we should turn to for answers, as to who we hold accountable when the drone kills the wrong person. One says, “the Geneva Convention,” and the other says, “No, no, no, it’s the Star Trek Prime Directive.”

And it shows me two things: one, the influence of science fiction in the oddest of places; but two, the challenges when you enter into this sci-fi-turned-real world, where suddenly we’re really grasping at straws because our current laws just haven’t caught up yet.

There are a series of open questions right now: Where is the code of ethics in the robotics field? What gets built and what doesn’t get built? Where’s the answer to the question of who gets to use these technologies? Who doesn’t get them? They’re the kind of questions that you used to only talk about at science fiction conventions, but these are very real, live questions right now.

Global Hawk Drone

This spy drone can take off by itself, fly 3,000 miles, spend a day spying on an area the size of the state of Maine, fly back 3,000 miles, and then land itself. — P.W. Singer

How have these robots changed the experience of war?

The very meaning of the term “going to war” has changed in our lifetime. Whether we’re talking about the ancient Greeks going to war against Troy in the Iliad or my grandfather’s experience in World War II going to war against the Japanese in the Pacific, that phrase has meant the same thing over the last 5,000 years. It’s meant going to a place where there was such danger that you might never return.

Now you have the experience of, for example, that Predator drone pilot who appears in the book, who says, “You’re ‘going to war’ for twelve hours, you’re putting missiles on enemy targets, you’re killing enemy combatants, and then you get back in your car and drive home and 20 minutes after being ‘at war’ you’re sitting at the dinner table talking to your kids about their homework.”

We’re starting to see ripple effects on our own politics. As that former Secretary of Defense puts it in the book, “I like these systems because they save American lives but I also worry about more marketization of war, more ‘shock and awe’ talk, to defray discussion of the cost. People are more likely to support the use of force if they view it as costless.” We may be taking those bars to war that we were already lowering and dropping them to the ground.

And what is it like for the Iraqis or the Afghans on the receiving end of this technology?

The leading news editor of Lebanon [Rami Khouri] was actually saying this [to me] while a drone was flying above him at the time: “It’s just another sign of the cold-hearted, cruel Israelis and Americans who are also cowards because they send out machines to fight us. They don’t want to fight us like real men. But they’re afraid to fight, so we just have to kill a few of their soldiers to defeat them.” That is just a graphic illustration of an absolute disconnect in the war of ideas between the message we think we’re sending versus the message being received. In Pakistan, one of the most popular songs last year [“Chacha Wardi Lahnda Kyo Nahen” or “Uncle, Lose the Uniform, Why Don’t You?”] talked about how Americans don’t fight with honor and just look at Pakistanis the way they look at insects. That ain’t the kind of messaging you want to be sending out.

How much autonomy do you think robots should have? What direction are we going in versus what direction should we be going in?

Every time you ask about this issue of armed and autonomous robots, people always use the phraseology, “No, no, no. We’ll always have man in the loop.” The “loop” phraseology is like a mantra everyone has to chant. And yet, it’s utter B.S. We have systems right now that we’re already granting massive amounts of autonomy to.

There’s the example of the C-RAM, the Counter Rocket, Artillery, and Mortar system, the little R2-D2-like system in Baghdad that automatically shoots down incoming rockets because they’re coming in too quickly for humans to respond to. You’ve got an incoming rocket and the human can barely get to, “Oh, sh—” and it’s too late. Yeah, man’s in the loop, we turn it on and off, but we don’t have the reaction time to decide what it shoots at and what it doesn’t.

There’s an incredible array of excuses we come up with for giving the system more autonomy. It’s everything from, ‘Things are happening quickly in a war. We’ll not allow it to shoot first but we’ll give it shoot-back ability.’ Or ‘We’ll design systems that don’t shoot at people, they can just shoot at other weapon systems. They can’t shoot at the people in the tank, they can shoot at tanks.’

Each one takes us further and further down the slippery slope that we say we’ll never, ever cross. Guess what? We are directly researching armed, autonomous systems. The funniest illustration of that is one of the Pentagon research projects on it, which is actually entitled, “Taking Man Out of the Loop.”