Saturday, February 07, 2009

Wired for War

Hello Warriors, Weekend Warriors, and Concerned Peaceniks,

The military's use of drones is growing fast. U.S. forces now field over 5,300 drones in the air and as many as 12,000 robotic units on the ground, and the soldiers who operate them remotely, sitting in Las Vegas while attacking targets in Pakistan or Afghanistan, are reportedly facing more PTSD at home than their counterparts deployed with the regular forces.

The best pilots, it turns out, are kids who trained on Xbox video games.

More importantly, the issues of war are being turned on their heads.

People in the Middle East and elsewhere feel threatened by the drones, but they also see them as a sign that the USA is afraid to fight on their own turf. This will likely lead them to hate or despise the USA even more in the 21st century, as people playing what amounts to a video game do the killing rather than those who face other people, their pain and their threats, head on.

I agree that we should say WE DON'T LIKE WHERE THIS IS HEADING.

Check out the full interview below; it is the beginning of the debate.

In 2009 and onwards, the world needs to get laws on the books concerning robots and war, i.e. who is responsible, and for what.



Wired for War: http://www.democracynow.org/2009/2/6/wired_for_war_the_robotics_revolution


AMY GOODMAN: We turn now to war. Three days after President Obama took office, an unmanned US Predator drone fired missiles at houses in Pakistan’s Federally Administered Tribal Areas. Twenty-two people were reported killed, including three children. According to a tally by Reuters, the US has carried out thirty such drone attacks on alleged al-Qaeda targets inside Pakistan since last summer, killing some 250 people.


The Predator attacks highlight the US military’s increased use of unmanned aerial vehicles and other robotic devices on the battlefield. At the start of the Iraq war, the US had only a handful of drones in the air. Today the US has over 5,300 drones. They have been used in Iraq, in Afghanistan, Pakistan, Somalia and Yemen, as well as here at home. The Department of Homeland Security uses drones to patrol the US-Mexico border.


There has been a similar boom in the use of ground robotics. When US forces went into Iraq in 2003, they had zero robotic units on the ground. Now they have as many as 12,000. Some of the robots are used to dismantle landmines and roadside bombs, but a new generation of robots are designed to be fighting machines. One robot, known as SWORDS, can operate an M-16 rifle and a rocket launcher.


A new book has just come out examining how robots will change the ways wars are fought. It’s called Wired for War: The Robotics Revolution and Conflict in the 21st Century. The author, P.W. Singer, joins me here in the firehouse. He’s a senior fellow at the Brookings Institution, served as coordinator of the Obama campaign’s defense policy task force. He is also the author of Corporate Warriors and Children at War.


Welcome to Democracy Now!


P.W. SINGER: Thanks for having me.


AMY GOODMAN: Let’s start with Pakistan. Explain what these unmanned drones are.


P.W. SINGER: Well, you’re talking about systems that can be flown remotely. So, the planes, these Predator drones, are taking off from places in, for example, Afghanistan, but the pilots are physically sitting in bases in Nevada. And there, you have incredible capabilities. They can see from a great distance. They can stay in the air for twenty-four hours. And so, they’re very valuable in going after these insurgent and terrorist hide sites, which is in, you know, mountainous terrain, and it would be difficult to get US troops in.


But the flipside is that there’s a question of what’s the message that we think we are sending with these systems versus the message that’s being received on the ground, in terms of the broader war of ideas.


AMY GOODMAN: What do you mean?


P.W. SINGER: Well, so, I spent the last several years going around trying to meet with everyone engaged in this robotics revolution, everything from the scientists behind it to the science fiction authors influencing them, to the drone pilots, to the four-star generals, but also went out and interviewed people in the region.


And this question of messaging, one of the people that I met with was a senior Bush administration official, and he said, “The unmanning of war plays to our strength. The thing that scares people is our technology.” But that’s very different when you go meet with someone, for example, in Lebanon. One of the people that I met with for the book was an editor of a leading newspaper there. And what he had to say was that basically this shows that you are cowardly, that you are not man enough to come fight us. So there’s a disconnect between the message we think we’re sending and the message that’s being received.


Or another illustration of this would be, there was a popular music—one of the hit songs in Pakistan last year talked about how the Americans look at us like insects. Shows you how it’s permeating pop culture. So you have this balancing act that we’re facing between short-term goals and long-term goals.


AMY GOODMAN: P.W. Singer, the SWORDS, the CRAM, the PackBot—you talk about the robots taking on the “Three D’s.”


P.W. SINGER: The “Three D’s” are roles that are dull, dirty or dangerous, and they’re basically areas where they’ve found robotics have been useful. Dull would be, you know, you can’t keep your eyes open thirty hours straight; a robot can. So it can monitor empty desert for just in case something happens. Dirty is the environment. It can operate not only in, you know, chemical or biological but also in dust storms or at night. We can’t see at night. Things like that. And then, of course, dangerous is you can send out a robot to do things that you wouldn’t send a soldier to do. And the sort of joke of it is that, you know, when it comes to war, you are the weakest link.


Now, the problem is, what are the implications of that for our democracy? So, for example, if you are sending less and less Americans into harm’s way, does it make you more cavalier about the use of force? And one of the people that was fascinating that I interviewed was a former assistant secretary of Defense for Ronald Reagan, who actually had this to say about these systems. He worried that they would create more marketization of war, as he put it. We might have more shock and awe talk to defray discussion of the true costs of war.


AMY GOODMAN: But that is a very serious issue, when—I mean, the time when wars are ended is when one side cannot take the number of casualties, that, for example, if your soldiers that are fighting are being killed. But if they’re robots…


P.W. SINGER: I mean, the concern I have is that it takes certain trends that are already in play in our body politic. We don’t have declarations of war anymore. We don’t have a draft. We don’t buy war bonds anymore. We don’t pay higher taxes for war. And now you have the fact that you may be sending more and more machines instead of people. And so, you may be taking the already lowering bars to war and dropping them to the ground.


And then there’s another part of this, of course, is it changes the public’s relationship with war. It’s not just that the public is becoming de-linked, but remember, these machines record everything that they see. And so, we have the rise of what I call YouTube war. That is, you can go on YouTube right now and download video clips of combat footage, much of it from these drones. And so, in some ways, you could say that’s a good thing. The home front and war front are finally connected. You can see what’s going on. But we all know this is taking place in sort of our weird, strange world, and these video clips have become a form of entertainment for people. The soldiers call it war porn.


AMY GOODMAN: P.W. Singer, you write about robots making mistakes, like in South Africa in 2007, a software glitch in a robotic gun; in ’88, a semi-automatic defense system of the USS Vincennes accidentally shoots down an Iranian passenger plane, killing all 290 people on board, including sixty-six children.


P.W. SINGER: The challenge here is that while we are gaining incredible capabilities with these systems, it doesn’t end the human mistakes and dilemmas behind them. Another way of putting it is, a lot of people are familiar with Moore’s Law, the idea that you can pack in more and more computing power, such that they double in their power every two years. It’s the reason why the Pentagon in 1960 had the amount of computing power that you and I can get from a Hallmark card right now. Now, Moore’s Law is certainly operative. These systems are getting better and better. But Murphy’s Law is still in place. And so, you get what robot executives call these “oops” moments with robots, when things don’t work out the way you want. And it’s just like, you know, our laptop computers crash. Well, imagine your laptop computer crashing with an M-16 rifle.
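[A quick aside on the arithmetic Singer invokes: here is a minimal, purely illustrative Python sketch of that doubling, assuming the textbook two-year doubling period and using 1960 and 2009 as endpoints only because they frame his example; none of these figures come from the book.]

    # Back-of-the-envelope sketch of Moore's Law as invoked above:
    # computing power doubles roughly every two years, so capability
    # grows by a factor of 2 ** (years / doubling_period). The
    # endpoints and doubling period are illustrative assumptions.

    def moores_law_factor(start_year: int, end_year: int,
                          doubling_period: float = 2.0) -> float:
        """Growth factor implied by doubling every `doubling_period` years."""
        return 2.0 ** ((end_year - start_year) / doubling_period)

    factor = moores_law_factor(1960, 2009)
    # 2 ** 24.5 is roughly 24 million -- the scale of the jump from a
    # 1960s Pentagon mainframe to the chip in a musical greeting card.
    print(f"Implied growth, 1960 -> 2009: about {factor:,.0f}x")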


AMY GOODMAN: How does international law address robots in war?


P.W. SINGER: The problem right now is we don’t have a good answer to that question. Some of the people that I met with for the book were at both the International Red Cross and then also at Human Rights Watch. And there’s two sort of interesting things that came out of that.


At the Red Cross, they basically said, “There’s so much going on in the world that’s bad, we don’t have time to focus on something like this, something that’s like this kind of science fiction.” And that’s a completely justifiable thing to say. I mean, there’s a lot of bad things going on in the world, you know, Darfur, for example. But you could have said the exact same thing back in, say, 1942, where there was lots of bad things going on then, but it doesn’t mean that we shouldn’t have, if we had had the chance, launched a discussion about atomic weapons back then. And unlike with atomic weapons, robotics isn’t being worked on in secret desert labs that no one knows about; it’s in our face. It’s being worked on in, you know, suburban factories we all know about. We can see the video clips of it.


The Human Rights Watch visit was sort of interesting, but both funny, because while I was there, two of their lead people got in an argument over whether the Geneva Conventions were the best guideline or the Star Trek prime directive. And it kind of points to that we’re grasping at straws right now when it comes to regulating these machines. And this is—again, as you pointed out, this isn’t science fiction. We have 12,000 of them on the ground right now.


AMY GOODMAN: What happens if a robot commits a massacre?


P.W. SINGER: It’s a great question. You know, who do you hold responsible? Do you hold responsible the human operator? Do you hold responsible the commander who authorized them there? Do you hold responsible the software engineer who wrote it wrong? And how do you hold them accountable? We don’t have good answers.


And what was funny is, one person that I interviewed was a Pentagon robotic scientist. And he said, “You know what? You’re wrong. There’s no social or ethical or legal dimensions with robotics in war that we have to figure out.” He said, “That is, unless the machine kills the wrong person repeatedly.” Quote, “Then it’s just a product recall issue.” That isn’t the way I think we should be looking at the social, ethical and legal dimensions of all of this. And that’s why we need to launch a discussion about it. Otherwise, we’re going to make the very same mistake that a past generation did with atomic bombs, you know, not talking about them until Pandora’s box is already opened.


AMY GOODMAN: P.W. Singer, who makes these drones? What are the corporations involved?


P.W. SINGER: It is a wide industry that’s growing, and it includes both the large traditional defense contractors—and not just in the US. There’s forty-three other countries working on military robotics right now.


But it also includes companies that people may think of in a different way. The book opens with a visit to iRobot. iRobot is—it’s named after the Isaac Asimov novel and the—you know, the Will Smith movie. And they make the PackBot, which is one of the most popular robots in Iraq. It’s about the size of a lawnmower and goes out and defuses bombs, and now it’s being armed. But iRobot also makes the Roomba, the robot vacuum cleaner. And so, I joke it’s, you know, one of the few companies in the world that sells both to the Pentagon and also to Linens ‘n Things.


AMY GOODMAN: Who else? What other companies?


P.W. SINGER: Well, right down the road from iRobot is a company called Foster-Miller. And Foster-Miller is interesting because, just like iRobot, it was launched by three MIT engineers. Now, Foster-Miller is a fascinating company, because iRobot started out in robotics and moved into the defense world; Foster-Miller is a defense company that moved into the robotics world. And it’s actually a subsidiary of QinetiQ, which is also linked to, you know, the company that everybody loves to speculate about, the Carlyle Group.


AMY GOODMAN: Say more about the Carlyle Group.


P.W. SINGER: Oh, your listeners are probably more familiar with the Carlyle Group than I am. It’s, you know, one of the large investment firms based in D.C. that has sort of a who’s who of people on its board, and it’s the company that conspiracy theorists love to hate.


AMY GOODMAN: P.W. Singer, you write about—in a previous book, Corporate Warriors, you write about mercenaries.


P.W. SINGER: Mm-hmm.


AMY GOODMAN: Do you see a link between mercenaries and robots, having to do with deniability, not being included in the body count, and other issues?


P.W. SINGER: That’s a really great question. And a lot of people ask, you know, “How do you—how did you—you first did a book on private contractors, then a book on child soldiers, and now one on robots. You know, what’s the thread that links these?” And there’s two.


One is that the sands are shifting underneath us in warfare, and we’re in denial about it. We have an assumption of who fights wars, and it’s usually a man in a uniform, and that means, oh, well, he’s part of a military, and he must be fighting on behalf of a government, and it must be politics and patriotism. But look at, for example, the rise of the private military industry. That is someone who’s fighting not on behalf of a government, but a private corporation. Profit motive is involved. Child soldier is another breakdown. Ten percent of the combatants in the world today are children. You know, war is not an adult game anymore. And then, with robotics, it’s sort of the ultimate breakdown of humankind’s 5,000-year-old monopoly on war.


But the second thing, as you raised, is an important point, that the use of contractors, as well as the movement towards machines, is, in a sense, a sort of outsourcing of responsibility, an outsourcing of risk, trying to avoid some of the political costs that go to war. You know, I’m often asked, “Well, does this save us money?” Well, that’s not the right question on either the contractor issue or the digital issue. We aren’t using these systems simply because they save money. It’s because they allow us to avoid certain political costs.


AMY GOODMAN: Are these robotic technologies available to the mercenary companies like Blackwater?


P.W. SINGER: Yeah. There’s a section in the book. I call it “Soldiers of Fortran,” after the old software program Fortran. And there’s a great story in it, which actually encapsulates some of the weird ways this is going, where a group of college students fundraised money to do something about Darfur, and they ended up actually raising about a half-million dollars. It went well beyond their wildest dreams.


And so, then they explored whether they could hire their own private military company. And they called—you know, sent messages out via email, and a number of private military companies called them back, to their dorm room, and one of them actually offered to lease them some drones and—to use in Darfur. And the kids were talking about this. They didn’t imagine it would take off like this, but it did. Now, fortunately, some other people spoke with them and said, “Hey, this is really not the best use of the money that you fundraised.”


But it points to how these systems, they’re not just accessible to militaries. As you noted, they’re being used by DHS, by police agencies. And, of course, many of them use commercial technology. For $1,000, you could do it yourself. You can build your own version of a Raven drone. And so—


AMY GOODMAN: How?


P.W. SINGER: Well, it’s a little more complex than we can go into on the show, but basically there’s a do-it-yourself kit for building very similar to a Raven drone. And the point here is that you have—


AMY GOODMAN: And the drone shoots people?


P.W. SINGER: That drone isn’t armed. It’s the ability to sort of toss it in the air, and it could go off and, a mile away, show you what’s on the other side of that hill. Now, of course, you could probably jury-rig it yourself to do bad things.


What I’m getting at is that just as software has gone open source, so has warfare. And these systems are not something that requires a massive industrial complex to build, like an aircraft carrier would or like an atomic weapon would. They use commercial technology. And so, that means that they can be used both for good and ill, by actors that have both good intent and bad intent. And the ethical question that we need to think about, you know, when we talk about robots and ethics—most people just want to talk about, you know, Asimov’s laws—well, we also need to think about the ethics of the people behind the robots.


AMY GOODMAN: Well, let’s talk about specifically the experience of the soldiers who are now pushing buttons. They, themselves, their lives, are not at risk. They’re not experiencing the person at the other end.


P.W. SINGER: Yeah, when I use the term “robotics revolution,” I need to be clear here. You know, I’m not talking about a revolution where, you know, your Roomba vacuum cleaner is going to sneak up and ambush you. We’re talking about a revolution in the way wars are fought and who fights them. And this aspect of distance is one of the big ones. It changes the very meaning of going to war.


You know, my grandfather served in the Pacific fleet in World War II. When he went to war, he went to a place where danger took place, and the family didn’t know if he was ever coming back. And that’s very different than the experience of, for example, a Predator drone pilot that I met with who described that basically his experience of fighting in the Iraq war was getting in his Toyota Corolla, driving to work—he’s doing this in Nevada—driving into work for twelve hours, he puts missiles on targets, then gets back in the Toyota, commutes back home, and within twenty minutes he’s talking to his son at the dinner table.


AMY GOODMAN: When you say “puts missiles on targets,” you mean bombs.


P.W. SINGER: Hellfire missiles. You know, he’s basically—he is engaging—


AMY GOODMAN: He attacks, I mean.


P.W. SINGER: He is engaging in combat. But he’s doing it from 7,000 miles away. And then, at the end of the day, he goes to a PTA meeting or takes his kid to soccer practice. And so, it’s a whole new experience of war, which is actually creating a new concept of a warrior.


AMY GOODMAN: But you write, interestingly, that these soldiers who are engaging in war remotely actually suffer greater rates of PTSD, post-traumatic stress disorder.


P.W. SINGER: Yeah, it actually creates a huge psychological disconnect for them and challenges for them on a lot of different levels. And one of the people I met with was a commander of one of these squadrons, and he said it was actually tougher leading an unmanned squadron based back home than it was leading a squadron based in the Middle East, deployed.


And it was a couple things. One is the distancing. You’re at home, but you’re watching these scenes of violence. You’re watching both Americans die in front of you and not able to do anything about it, or you’re engaging in kills on enemies, and you’re seeing it. And then you leave the room, and it’s just like nothing else happened. Your wife is asking you about, you know, why didn’t—you were late for a PTA meeting. You also have the fact that war is 24/7 now. And so, you’re at home, but your time schedule is off because of—you know, it’s 7,000 miles away.


There’s a lot of psychological challenges that we’re just trying to figure out. And there’s also a little bit of denial, in terms of the military support, because these guys don’t want to seem like they’re soft, and they don’t want to say, you know, “I’m suffering worse than the guys in the field.” And they’re not, in many cases, getting as much support as they should be.


AMY GOODMAN: What about the use of drones here at home, for example, on the border?


P.W. SINGER: It’s a fascinating thing, because it raises this question of who should be allowed to have these systems and how should they use them. So, for example, DHS saw the success that the military was having with drones and said, you know, “We want our own.” They bought them using mainly counterterrorism money, but, of course, we know they’re being used for a different kind of infiltrator across the border than al-Qaeda agents. They’ve been used to deal with immigrants and some drug smuggling.


AMY GOODMAN: Are they armed?


P.W. SINGER: No, those ones are not armed, but these are issues that you have to think about, moving forward. Who should be allowed to have armed systems or not? Should police be allowed to have it? You know, the LA Police Department is exploring purchasing a drone to put above a high crime neighborhood and just have it fly above and document everything. Is that something we think is OK for our rights? What about—remember, drones aren’t just big large planes. Some of them are as small as—they could fit on your fingertip, and they could climb up to a windowsill and peer inside.


AMY GOODMAN: The relationship with games? You write that the best pilot is an eighteen-year-old kid who trained on an [Xbox] video game?


P.W. SINGER: Pretty much. It’s a sort of fascinating story of—


AMY GOODMAN: Xbox.


P.W. SINGER: Yeah. He was actually a high school dropout who wanted to join the military to make his father proud. He wanted to be a helicopter mechanic. And they said, “Well, you failed your high school English course, so you’re not qualified to be a mechanic. But would you like to be a drone pilot?” And he said, “Sure.” And it turned out, because of playing on video games, he was already good at it. He was naturally trained up. And he turned out to be so good that they brought him back from Iraq and made him an instructor in the training academy, even though he’s an enlisted man and he’s still—he was nineteen.


And the fascinating thing is, you go, “That’s an interesting story.” You tell that story to someone in the Air Force, like an F-15 pilot, and they go, “I do not like where this is headed. You know, I’ve got a college education. The military spent $5 million training me up. And you’re telling me that this kid, this nineteen-year-old—and, oh, by the way, he’s in the Army—is doing more than I am?” And that’s the reality of it.


AMY GOODMAN: ABC News says you wrote the campaign paper for Obama on robots?


P.W. SINGER: I served as defense policy coordinator for the task force that advised him on defense policy, so not just robotics, but across the board. And it was an amazing experience. And one of the things that I was proud of is how we brought together a really diverse set of advisers and experts, you know, people—everything from retired generals to young veterans to academic experts, a whole mix that gathered around helping to advise who is now the President and what our policies should be.


AMY GOODMAN: I want to thank you very much for being with us, P.W. Singer, senior fellow at the Brookings Institution. His new book is Wired for War: The Robotics Revolution and Conflict in the 21st Century.
