January 19, 2009

Towards a 'largely robotic' battlefield

A new book will hit store shelves later this week that will be of interest to those concerned about the ongoing roboticization and dehumanization of warfare. The book, Wired for War: The Robotics Revolution and Conflict in the 21st Century, is authored by P. W. Singer, the director of the 21st Century Defense Initiative at the Brookings Institution. He has also published Children at War (2005) and Corporate Warriors: The Rise of the Privatized Military Industry (2003).

In a recent Wilson Quarterly article, Singer makes the claim that Pentagon planners are already preparing for battlefields that will be, as they put it, "largely robotic." It's no secret that the U.S. military is developing a variety of unmanned weapons and seemingly futuristic technologies -- everything from automated machine guns and robotic stretcher bearers to tiny but lethal robots the size of insects.

As these weapons gain more and more autonomy, deeper questions arise. Singer poses several: "Can the new armaments reliably separate friend from foe? What laws and ethical codes apply? What are we saying when we send out unmanned machines to fight for us? What is the 'message' that those on the other side receive?" And ultimately, asks Singer, how will we remain masters of weapons that are immeasurably faster and more "intelligent" than we are?

Proxy killing

A fundamental problem, as Singer sees it, is the ease with which killing can now take place. He cites the example of the Predator, an unmanned aerial vehicle (UAV). This propeller-powered drone is 27 feet in length, can spend up to 24 hours in the air and flies at a height of 26,000 feet. Predators are flown by "reach-back" or "remote-split" operators -- military personnel who are 7,500 miles away and who fly the planes via satellite from a set of converted single-wide trailers located mostly at Nellis and Creech Air Force bases in Nevada.

This type of operation has created a rather novel situation where "pilots" experience the psychological disconnect of being "at war" while dealing with their daily domestic routines. Singer notes the words of one Predator pilot: "You see Americans killed in front of your eyes and then have to go to a PTA meeting." Says another: "You are going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants, and then you get in the car, drive home, and within 20 minutes you are sitting at the dinner table talking to your kids about their homework."

These days there are more than 5,300 drones in the U.S. military's total arsenal, and not a single mission happens without them. The Pentagon predicts that future conflicts could involve tens of thousands of drones.

Better than humans

The appeal of robots is obvious. They don't need to be returned home in body bags after they've been shot down. Moreover, robots don't come with typical human frailties and foibles. "They don't get hungry," says Gordon Johnson of the Pentagon's Joint Forces Command. "They're not afraid. They don't forget their orders. They don't care if the guy next to them has just been shot. Will they do a better job than humans? Yes." Johnson's comments sound eerily like the script from a Terminator movie.

And as these technologies improve, human capabilities are being pushed to their limits. Today's F-16 fighter jet can maneuver so fast and hard that its pilots black out. As a DARPA official has noted, "the human is becoming the weakest link in defense systems." Moving forward, autonomous weaponry will increasingly be used in place of humans. Eventually it will be robot versus robot -- especially when the theater of operations starts to function at technologic speed. The Pentagon is aware of this possibility, noting that "as the loop gets shorter and shorter, there won't be any time in it for humans."
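
To make the timescale argument concrete, here is a back-of-envelope sketch in Python. Every number in it is an illustrative assumption of mine, not a published figure; the point is only the ratio between a human decision cycle and an automated one:

# Illustrative only: assumed reaction times, not official data.
HUMAN_DECISION_S = 1.5    # assumed: a person perceives, decides, and acts
AUTOMATED_LOOP_S = 0.05   # assumed: sensor-to-weapon latency of a machine

def engagement_window_s(distance_m, speed_m_s):
    """Seconds available to respond to a threat closing at constant speed."""
    return distance_m / speed_m_s

# A hypothetical missile detected 10 km out, closing at Mach 2 (~686 m/s):
window = engagement_window_s(10_000, 686)
print(f"engagement window: {window:.1f} s")                            # ~14.6 s
print(f"human decisions possible:   {window / HUMAN_DECISION_S:.0f}")  # ~10
print(f"machine decisions possible: {window / AUTOMATED_LOOP_S:.0f}")  # ~292

Halve the detection range or double the closing speed and the human share of the loop collapses toward zero -- which is exactly the Pentagon's point.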

Failure to override

The inevitable question arises: who will control robots that work autonomously and at suprahuman 'technologic' speed? There are already disturbing examples of 'failure to override' incidents -- cases in which machines function outside of human control and can't be shut down. Today's Navy ships use the Aegis computer system, which enters a "casualty" mode when all the humans onboard are dead. In this situation the guns go into a kind of berserker mode and the computer does its best to ensure that the ship doesn't get hit. As Singer notes, "Humans can override the Aegis system in any of its modes, but experience shows that this capability is often beside the point, since people hesitate to use this power. Sometimes the consequences are tragic."

Part of the problem is that humans are starting to give intelligent systems the benefit of the doubt. In many cases the human power "in the loop" is actually only veto power -- and even that is a power military personnel are often unwilling to use against the quicker (and, in their view, superior) judgment of a computer.
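
The difference between authorizing a shot and merely being able to cancel one is easy to miss, so here is a toy Python sketch of "veto-only" control. Every name and timing in it is hypothetical; it does not describe the real Aegis system, only the control flow Singer worries about:

# Hypothetical sketch: the machine engages by default; the human can
# only object within a short window. Not based on any real system.
VETO_WINDOW_S = 2.0  # assumed time the operator has to intervene

def machine_recommends_engagement(track_id):
    # Stand-in for the computer's targeting judgment.
    return True

def human_vetoes(track_id, window_s):
    # Stand-in for the operator. Returning False models the hesitation
    # Singer describes: deferring to the computer's "superior" judgment.
    return False

def engage(track_id):
    if not machine_recommends_engagement(track_id):
        return "hold fire"
    # Note the inversion: the human does not authorize the shot;
    # the human can only interrupt it before the window closes.
    if human_vetoes(track_id, VETO_WINDOW_S):
        return "vetoed by operator"
    return "engaged automatically"

print(engage("track-042"))  # prints: engaged automatically

The human "in the loop" here never issues a fire order; silence is consent. That inversion is what turns override power into a formality.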

The next step in this trend is to give robots the ability to fire back on their own. As Johnson notes:
Anyone who would shoot at our forces would die. Before he can drop that weapon and run, he's probably already dead. Well now, these cowards in Baghdad would have to pay with blood and guts every time they shot at one of our folks. The costs of poker went up significantly. The enemy, are they going to give up blood and guts to kill machines? I'm guessing not.
Johnson views this development as not only logical but quite attractive.

Removing the human factor

Retired Army colonel Thomas Adams believes that the speed, confusion, and information overload of modern-day war will soon move the whole process outside "human space." He predicts that future weapons will be too fast, too small, and too numerous, creating an environment simply too complex for humans to direct.

The Joint Forces Command is well aware of this possibility, noting that autonomous battlefield robots will be the norm within 20 years. Military and robotics developers predict that robots as fully capable as human soldiers will start to appear on the battlefield sometime between 2025 and 2035. This would mark a pivotal point in human history. The next war, claims Singer, could be fought partly by robots that respond to spoken commands in plain English and then figure out on their own how to get the job done.

When war becomes too easy

War is hell -- well, at least it's been that way in the past. Democratic governments and their citizens have had to be extremely careful about entering into costly and emotionally wrenching conflicts. But Singer now worries that unmanned systems represent the ultimate break between the public and its military:
With no draft, no need for congressional approval (the last formal declaration of war was in 1941), no tax or war bonds, and now the knowledge that the Americans at risk are mainly just American machines, the already falling bars to war may well hit the ground. A leader won't need to do the kind of consensus building that is normally required before a war, and won't even need to unite the country behind the effort. In turn, the public truly will become the equivalent of sports fans watching war, rather than citizens sharing in its importance.
In such a scenario, cheap and seemingly costless unmanned wars would carry far fewer political repercussions. Singer argues that this is a frightening prospect, one that would "pervert the whole idea of the democratic process and citizenship as they relate to war." His fear is that a citizenry with no sense of the horrors and true cost of war will choose to go to war like any other policy decision, "weighed by the same calculus used to determine whether to raise bridge tolls." Public engagement will give way to indifference and the titillation of war porn on YouTube.

Singer's prognosis is grim:
When technology turns war into something merely to be watched, and not weighed with great seriousness, the checks and balances that undergird democracy go by the wayside. This could well mean the end of any idea of democratic peace that supposedly sets our foreign-policy decision making apart.
We're heading down a very strange and treacherous path.

4 comments:

Anonymous said...

I'm long on the record about the political dangers of war drones, but I don't think the real danger lies in the judgment of most Americans once the costs of war in lives and treasure fall. Rather, it's in the judgment of the very few Americans who will be tempted to use them to fight wars in secrecy.

You see, I've got that YouTube thing at my house too, and the only "war porn" I've really seen on it so far wasn't made by elite snipers or gunship crews showing off how cool they and their toys are; it's things like backfired Israeli TV broadcasts featuring parents bemoaning the loss of their families to shelling.

Eliminating casualties on your side doesn't reduce the true cost of war. It simply removes a distraction that once made it easier to ignore the "costs" one inflicts on the other side -- costs that are a lot more difficult to hide than they used to be.

Anonymous said...

This makes me wonder whether the next generation of army professionals will be recruited from the growing ranks of professional computer gamers.

Russell Blackford said...

I need to think a lot more about this issue, but I do find myself sceptical when I come across a lot of left-leaning angst about the increasing power of machine systems like this (I don't mean in this forum).

Often, it seems to me that the people making the criticism are motivated by pacifism or a distaste for warfare (which is fair enough) that leads them to criticise any advance in military technology, even advances that may actually reduce the suffering and loss of life on both sides of a military conflict.

Admittedly, the development of more and more powerful military technologies may have a downside. It just might make wars seem more tempting to the nation or coalition with the superior technology. It might well encourage the nations or other groups with inferior technology to rely on asymmetric methods that may breed greater fanaticism and involve more act-by-act cruelty.

Still, my hunch is that these technologies are likely to be desirable on balance, because they are likely to reduce rather than increase the total suffering in the world, as well as increasing the ability of nations that are more-or-less progressive to defend themselves from some kinds of enemies.

At the same time, their development in some form is probably inevitable. It's futile, and probably not even reasonable, to ask the Western powers not to develop technologies that promise to give them a military edge. The question is how we can avoid a situation where the inevitable availability of this kind of technology leads to greater suffering in the world in the form of wars that would otherwise not happen. I tend to think that this will have to be done by mechanisms other than trying to relinquish or impede the development of the technologies, though there may be some specific technologies that we should not develop (because of the suffering they could cause or their lack of discrimination, for example). As so often, it looks to me as if the devil is likely to be in the detail.

Unknown said...

It is well that war is so terrible, or we should grow too fond of it.
--Robert E. Lee