
March 29, 2011

Stuxnet: A 'cybernetic weapon of mass destruction'

Check out Ralph Langner's rather harrowing TED Talk about the Stuxnet computer worm:


When first discovered in 2010, the Stuxnet computer worm posed a baffling puzzle. Beyond its unusually high level of sophistication loomed a more troubling mystery: its purpose. Ralph Langner and team helped crack the code that revealed this digital warhead's final target -- and its covert origins. In a fascinating look inside cyber-forensics, he explains how.
Ralph Langner is a German control system security consultant. He has received worldwide recognition for his analysis of the Stuxnet malware.

January 23, 2011

DARPA to develop new unified mathematical language for military

DARPA is working to create a unified mathematical language for everything the military sees or hears:
The armed forces are overwhelmed by all the data their various sensors are sniffing out. They want a single data stream that combines drone video feeds, cell phone intercepts, and targeting radar. Darpa’s solution, found in the brand-new Mathematics of Sensing, Exploitation, and Execution program, is to design an algorithm that teaches the sensors how to interpret the world — how to think, how to learn and what data, accordingly, to collect.

Sensors “process their signals as if they were seeing the world anew at every instant,” Darpa laments in its call for algorithms. To put it in Philosophy 101 terms, existence is, to a sensor, what William James called a “blooming, buzzing confusion”: an unmediated series of events to be vacuumed up, leaving an analyst overloaded with unsorted data. Wouldn’t it be better if a sensor could be taught how to filter the world through a perceptual prism, anticipating what the analyst needs to know?

That’s the specific military application of MSEE. But to get there, Darpa takes a rather unconventional path. To get the “economy and efficiency that derives from an intrinsic, objective-driven unification of sensing and exploitation,” it wants to create an “intrinsically integrated” algorithm for the machines to interpret reality. “All proposed research must describe a unifying mathematical formalism that incorporates stochasticity fundamentally,” Darpa tells would-be designers.
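
To make this less abstract, here is a minimal sketch of what an "objective-driven unification of sensing and exploitation" might look like in code: a discrete Bayes filter whose next measurement is chosen to maximize expected information gain, so the sensing itself is steered by the objective rather than by blind collection. This is purely illustrative -- a toy grid-search task with a made-up sensor noise model, not DARPA's actual MSEE formalism:

```python
# A toy illustration of objective-driven sensing (NOT DARPA's MSEE
# formalism): a Bayes filter over a hypothetical 1-D grid that picks
# its next measurement to minimize expected posterior entropy.
import numpy as np

N = 10                        # assumed grid of cells to watch
belief = np.full(N, 1.0 / N)  # prior: target equally likely anywhere
P_HIT, P_FALSE = 0.9, 0.1     # assumed sensor noise model (stochasticity)

def update(belief, cell, detected):
    """Bayes update of the belief after sensing a single cell."""
    like = np.where(np.arange(N) == cell,
                    P_HIT if detected else 1 - P_HIT,
                    P_FALSE if detected else 1 - P_FALSE)
    post = like * belief
    return post / post.sum()

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def best_cell(belief):
    """Choose the cell whose measurement minimizes expected entropy."""
    scores = []
    for c in range(N):
        p_det = P_HIT * belief[c] + P_FALSE * (1 - belief[c])
        exp_h = (p_det * entropy(update(belief, c, True)) +
                 (1 - p_det) * entropy(update(belief, c, False)))
        scores.append(exp_h)
    return int(np.argmin(scores))

rng = np.random.default_rng(0)
target = 7                    # hidden ground truth, unknown to the filter
for _ in range(12):
    c = best_cell(belief)     # sensing driven by the objective, not habit
    detected = rng.random() < (P_HIT if c == target else P_FALSE)
    belief = update(belief, c, detected)
print("most likely cell:", int(belief.argmax()))
```

The point of the toy is the loop structure: inference (exploitation) and measurement selection (sensing) share a single probabilistic model, which is roughly what "intrinsically integrated" seems to mean here.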

December 3, 2010

Jonathan Moreno: "Enhancement and National Security" [CFI conference on biomedical enhancement]

Jonathan Moreno, author of Mind Wars (2006), presenting on "Enhancement and National Security." I blogged about his book here. He's also co-author of "Slouching Toward Policy: Lazy Bioethics and the Perils of Science Fiction," which I blogged about here.

National Security, the Brain and Behavior: the post-WWII era:
  • Hallucinogens
  • Neuropsychiatry of stress
  • Personality theory
  • Parapsychology
  • Performance enhancement
"Brainwashing" was an immediate concern after WWII. Work on hallucinogens took off soon thereafter, including extensive work by the CIA on LCD. Even ESP studies (by J. B. Rhine), coined the term "Psiops."

Quote from that time:
“The claimed phenomena and applications” presented by several military officers “range from the incredible to the outrageously incredible. The ‘anti-missile time warp,’ for example, is somehow supposed to deflect attack from nuclear warheads so that they will transcend time and explode among the ancient dinosaurs…. One suggested application is a conception of the ‘First Earth Battalion,’ made up of ‘warrior monks’…including the use of ESP, leaving their bodies at will, levitating, psychic healing and walking through walls.”
"The Men Who Stare At Goats" [trailer]:



Today, some remarkable work is being done with fMRI. Mapping and baseline readings currently being collected can be used to understand and predict human behavior, even showing your thoughts 'on screen.' Transcranial magnetic stimulation (TMS) induces changes in brain activation and could be used to alter a person's social behavior or attitudes. It influences brain functions including physical movement, visual perception, memory, reaction time, speech and mood.

Optogenetics: Seems to provide more specific information and control over neurons and their transmission.

Check out the Giving the Grunts an Upgrade graphic from Wired (2007). Many of these soldier technologies are available now, not to mention the networking capabilities now at hand.

Potential for augmented reality. Soldiers will have their realities augmented -- e.g. a building painted in red signifies dangerous activity. Network effects: Objects tagged based on intel.

The "Anti-Conscience" pill. Beta blockers can be used to treat stress, prevent PTSD. Suppress release of hormones like norepinephrine that help encode memory. Might also reduce guilt feelings.

The trust drug? Natural oxytocin production is associated with trust behavior. May be artificially administered in a spray to encourage cooperation. Use in interrogations?

National Research Council May 12, 2009 predictions:

Near term (within 5 years)
  • Immersive virtual reality
  • Heartbeat variability
  • Galvanic skin response
Medium term (5-10 years)
  • In-helmet EEG for brain-machine interface
  • Head and torso impact protection
  • Biomarkers for predicting soldier response to environmental stress
Far term (10-20 years)
  • In-vehicle deployment of transcranial magnetic stimulation
  • Brain scanning to assess physiology
Ongoing (within 5 years with continued updating)
  • Field-deployable biomarkers of neural state
  • Biomarkers for sleep levels
In addition: advanced "lie detector" tests, including portable versions.

Q&A:
  • 90% of what DARPA does is bunk
  • There is no gene that is going to tell you who a terrorist is
  • There is no scanning technology that is going to tell you the intention of a would-be terrorist
  • No evidence that oxytocin was used in Guantanamo
  • Oxytocin makes you more trusting, but not more gullible
  • "Enablers" for soldiers, aka enhancements, may be detrimental to soldiers post-deployment: this is a potential problem. It is also a current problem ie PTSD; "it's a problem, but not necessary our problem" - DoD
  • "Everybody who goes to war feels they've been experimented upon"

August 22, 2010

Mind Wars: Brain Research and National Defense [book]

Along the lines of my previous post on neurosecurity and information warfare, check out this book by Jonathan D. Moreno: Mind Wars: Brain Research and National Defense (2006). Synopsis:
Imagine a future conflict in which one side can scan from a distance the brains of soldiers on the other side and learn what they may be planning or whether they are confident or fearful. In a crisply written book, University of Virginia ethicist Moreno notes that military contractors have been researching this possibility, as well as the use of electrodes embedded in soldiers' and pilots' brains to enhance their fighting ability. Moreno (Is There an Ethicist in the House?) details the Pentagon's interest in such matters, including studies of paranormal phenomena like ESP, going back several decades. Readers learn that techniques like hypersonic sound and targeted energetic pulses to disable soldiers are close to being used in the field, and even have everyday applications that make "targeted advertising" an understatement. Despite the book's title, Moreno doesn't limit his discussion to brain-related research; he explains the military's investigation of how to enhance soldiers' endurance and reaction time in combat as well as various nonlethal disabling technologies. The ethical implications are addressed throughout the book, but the author leaves substantive discussion to his praiseworthy last chapter.
I really don't know what to make of these claims that the US military is delving into the paranormal. Almost sounds like deliberate disinformation. Or that the higher-ups can't distinguish between sound scientific principles and the work of quacks.

Neurosecurity: The mind has no firewall

Neurosecurity and the potential for so-called 'mind hacking' has interested me for quite some time now, so I was surprised to discover that this topic was covered back in 1997 by Timothy L. Thomas. Writing in Parameters, the US Army War College journal, Thomas warned that the American military risked falling behind in the burgeoning field of information warfare.

His particular concern was that military systems operators could be exploited as 'open systems.' "We need to spend more time researching how to protect the humans in our data management structures," he writes, "Nothing in those structures can be sustained if our operators have been debilitated by potential adversaries or terrorists who--right now--may be designing the means to disrupt the human component of our carefully constructed notion of a system of systems."

Thomas continues,
This "systems" approach to the study of information warfare emphasizes the use of data, referred to as information, to penetrate an adversary's physical defenses that protect data (information) in order to obtain operational or strategic advantage. It has tended to ignore the role of the human body as an information- or data-processor in this quest for dominance except in those cases where an individual's logic or rational thought may be upset via disinformation or deception. As a consequence little attention is directed toward protecting the mind and body with a firewall as we have done with hardware systems. Nor have any techniques for doing so been prescribed. Yet the body is capable not only of being deceived, manipulated, or misinformed but also shut down or destroyed--just as any other data-processing system. The "data" the body receives from external sources--such as electromagnetic, vortex, or acoustic energy waves--or creates through its own electrical or chemical stimuli can be manipulated or changed just as the data (information) in any hardware system can be altered.
---
Others, however, look beyond simple PSYOP ties to consider other aspects of the body's data-processing capability. One of the principal open source researchers on the relationship of information warfare to the body's data-processing capability is Russian Dr. Victor Solntsev of the Baumann Technical Institute in Moscow. Solntsev is a young, well-intentioned researcher striving to point out to the world the potential dangers of the computer operator interface. Supported by a network of institutes and academies, Solntsev has produced some interesting concepts. He insists that man must be viewed as an open system instead of simply as an organism or closed system. As an open system, man communicates with his environment through information flows and communications media. One's physical environment, whether through electromagnetic, gravitational, acoustic, or other effects, can cause a change in the psycho-physiological condition of an organism, in Solntsev's opinion. Change of this sort could directly affect the mental state and consciousness of a computer operator. This would not be electronic war or information warfare in the traditional sense, but rather in a nontraditional and non-US sense. It might encompass, for example, a computer modified to become a weapon by using its energy output to emit acoustics that debilitate the operator. It also might encompass, as indicated below, futuristic weapons aimed against man's "open system."
There's some great food for thought here, but as an important aside, it's worth noting that this article has a high bullshit-to-reality ratio. Overly enamored of the pseudoscientific areas of inquiry explored by his Russian colleagues, Thomas, quite bizarrely, placed as much credence in paranormal and psychotronic weapons as he did in viable alternative weapons (such as energy-based weapons). Consequently, the credibility of the entire article has to be thrown into question; I advise you to take this essay with a considerable grain of salt.

Link: "The Mind Has No Firewall"

H/T: JD

April 27, 2009

TED: P.W. Singer: Military robots and the future of war


For his TED talk, military robotics expert P.W. Singer shows how the widespread use of robots in war is changing the realities of combat. His talk is alarming and sobering -- but it needs to be said. In addition to this video, I suggest you read the article, "Towards a largely robotic battlefield."

Singer's bio:
Peter Warren Singer is the director of the 21st Century Defense Initiative at the Brookings Institution -- where his research and analysis offer an eye-opening take on what the 21st century holds for war and foreign policy. His latest book, Wired for War, examines how the U.S. military has been, in the words of a recent US Navy recruiting ad, "working hard to get soldiers off the front lines" and replacing humans with machines for bombing, flying and spying. He asks big questions: What will the rise of war machines mean to traditional notions of the battlefield, like honor? His 2003 book Corporate Warriors was a prescient look at private military forces. It's essential reading for anyone curious about what went on to happen in Iraq involving these quasi-armies.

Singer is a prolific writer and essayist (for Brookings, for newspapers, and for Wired.com’s great Threat Level), and is expert at linking popular culture with hard news on what’s coming next from the military-industrial complex. Recommended: his recent piece for Brookings called "A Look at the Pentagon's Five-Step Plan for Making Iron Man Real."
Via Theoretical Transhumanism.

April 8, 2009

Military robots will soon be able to fire on their own

The U.S. Defense Department is reportedly looking to develop autonomous armed robots that will eventually be able to find and destroy targets on their own. Instead of being controlled remotely, unmanned drones will have on-board computer programs that can decide whether they should fire their weapons.

"The trend is clear: Warfare will continue and autonomous robots will ultimately be deployed in its conduct," writes Ronald Arkin, a robotics expert at the Georgia Institute of Technology in Atlanta. "The pressure of an increasing battlefield tempo is forcing autonomy further and further toward the point of robots making that final, lethal decision," he predicted. "The time available to make the decision to shoot or not to shoot is becoming too short for remote humans to make intelligent informed decisions."

According to John Pike, an expert on defense and intelligence matters, autonomous armed robotic systems will probably be operating by 2020.

But many fear that these robots will be unable to distinguish between legitimate targets and civilians in a war zone.

"We are sleepwalking into a brave new world where robots decide who, where and when to kill," said Noel Sharkey, an expert on robotics and artificial intelligence at the University of Sheffield, England.

More.

April 5, 2009

The Battlefield Extraction-Assist Robot (BEAR)

The Battlefield Extraction-Assist Robot (BEAR) has hydraulic arms that can support injured soldiers weighing up to 400 lbs -- more than most troopers in full gear -- and a system of wheels, tracks and joints that enable it to maneuver in all sorts of positions. The unit is being developed by VECNA Technologies primarily for military search-and-rescue missions.

The robot's legs also have wheels and tracks built into them that the robot can switch to by kneeling. Its head is designed to look like that of a teddy bear, to provide reassurance to the wounded soldier it is transporting.

It can balance on its back wheels to climb up a steep hill or roll over rough terrain while staying low to the ground. BEAR currently requires a human to drive it via remote control, but a more autonomous version is in the works.

March 30, 2009

The perils of nuclear disarmament: How relinquishment could result in disaster

Most everyone agrees that humanity needs to get rid of its nuclear weapons. There's no question that complete relinquishment will all but eliminate the threat of deliberate and accidental nuclear war and the ongoing problem of proliferation.

Indeed, the ongoing presence of nuclear weapons is the greatest single threat to the survival of humanity. To put the problem into perspective, there are currently 26,000 nuclear warheads ready to go -- 96% of which are controlled by the United States and Russia. These two countries alone could unleash the power of 70,000 Hiroshimas in a matter of minutes. In the event of an all-out nuclear war between the U.S. and Russia, it is estimated that as many as 230 million Americans and 56 million Russians would be killed by the initial blasts. The longer term impacts are nearly incalculable, but suffice it to say human civilization would be hard pressed to survive.
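
A quick back-of-envelope check on those figures (a sketch only: the 26,000-warhead and 70,000-Hiroshima numbers are the ones cited above, and the ~15-kiloton Hiroshima yield is the commonly cited estimate):

```python
# Sanity-check the "70,000 Hiroshimas" figure (rough arithmetic only).
warheads_total = 26_000      # figure cited in this post
us_russia_share = 0.96       # figure cited in this post
hiroshima_kt = 15            # "Little Boy" yield, commonly cited ~15 kt

us_russia_warheads = warheads_total * us_russia_share
implied_avg_kt = 70_000 * hiroshima_kt / us_russia_warheads
print(f"US + Russia warheads: {us_russia_warheads:,.0f}")   # ~24,960
print(f"implied average yield: {implied_avg_kt:.0f} kt")    # ~42 kt
# An implied average of ~42 kt per warhead is, if anything,
# conservative: many deployed strategic warheads are in the
# 100-500 kt range.
```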

Given the end of the Cold War and the establishment of the START agreements, the idea of a deliberate nuclear war seems almost anachronistic. But the potential nightmare of an accidental nuclear exchange is all too real. We have already come very close on several occasions, including the Stanislav Petrov incident in 1983. We are living on borrowed time.

The assertion, therefore, that we need to completely rid ourselves of nuclear weapons appears more than reasonable; our very survival may depend on it. In fact, there are currently a number of initiatives underway that are working to see this vision come true. President Barack Obama himself has called for the complete elimination of nuclear weapons.

But before we head down the path to disarmament, we need to consider the consequences. Getting rid of nuclear weapons is a more difficult and precarious proposition than most people think. It's important therefore that we look at the potential risks and consequences.

There are a number of reasons for concern. A world without nukes could be far more unstable and prone to both smaller and global-scale conventional wars. And somewhat counter-intuitively, the process of relinquishment itself could increase the chance that nuclear weapons will be used. Moreover, we have to acknowledge the fact that even in a world free of nuclear weapons we will never completely escape the threat of their return.

The Bomb and the end of global-scale wars

The first and (so far) final use of nuclear weapons during wartime marked a seminal turning point in human conflict: the development of The Bomb and its presence as an ultimate deterrent has arguably preempted the advent of global-scale wars. It is an undeniable fact that an all-out war has not occurred since the end of World War II, and it is very likely that the threat of mutually assured destruction (MAD) has had a lot to do with it.

The Cold War is a case in point. Its very nature as a "war" without direct conflict points to the acknowledgment that it would have been ludicrous to engage in a suicidal nuclear exchange. Instead, the Cold War turned into an ideological conflict largely limited to foreign skirmishes, political posturing and espionage. Nuclear weapons had the seemingly paradoxical effect of forcing the United States and the Soviet Union into an uneasy peace. The same can be said today for India and Pakistan -- two rival and nuclear-capable nations mired in a cold war of their own.

It needs to be said, therefore, that the absence of nuclear weapons would dramatically increase the likelihood of conventional wars re-emerging as military possibilities. And given the catastrophic power of today's weapons, including the introduction of robotics and AI on the battlefield, the results could be devastating, even existential in scope.

So, while the damage inflicted by a restrained conventional war would be an order of magnitude lower than that of a nuclear war, the probability of a return to conventional wars would be significantly increased. This forces us to ask some difficult questions: Is nuclear disarmament worth it if the probability of conventional war becomes ten times greater? What about a hundred times greater?

And given that nuclear weapons are more of a deterrent than a tactical option, can such a calculation even be made? If nuclear disarmament spawns x conventional wars with y casualties, how could we measure those catastrophic losses against a nuclear war that's not really supposed to happen in the first place? The value of nuclear weapons is not that they should be used, but that they should never be used.
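
To see why that calculus is so slippery, here's a toy expected-loss comparison. Every number below is an assumption invented purely for illustration, not an estimate:

```python
# Toy expected-loss comparison (all figures are invented assumptions).
p_nuclear = 0.01           # assumed annual probability of nuclear war
loss_nuclear = 300e6       # assumed deaths in an all-out exchange
p_conventional = 0.02      # assumed baseline annual probability
loss_conventional = 30e6   # assumed deaths, an order of magnitude lower
conv_multiplier = 10       # "what if conventional war gets 10x likelier?"

armed = p_nuclear * loss_nuclear + p_conventional * loss_conventional
disarmed = (p_conventional * conv_multiplier) * loss_conventional
print(f"expected annual deaths, armed world:    {armed:,.0f}")     # 3,600,000
print(f"expected annual deaths, disarmed world: {disarmed:,.0f}")  # 6,000,000
# With these made-up inputs the two worlds land within a factor of
# two of each other -- which is exactly the point: the answer hinges
# entirely on probabilities nobody actually knows.
```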

Upsetting the geopolitical balance

Today's global geopolitical structure has largely converged around the realities and constraints posed by the presence of apocalyptic weapons and by the nations who control them. Tension exists between the United States and Russia, but there are limits to how far each nation is willing to provoke the other. The same can be said for the United States' relationship with China. And as already noted, nuclear weapons may be forcing the peace between India and Pakistan (it's worth noting that conventional war between two nuclear-capable nations is akin to suicide; nuclear weapons would be used the moment one side senses defeat).

But should nuclear weapons suddenly disappear, the current geopolitical arrangement would be turned on its head. Despite its rhetoric, the United States is not a hegemonic power. We live in a de facto multi-polar geopolitical environment. Take away nuclear weapons and we get a global picture that looks startlingly similar to pre-World War I Europe.

Additionally, the elimination of nuclear weapons could act as a destabilizing force, giving some up-and-coming nation-states the idea that they could become world players. Despite United Nations sanctions against invasion, some leaders could become bolder (and even desperate) and lose their inhibitions about claiming foreign territory; nations may start to take more calculated and provocative risks -- even against those nations who used to be nuclear powers.

Today, nuclear weapons are being used to keep "rogue states" in check. It's no secret that the United States is willing to bomb (and is even actively considering bombing) Iran as it works to develop its own nuclear weapons and threaten the region, if not the United States itself (Iran will soon have intercontinental ballistic capability; the same goes for North Korea).

It can be said, therefore, that a nuclear-free world could be far more unstable and unpredictable than a world with nukes. Relinquishment could introduce us to an undesirable world in which new stresses and conflicts rival those posed by the threat of nuclear weapons.

It should be noted, however, that nuclear weapons do nothing to mitigate the threat of terrorism. MAD becomes a rather soft deterrent when "political rationality" comes into question; rationality can be a very subjective thing, as is the sense of self-preservation, particularly when nihilism and metaphysical beliefs come into play (i.e. religious fanaticism).

Nukes could still get in the wrong hands

Even in a world where nuclear weapons have been eliminated, it would not be outlandish to suggest that fringe groups, and even rogue nations, would still work to obtain the devices. The reason for doing so is obvious: a grim turn of events like this would enable them to take the rest of the world hostage.

Consequently, we can never be sure that at some point down the line, when push comes to shove, some country or terrorist group won't independently work to develop its own nuclear weapons.

Dangers of the disarmament process

Should the nuclear-capable nations of the world disarm, the process itself could lead to a number of problems. Even nuclear war.

During disarmament, for example, it's conceivable that nations would become distrustful of one another -- even to the point of complete paranoia and all-out belligerence. Countries would have to work particularly hard to show concrete evidence that they are in fact disarming. Any evidence to the contrary could severely escalate tension and thwart the process.

Some strategic thinkers have even surmised that there might be more incentive for a first strike when both sides hold only small numbers of nuclear weapons, since the attacking nation could hope to survive the conflict. It's suspected, therefore, that the final stage of disarmament, when all sides are supposed to dismantle the last of their weapons, will be an exceptionally dangerous time. Paradoxically, then, disarmament may actually increase the probability of deliberate nuclear war.

In addition, concealing a few nukes at this stage could give one nation an enormous military advantage over those nations that have been completely de-nuclearized. This is not as ridiculous as it might seem: it would be all too easy and advantageous for a nation to conceal a secret stockpile and attempt to gain political and military advantages through nuclear blackmail or attack.

Conclusion

I want to make it clear at this time that I am not opposed to nuclear disarmament.

What I am trying to do here is bring to light the challenges that such a process would bring. If we're going to do this we need to do a proper risk assessment and adjust our disarmament strategies accordingly (assuming that's even possible). I still believe that we should get rid of nuclear weapons -- it's just that our nuclear exit strategy will have to include some provisions to alleviate the potential problems I described above.

At the very least we need to dramatically reduce the number of live warheads. Having 26,000 active weapons and a stockpile the size of Mount Everest is sheer lunacy. There's no other word for it. It's a situation begging for disaster.

All this said, we must also admit that we have permanently lost our innocence. We will have to live with the nuclear threat in perpetuity, even if these weapons cease to physically exist. There will never be a complete guarantee that countries have completely disarmed themselves and that re-armament won't ever happen again in the future.

But thankfully, a permanent guarantee of disarmament is not required for this process. The longer we go without nuclear weapons, the better.

March 16, 2009

Futurist thinking at the Pentagon

Wondering how the U.S. military is planning for the future? A list of recent research articles by an internal Pentagon think tank shows where their collective head is at these days:
  • Chinese and Russian Asymmetrical Strategies for Space Dominance (2010 - 2030)
  • Changing Images Of Human Nature
  • The Future Of Undersea Warfare
  • Contradictions and Continuities: The Changing Moral Education Landscape
  • The End of Religiously Motivated Warfare: Lessons From The Puritans And Beyond
  • Information As Advertisement And Advertisement As Information
  • Role Of High Power Microwave Weapons In Future Intercontinental Conventional War
  • Europe 2025: Mounting Security Challenges Amidst Declining Competitiveness
  • Biometaphor For The Body Politic
  • Minorities in Turkish-Iranian Relations
  • Preserving American Primacy
  • Fighting A Nuclear-Armed Regional Opponent: Is Victory Possible?
  • After Next Nuclear Use
The fourth item on the list about the changing moral education landscape is extremely interesting and telling. This is the kind of research that can help the military 1) get in the head of its opponents, 2) craft effective propaganda and disinformation campaigns and 3) engage in social/memetic engineering endeavors.

I would like to know the details of the Biometaphor and Changing Images Of Human Nature articles and what the U.S. military is thinking here.

As for the nuclear-themed articles, as repugnant as those topics are, I suppose it's necessary to think about these scenarios and plan accordingly.

Via WIRED's Danger Room.

February 24, 2009

The Big Dog Pack-Bot


Via KurzweilAI: "The U.S. Army has released new footage of the BigDog robot -- a sophisticated, four-legged 'pack-bot' designed to carry 340-pound payloads across all kinds of terrain -- up or down hills, through ice, sand, snow, and dirt -- by monitoring sensors in its legs and adjusting its posture accordingly."

While this machine is very impressive, I'm not convinced of its practicality. The thing is slow, loud as hell and it has to carry its own fuel. I wonder if an actual beast-of-burden wouldn't be more effective...

January 19, 2009

Towards a 'largely robotic' battlefield

A new book will hit store shelves later this week that will be of interest to those concerned about the ongoing roboticization and de-humanizing of military technology. The book, Wired for War: The Robotics Revolution and Conflict in the 21st Century, is authored by P. W. Singer, the director of the 21st Century Defense Initiative at the Brookings Institution. He has also published Children at War (2005) and Corporate Warriors: The Rise of the Privatized Military Industry (2003).

In a recent Wilson Quarterly article, Singer makes the claim that Pentagon planners are already provisioning for battlefields that will be, as they put it, "largely robotic." It's no secret that the U.S. military is developing a variety of unmanned weapons and seemingly futuristic technologies -- everything from automated machine guns and robotic stretcher bearers to tiny but lethal robots the size of insects.

As these weapons gain more and more autonomy, deeper questions arise. Singer poses several difficult ones: "Can the new armaments reliably separate friend from foe? What laws and ethical codes apply? What are we saying when we send out unmanned machines to fight for us? What is the “message” that those on the other side receive?" And ultimately, asks Singer, how will we remain masters of weapons that are immeasurably faster and more "intelligent" than we are?

Proxy killing

A fundamental problem as Singer sees it is the ease with which killing can now take place. He cites the example of the Predator, an unmanned aerial vehicle (UAV). This propeller-powered drone is 27 feet in length, can spend up to 24 hours in the air and flies at a height of 26,000 feet. Predators are flown by "reach-back" or "remote-split" operators -- military personnel who are 7,500 miles away and who fly the planes via satellite from a set of converted single-wide trailers located mostly at Nellis and Creech Air Force bases in Nevada.

This type of operation has created a rather novel situation where "pilots" experience the psychological disconnect of being "at war" while dealing with their daily domestic routines. Singer notes the words of one Predator pilot, “You see Americans killed in front of your eyes and then have to go to a PTA meeting.” Says another, “You are going to war for 12 hours, shooting weapons at targets, directing kills on enemy combatants, and then you get in the car, drive home, and within 20 minutes you are sitting at the dinner table talking to your kids about their homework."

These days there are more than 5,300 drones in the U.S. military’s total arsenal and not a single mission happens without them. The Pentagon predicts future conflicts involving tens of thousands.

Better than humans

The appeal of robots is obvious. They don't need to be returned home in body bags after they've been shot down. Moreover, robots don't come with typical human frailties and foibles. "They don’t get hungry," says Gordon Johnson of the Pentagon’s Joint Forces Command. "They’re not afraid. They don’t forget their orders. They don’t care if the guy next to them has just been shot. Will they do a better job than humans? Yes." Johnson's comments sound eerily like the script from a Terminator movie.

And as these technologies improve, human capabilities are being increasingly pushed to their limits. Today's F-16 fighter jet can maneuver so fast and hard that its pilots black out. As a DARPA official has noted, "the human is becoming the weakest link in defense systems." Moving forward, autonomous weaponry will be increasingly used in place of humans. Eventually it will be robot versus robot -- especially when the theater of operations starts to function at technologic speed. The Pentagon is aware of this possibility, noting that "As the loop gets shorter and shorter, there won’t be any time in it for humans."

Failure to override

The inevitable question arises: Who will control robots that work autonomously and at suprahuman 'technologic' speed? There are already disturbing examples of 'failure to override' incidents -- when machines function outside of human control and can't be shut down. Today's Navy ships use the Aegis computer system, which enters into "casualty" mode when all the humans onboard are dead. In this situation the guns go into a kind of berserker mode and the computer does its best to ensure that the ship doesn't get hit. As Singer notes, "Humans can override the Aegis system in any of its modes, but experience shows that this capability is often beside the point, since people hesitate to use this power. Sometimes the consequences are tragic."

Part of the problem is that humans are starting to give intelligent systems the benefit of doubt. In many cases the human power "in the loop" was actually only veto power -- but even that is a power that military personnel are often unwilling to use against the quicker (and what they viewed as superior) judgment of a ­computer.
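
For illustration, here's a minimal sketch of that "veto power in the loop" pattern (a hypothetical design, not the actual Aegis logic): the machine commits to firing unless a human vetoes within a fixed window, so silence counts as consent, and shrinking the window squeezes the human out entirely.

```python
# Hypothetical "human veto in the loop" engagement logic -- a sketch,
# not any real weapon system's code.
import queue
import threading
import time

def engage(target: str, veto_window_s: float, vetoes: queue.Queue) -> bool:
    """Fire at `target` unless a human veto arrives within the window.

    Returns True if the weapon fired. Note the default outcome is
    firing: 'veto power' is weaker than 'approval power' because
    doing nothing lets the machine proceed.
    """
    deadline = time.monotonic() + veto_window_s
    while time.monotonic() < deadline:
        try:
            remaining = max(0.0, deadline - time.monotonic())
            if vetoes.get(timeout=remaining) == target:
                return False          # the human overrode the machine
        except queue.Empty:
            break                     # window expired with no veto
    return True                       # autonomous engagement proceeds

# Usage: with a 0.5-second window, a hesitant operator never gets
# there -- the veto below arrives at t = 2.0 s, long after the shot.
vetoes: queue.Queue = queue.Queue()
threading.Timer(2.0, vetoes.put, args=("track-042",)).start()
print("fired:", engage("track-042", veto_window_s=0.5, vetoes=vetoes))
```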

The next step in this trend is to give robots the ability to fire back on their own. As Johnson notes:
Anyone who would shoot at our forces would die. Before he can drop that weapon and run, he’s probably already dead. Well now, these cowards in Baghdad would have to pay with blood and guts every time they shot at one of our folks. The costs of poker went up significantly. The enemy, are they going to give up blood and guts to kill machines? I’m guessing not.
Johnson in particular views this as not only logical but quite attractive.

Removing the human factor


Retired Army colonel Thomas Adams believes that the speed, confusion, and information overload of modern-day war will soon move the whole process outside "human space." He predicts that future weapons will be too fast, too small and too numerous; they will create an environment that's simply too complex for humans to direct.

The Joint Forces Command is very aware of this possibility, noting that autonomous robots on the battlefield will be the norm within 20 years. Military and robotics developers predict that robots as fully capable as human soldiers will start to appear on the battlefield sometime between 2025 and 2035. This will undoubtedly mark a pivotal point in human history. The next war, claims Singer, could be fought partly by robots that respond to spoken commands in plain English and then figure out on their own how to get the job done.

When war becomes too easy

War is hell -- well, at least it's been that way in the past. Democratic governments and their citizens have had to be extremely careful about entering into costly and emotionally wrenching conflicts. But Singer now worries that unmanned systems represent the ultimate break between the public and its military:
With no draft, no need for congressional approval (the last formal declaration of war was in 1941), no tax or war bonds, and now the knowledge that the Americans at risk are mainly just American machines, the already falling bars to war may well hit the ground. A leader won’t need to do the kind of consensus building that is normally required before a war, and won’t even need to unite the country behind the effort. In turn, the public truly will become the equivalent of sports fans watching war, rather than citizens sharing in its importance.
In this kind of scenario, cheap and seemingly costless unmanned wars would carry far fewer political repercussions. Singer argues that this is a frightening prospect and that it would "pervert the whole idea of the democratic process and citizenship as they relate to war." His fear is that, when a citizenry has no sense of the horrors and true cost of war, it will choose to go to war like any other policy decision, "weighed by the same calculus used to determine whether to raise bridge tolls." Public engagement will turn to indifference and titillation over all the war-porn on YouTube.

Singer's prognosis is grim:
When technology turns war into something merely to be watched, and not weighed with great seriousness, the checks and balances that undergird democracy go by the wayside. This could well mean the end of any idea of democratic peace that supposedly sets our foreign-policy decision making apart.
We're heading down a very strange and treacherous path.

December 15, 2008

MDA's Multiple Kill Vehicle

Check out this incredible hover robot:


It's the product of the Missile Defense Agency and they call it the MKV-L (Multiple Kill Vehicle). The hover bot is meant to be used as a bundle of missile interceptors deployed by a larger carrier. Objectives of this particular test included having the MKV-L hover under its own power and prove its capability to recognize and track a surrogate target in a flight environment.

Terminator-style hunter-killer, here we come.

November 28, 2008

Saletan: How Pakistan learned to stop worrying and love the killing machines

William Saletan of Slate has been covering the progress of the drone war in Pakistan. He notes, "Pakistan has become the world's first mechanical proxy war, with unmanned aerial vehicles hunting and killing bad guys so U.S. troops don't have to." In his most recent article Saletan describes how the drones are winning:
And now for the best news: the payoff. I'm not talking about the kills: We've already proved we can kill lots of people the old-fashioned way. I'm talking about the people we don't kill: civilians. We've talked before about hover time: the drones' superior ability to stay in the air, without fatigue or risk of death, allowing them to watch the ground and identify and track targets. If that level of persistence and precision improves our ability to distinguish the bad guys from everybody else, then the bottom line isn't just kills. It is, in Clapper's words, fewer "collateral casualties." If you look back at reports from the ground, that's exactly what stands out about the recent drone attacks: We've been hitting an impressively high ratio of bad guys, especially senior bad guys, to innocents. Yes, some innocents have died. But no counterinsurgent air war has ever been this precise.
Saletan concludes by suggesting that these tactics may solve the problem of terrorist insurgency...or maybe it will create something worse.

Entire article.