Psychobot

It is interesting… how weapons reflect the soul of their maker.

                  Don DeLillo, Underworld

Singularity or something far short of it, the very real revolution in artificial intelligence and robotics is already encroaching on aspects of the human condition that are as old as our history. Robotics is indeed changing the nature of work, and is likely to continue to do so throughout this century and beyond. But, as in most technological revolutions, the impact of change is felt first and foremost in the field of war.

In 2012, IEET Fellow Patrick Lin published a fascinating article in The Atlantic about a discussion he had at the CIA concerning the implications of the robotics revolution. The use of robots in war raises all kinds of questions in the area of just-war theory that have barely begun to be addressed. An assumption throughout Lin’s article is that robots are likely to make war more, not less, ethical, since robots can be programmed never to target civilians, or never to cross the thin line that separates interrogation from torture.

This idea, that the application of robots to war could ultimately take some of the nastier parts of the human condition out of the calculus of warfare, is also touched upon from the same perspective in Peter Singer’s Wired for War. There, Singer brings up the case of Steven Green, a US soldier charged with the premeditated rape and murder of a 14-year-old Iraqi girl. Singer contrasts the young soldier “swirling with hormones” with the calm calculations of a robot lacking such sexual and murderous instincts.

The problem with this interpretation of Green is that it relies on an outdated understanding of how the brain works. As I’ll try to show, Green is really more like a robot soldier than most human beings are.

Lin and Singer’s idea of the “good robot” as a replacement for the “bad soldier” is based on an understanding of the nature of moral behavior that can be traced, like most things in Western civilization, back to Plato. In Plato’s conception, the godly part of human nature, its reason, was seen as a charioteer tasked with guiding the chaotic human passions. People did bad things whenever reason lost control. The idea was updated by Freud with his id (instincts), ego (self), and super-ego (social conscience). The thing is, this version of why human beings act morally or immorally is most certainly wrong.

The neuroscience writer Jonah Lehrer devotes a chapter of his How We Decide, “The Moral Mind,” to this very topic. The odd thing is that the normal soldier does not want to kill anybody, not even enemy combatants. Lehrer cites a study of thousands of American soldiers after WWII conducted by U.S. Army Brigadier General S.L.A. Marshall.

His shocking conclusion was that less than 20 percent actually shot at the enemy, even when under attack. “It is fear of killing,” Marshall wrote, “rather than fear of being killed, that is the most common cause of battle failure of the individual.” When soldiers were forced to directly confront the possibility of harming another human being, a personal moral decision, they were literally incapacitated by their emotions. “At the most vital point of battle,” Marshall wrote, “the soldier becomes a conscientious objector.”

After this study was published, the Army redesigned its training to reduce this natural moral impediment to battlefield effectiveness. “What was being taught in this environment is the ability to shoot reflexively and instantly… Soldiers are de-sensitized to the act of killing until it becomes an automatic response.” (pp. 179-180)

Lehrer, of course, has been discredited as a result of plagiarism scandals, so we should accept his ideas with caution. Yet they do point to what we already know: the existential condition of war is that it is difficult for human beings to kill one another, and well it should be. If modern training methods are meant to remove this obstruction in the name of combat effectiveness, they also remove the soldier from the actual moral reality of war. This moral reality is the reason why wars should be fought infrequently and only under the most extreme of circumstances. We should be willing to kill other human beings only under the most threatening and limited of conditions.

The designers of robot warriors are unlikely to program this moral struggle with killing into their machines. Such machines will kill, or not kill, a fellow sentient being as they are programmed to do. They will be truly amoral in nature, or, to use a loaded and antiquated term, without a soul.

We could certainly program robots with ethical rules of war, as Singer and Lin suggest. These robots would be less likely to kill the innocent in the fear and haste of the fog of war. It is impossible to imagine that robots would commit the horrible crime of rape, which is far too common in war. All of these are good things. The question for the farther future is: how would a machine with human or supra-human intelligence experience war? What would its moral and existential reality of war be, compared to how the most highly sentient creatures today, human beings, experience combat?
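To see concretely what “programming robots with the ethical rules of war” might amount to, consider a toy sketch in Python. Every name in it is invented purely for illustration; no actual weapons system is being described. The point is the one I am arguing: such “ethics” reduces to a chain of vetoes, evaluated without hesitation or feeling.

```python
# Purely illustrative sketch: rules of engagement reduced to code.
# All names here are hypothetical; this describes no real system.

from dataclasses import dataclass

@dataclass
class Contact:
    is_combatant: bool         # a classification output, however derived
    is_surrendering: bool      # hors de combat under the laws of war
    near_protected_site: bool  # e.g., a hospital or school

def engagement_permitted(c: Contact) -> bool:
    """A rules-of-engagement check as a chain of vetoes.

    Note what is absent: hesitation, sympathy, any felt weight
    to the decision. The 'ethics' is a boolean gate.
    """
    if not c.is_combatant:
        return False  # never target civilians
    if c.is_surrendering:
        return False  # never target those hors de combat
    if c.near_protected_site:
        return False  # avoid protected sites
    return True       # all vetoes passed; firing is 'lawful'

print(engagement_permitted(Contact(True, False, False)))  # True
```

Whatever real systems look like, rule-following of this kind is just a predicate that passes or fails. Nothing in it corresponds to Marshall’s soldier who freezes at the trigger.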

Singer’s portrayal of Steven Green as a flawed human being whose “hormones” have overwhelmed his reason, and who is therefore ethically inferior to a cold artificial intelligence with no such passions to control, is telling, and again rests on the flawed Plato/Freud model of human conscience. A clear way to see this is by looking inside the mind of the rapist and murderer Green, who, before he committed his crime, was quoted in the Washington Post as saying:

I came over here because I wanted to kill people…

I shot a guy here when we were out at a traffic checkpoint, and it was like nothing. Over here killing people is like squashing an ant. I mean you kill somebody and it’s like ‘All right, let’s go get some pizza’.

In other words, Green is a psychopath.

Again we can turn to Lehrer, who writes of the serial killer John Wayne Gacy:

According to the court-appointed psychiatrist, Gacy seemed incapable of experiencing regret, sadness, or joy. Instead his inner life consisted entirely of sexual impulses and ruthless rationality. (p. 169)

It is not the presence of out-of-control emotions that explains the psychopath, but the very absence of emotion. Psychopaths are unmoved by the very sympathy that makes it difficult for normal soldiers to kill. Unlike other human beings, they show no emotional response when shown depictions of violence. In fact, they are unmoved by emotions at all. For them, there are simply “goals” (set by biology or the environment) that they want to achieve. The means to those goals, including murder, are, for them, irrelevant. Lehrer quotes G.K. Chesterton:

The madman is not the man who has lost his reason. The madman is the man who has lost everything except his reason.

Whatever the timeline, we are in the process of creating sentient beings who will kill other sentient beings, human and machine, without anger, guilt, or fear. I see no easy way out of this dilemma, for the very selective pressures of war appear to be weighted against programming such moral qualities (as opposed to rules for whom and when to kill) into our machines. Rather than ushering in an era of “humane” warfare, on the existential level, that is, in the minds of the beings actually doing the fighting, the moral dimension of war will be relentlessly suppressed. We will have created what is, in effect, an army of psychopaths.

Of drones and democracy

P.W. Singer has a fascinating article in last Sunday’s edition of the New York Times about the implications of the current robotics revolution in warfare for democracy.

Singer’s reputation as someone who asks the hard questions about war (and asks them first) was gained in his studies of the rise of privatized warfare and the increasing use of children in warfare. On both of those issues, Singer was among the first to correctly identify disturbing trends. His perspective shows little ideological bent, merely the search for truth. On these issues he was almost a lone voice crying in the wilderness, as he is now in bringing to our attention the troubling questions raised as the revolution in computer technology and robotics is increasingly applied to war. His goal, as always, appears to be to reveal the obvious trends in front of us to which we are blind, and in doing so to start a conversation we should already be having.

I think a short review of Singer’s Wired for War is in order, so that his New York Times article can be seen in its full context. In that book Singer takes readers on a wild ride through the current robotics revolution in warfare. He sees our era as akin to WWI, when new technologies like the tank and the airplane were thrown into the field before anyone yet knew how to actually use them. (224)

The military now funds over 80% of American spending on artificial intelligence. Much of this funding flows through the Defense Advanced Research Projects Agency (DARPA). Joel Garreau of the Washington Post says the mission of DARPA “is to accelerate the future into being”. (78)

Not only the US Army but also the Navy and the Air Force (which wants 45% of its bomber fleet to be composed of unmanned vehicles) are rushing to develop unmanned systems. Some of these systems seem straight out of science fiction, whether swarming insect-sized robots or super-sized planes able to stay aloft for months or even years. (117-118)

Many of the new robots resemble animals, such as a robotic dog.

What does all of this have to do with democracy?

In his article, Singer makes the point that the US increasingly relies on its advantage in military technology, the most famous example of which is the unmanned drone, to wage what is in effect a constant war, without the democratic oversight and control that is supposed to be the job of our elected representatives in Congress. With the biggest cost of war, the lives of citizens, no longer at risk, the American government can strike terrorist targets throughout the world with seeming impunity.

As a prime example, he cites the recent US military action in Libya.

Starting on April 23, American unmanned systems were deployed over Libya. For the next six months, they carried out at least 146 strikes on their own. They also identified and pinpointed the targets for most of NATO’s manned strike jets. This unmanned operation lasted well past the 60-day deadline of the War Powers Resolution, extending to the very last airstrike that hit Colonel Qaddafi’s convoy on Oct. 20 and led to his death.

Choosing to make the operation unmanned proved critical to initiating it without Congressional authorization and continuing it with minimal public support. On June 21, when NATO’s air war was lagging, an American Navy helicopter was shot down by pro-Qaddafi forces. This previously would have been a disaster, with the risk of an American aircrew being captured or even killed. But the downed helicopter was an unmanned Fire Scout, and the story didn’t even make the newspapers the next day.

Singer’s conclusion:

We must now accept that technologies that remove humans from the battlefield, from unmanned systems like the Predator to cyberweapons like the Stuxnet computer worm, are becoming the new normal in war.

And like it or not, the new standard we’ve established for them is that presidents need to seek approval only for operations that send people into harm’s way — not for those that involve waging war by other means.

Without any actual political debate, we have set an enormous precedent, blurring the civilian and military roles in war and circumventing the Constitution’s mandate for authorizing it. Freeing the executive branch to act as it chooses may be appealing to some now, but many future scenarios will be less clear-cut. And each political party will very likely have a different view, depending on who is in the White House.

America’s founding fathers may not have been able to imagine robotic drones, but they did provide an answer. The Constitution did not leave war, no matter how it is waged, to the executive branch alone.

In a democracy, it is an issue for all of us.