The humanitarian case against killer robots

Nov 26, 2013

Noel Sharkey, the chairman of the International Committee for Robot Arms Control, argued his case against killer robots last Friday at Northeastern University, saying that autonomous machines should not be allowed to make the decision to kill people on the battlefield.

"We're on a course toward fully automating warfare," Sharkey warned in a two-hour lecture on the political, ethical, and legal implications of robotic weapons. "Who in his right mind would automate the decision to kill?"

Later in the day, Sharkey moderated a panel discussion on drones. The panel comprised Max Abrahms, an assistant professor of political science at Northeastern; Denise Garcia, a member of ICRAC and an associate professor of political science and international affairs at Northeastern; and Patrick B. Johnson, a political scientist at the RAND Corporation, a nonprofit global policy think tank.

The two-part event—the second in a new series titled "Controversial Issues in Security Studies"—was sponsored by the Northeastern Humanities Center and the Department of Political Science. Garcia organized the program with the support of Gerard Loporto, LA'73, and his family.

Sharkey, for his part, is a preeminent expert in robotics and artificial intelligence. As a spokesperson for the Campaign to Stop Killer Robots, he traveled to Geneva earlier this month to urge the parties to the United Nations' Convention on Conventional Weapons to ban killer robots before they're developed for use on the battlefield. Fully autonomous weapons, which do not yet exist, would have the ability to select and then destroy military targets without human intervention.

The rise of the machines is a hot-button issue in Washington. In response to criticism of the administration's use of combat drones, President Obama delivered a speech at the National Defense University in May, promising that the U.S. would only use drones against a "continuing and imminent threat against the American people.

"The terrorists we are after target civilians and the from their acts of terrorism against Muslims dwarfs any estimate of civilian casualties from drone strikes," he added. "So doing nothing is not an option."

In his lecture last Friday, Sharkey laid out his argument against autonomous, Terminator-like weapons. He began by noting that their use could violate at least two principles of international humanitarian law: the principle of distinction, which holds that battlefield weapons must be able to distinguish between combatants and civilians, and the principle of proportionality, which holds that attacks on military objectives must not cause civilian harm that is excessive in relation to the anticipated military advantage.

Of the principle of proportionality, he said, "You can kill civilians provided it's proportional to direct military advantage, but that requires an awful lot of thinking and careful years of planning. We must not let robots do that under any circumstance."
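
To make concrete what "automating" that judgment would require, here is a minimal, purely hypothetical sketch of the proportionality rule as a computation. Every name and the threshold below are invented for illustration; no real targeting system is described.

```python
# A purely illustrative sketch of the proportionality principle as code.
# All names and the threshold are hypothetical; the point is that the two
# inputs have no defensible numeric source, which is Sharkey's objection.

def proportionality_check(expected_civilian_harm: float,
                          anticipated_military_advantage: float,
                          excess_threshold: float = 1.0) -> bool:
    """Naive proportionality test: civilian harm must not be excessive
    relative to the concrete military advantage anticipated."""
    if anticipated_military_advantage <= 0:
        # No military advantage: any civilian harm is excessive.
        return False
    return (expected_civilian_harm / anticipated_military_advantage
            <= excess_threshold)
```

The arithmetic is trivial; the difficulty Sharkey points to lies upstream of it: no sensor feed or database yields defensible values for either input, which is why the judgment is reserved for trained humans and years of planning.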

Sharkey also censured the CIA's use of the nation's current fleet of combat drones in countries with which the U.S. is not at war. "I would like to ask the CIA to stop killing civilians in the name of collateral damage," Sharkey pleaded. "I really don't like seeing children being killed, because there's no excuse for that whatsoever."

In his opening remarks, Stephen Flynn, the director of Northeastern's Center for Resilience Studies, articulated the difficulties of rapidly assimilating new warfare technology. "Technology always outpaces our ability to sort out what the guidelines are," he explained. "What could be tactically effective could also be strategically harmful.

"Issues of policy, technology, and morality are all in play, but they don't lend themselves to slogans or bumper stickers," he added. "We won't have effective conversations unless we delve into these issues."

In the Q-and-A session, more than a dozen students heeded Flynn's advice by asking Sharkey several tough questions. The former president of the Northeastern College Democrats asked Sharkey what students could do to stop the development of killer robots, prompting Sharkey to encourage students to start a youth movement to raise awareness of their dangers.

Another student asked Sharkey whether automating warfare would decrease the human death toll. "I don't mind protecting soldiers on the ground, but [the use of killer robots] might lead to more battles than you want to be in," he explained. "If they're increasing terrorism, then who are they really protecting?"

User comments: 17

Tristan
3 / 5 (2) Nov 26, 2013
"Who in his right mind would automate the decision to kill?"

Someone for whom a winning outcome is more important than the ethical implications of automated killing, i.e., pretty much anyone in the military who would be making the decision on whether to fund research into killer robots.

They're coming whether we like it or not; I'll bet on the determination of the military to make this a reality over the hand-wringing of a bunch of professors of political science any day of the week, regardless of my personal feelings on the matter.
kochevnik
1.4 / 5 (9) Nov 26, 2013
"Who in his right mind would automate the decision to kill?"

Obama and the CIA with drones

The USA began a "shoot anything that moves" program in Vietnam. Civilians are used as targets when enemy soldiers are absent. Laos was bombed into the stone age.
dav_daddy
1 / 5 (4) Nov 26, 2013
It wouldn't shock me if they were already in use.
TheGhostofOtto1923
2.6 / 5 (5) Nov 26, 2013
It wouldn't shock me if they were already in use.
Well, they are. Al Qaida sends them into the field all the time.

"autonomous machines should not be allowed to make the decision to kill people on the battlefield."

-I would tend to trust a machine programmed with all the parameters that any soldier would have for discerning combatants from innocents and friendlies, over an enraged, terrified, confused, and possibly wounded soldier, to make the decision whether to pull the trigger or not.

Machines will be more accurate, less distracted, more consistent, and more reliable in this endeavor than any human, and their use will thus be more humane. The people who object to 'killer robots' are mistaken in their belief that their objections will make wars harder to initiate and sustain.

Wars are inevitable given the aggressive growth rates of religionist cultures, and containing these cultures is an imperative.
VendicarE
3.7 / 5 (6) Nov 27, 2013
Uncle Sam teaches the world that there is nothing wrong with murdering people you don't like.

Mass Murder brings victory.
zaxxon451
4 / 5 (4) Nov 27, 2013

Mass Murder brings victory.


And profit$$ for the corporate-military complex.
tadchem
1 / 5 (2) Nov 27, 2013
The primary objective in war is NOT to 'kill the enemy'. It is to make the enemy yield. Sun Tzu understood this 2500 years ago.
The effectiveness of battlefield robots will be measured in their ability to terrify the enemy into fleeing or surrendering. Unfortunately they will probably become just another expensive target, like Naval ships in the Falklands War.
QuixoteJ
1 / 5 (4) Nov 27, 2013
[Otto] I would tend to trust a machine programmed with all the parameters that any soldier would have for discerning combatants from innocents and friendlies, over an enraged, terrified, confused, and possibly wounded soldier, to make the decision whether to pull the trigger or not.
It horrifies me to see that people are actually thinking like this.
TheGhostofOtto1923
1 / 5 (3) Nov 27, 2013
It horrifies me to see that people are actually thinking like this.
I'm sure all war terrifies you, yes? It terrifies most of us. You would rather ignore it and hope that it goes away. But it won't. It never does.

Every generation thinks that it is finally rational enough, educated enough, pious enough, and mature enough to avoid war. But then along comes an enemy that it absolutely has to fight, or it WILL be destroyed. Why is that do you think?

The west is the only culture which has the means to end war forever. It has achieved zero growth. But it is endangered as never before by the ancient and obsolete cultures which seek to conquer by out-reproducing their neighbors, and overrunning them.

This means that until these cultures are destroyed, war is inevitable. Unavoidable. And it is the kind of war that the west cannot afford to lose, because it threatens not only its own existence but the survival of the entire species.

Tech wins wars. It is our primary advantage.
TheGhostofOtto1923
1 / 5 (3) Nov 27, 2013
primary objective in war is NOT to 'kill the enemy'. It is to make the enemy yield. Sun Tzu understood this 2500 years ago.
The effectiveness of battlefield robots will be measured in their ability to terrify the enemy into fleeing or surrendering. Unfortunately they will probably become just another expensive target, like Naval ships in the Falklands
You've got that entirely wrong. Sun Tzu and von Clausewitz both emphasized that destroying the enemy was more important than taking ground. Sun Tzu's strategies were all based on tactics which enabled enemy forces to be overcome and DESTROYED.

Battlefield robots will be measured in their effectiveness at annihilating the enemy. Desert Storm was a victory because 100k of Saddam's forces were killed. Lined up in neat rows and carpet-bombed into mush.

The exact same tactic was used against the Taliban in northern Afghanistan and the North Koreans. Peace reigned on the peninsula for two generations until the population recovered.
Telekinetic
3 / 5 (6) Nov 27, 2013
The sock puppet "Nikolaus" belongs to TheGhostofOtto1923. Pay your income taxes, hypocrite!
TheGhostofOtto1923
1 / 5 (3) Nov 27, 2013
The sock puppet "Nikolaus" belongs to TheGhostofOtto1923. Pay your income taxes, hypocrite!
Not true, froufrou. I see you are beset with many sockpuppets, some uprating you and some downrating you. None of them are me.

As I have suggested in the past, if you would only stop posting such outrageous crap you wouldn't have to spend so much time uprating yourself to counter all the flak you get. But I understand this may not be possible for you to do. Oh well.
grpugh
not rated yet Dec 02, 2013
A lot of silly comments; I think the author is hoping to be the next Al Gore by finding a new and profitable line of work.
QuixoteJ
1 / 5 (1) Dec 03, 2013
I'm sure all war terrifies you, yes? It terrifies most of us. You would rather ignore it and hope that it goes away. But it won't. It never does.
No, what horrifies me is that you would rather have a computer (which was programmed by the lowest bidder) aiming double .50 cal machine guns at your head during a confusing hostile situation than a trained human soldier or police officer. What horrifies me most, actually, is the mentality behind that preference existing in the minds of a large number of people.

Robot fire fighters are okay, but that's about all.

To be comfortable among robots which are carrying deadly weapons should be considered a form of madness.
TheGhostofOtto1923
1 / 5 (2) Dec 03, 2013
computer (which was programmed by the lowest bidder) aiming double 50 cal machine guns at your head during a confusing hostile situation rather than a trained human soldier or police officer
That 'lowest bidder' will be designing and programming to a strict set of performance specs. Soldiers and policemen in contrast are full of all sorts of defects and human frailties. Need I link the current set of brutality and corruption trials for you? Look them up yourself.

Machines will be utterly consistent. They will not be prone to racial prejudice, rage, pain, or hatred. And the machine soldiers which finally see action will be exhaustively tested, redesigned, and improved before they are fielded.

Your objections appear any time some new tech comes along which makes fighting more efficient. Artillery, firearms, machine guns, planes, missiles - they all separate the killer from the one being killed. People like you think soldiers should be staring each other in the eye.
TheGhostofOtto1923
1 / 5 (2) Dec 03, 2013
Here's one way that drones might fight alongside soldiers, being directed against specific targets with partial autonomy:
http://m.youtube.com/watch?v=8mvOH35_uwQ&desktop_uri=%2Fwatch%3Fv%3D8mvOH35_uwQ
QuixoteJ
not rated yet Dec 05, 2013
Otto, I think your faith in killer robots (or more accurately, in those who manufacture or control them) is misplaced, that's all. You are obviously a very wise person, as is evidenced by many of your posts, but I would think that same wise quality would make you wary of autonomous gun-toting robots. I just don't get it. Handing over life-and-death war decisions to robots (and indirectly, to those who made them, who are not themselves soldiers or personally at risk in the moment) seems like a bad idea to me for a vast number of reasons.

People like you think soldiers should be staring each other in the eye.
Absolutely. And I believe any real soldier would welcome that.