UN report wants moratorium on killer robots (Update)

May 02, 2013 by Peter James Spielmann
In this undated artist's rendering provided by BAE Systems, a Taranis aircraft is shown. A new United Nations draft report posted online this week objects to the use of weapons systems like the Taranis that can attack targets without any human input. The report for the U.N. Human Rights Council deals with legal and philosophical issues involved in giving robots lethal powers over humans, echoing countless science-fiction novels and films. (AP Photo/BAE Systems)

Killer robots that can attack targets without any human input "should not have the power of life and death over human beings," a new draft U.N. report says.

The report for the U.N. Human Rights Council, posted online this week, deals with legal and philosophical issues involved in giving robots lethal powers over humans, echoing countless science-fiction novels and films. The debate dates to author Isaac Asimov's first rule for robots in the 1942 story "Runaround": "A robot may not injure a human being or, through inaction, allow a human being to come to harm."

Report author Christof Heyns, a South African professor of human rights law, calls for a worldwide moratorium on the "testing, production, assembly, transfer, acquisition, deployment and use" of killer robots until an international conference can develop rules for their use.

His findings are due to be debated at the Human Rights Council in Geneva on May 29.

According to the report, the United States, Britain, Israel, South Korea and Japan have developed various types of fully or semi-autonomous weapons.

In the report, Heyns focuses on a new generation of weapons that choose their targets and execute them. He calls them "lethal autonomous robotics," or LARs for short, and says: "Decisions over life and death in armed conflict may require compassion and intuition. Humans—while they are fallible—at least might possess these qualities, whereas robots definitely do not."

He notes the arguments of robot proponents that death-dealing autonomous weapons "will not be susceptible to some of the human shortcomings that may undermine the protection of life. Typically they would not act out of revenge, panic, anger, spite, prejudice or fear. Moreover, unless specifically programmed to do so, robots would not cause intentional suffering on civilian populations, for example through torture. Robots also do not rape."

The report goes beyond the recent debate over drone killings of al-Qaida suspects and nearby civilians who are maimed or killed in the air strikes. Drones do have human oversight. The killer robots are programmed to make autonomous decisions on the spot without orders from humans.

Heyns' report notes the increasing use of drones, which "enable those who control lethal force not to be physically present when it is deployed, but rather to activate it while sitting behind computers in faraway places, and stay out of the line of fire."

In this Feb. 4, 2011 photo released by the U.S. Navy and Northrop Grumman, the Navy's X-47B Unmanned Combat Air System Demonstration aircraft successfully completes its historic first flight at Edwards Air Force Base, Calif. The draft U.N. report also objects to the use of weapons systems like the X-47B that can attack targets without any human input. (AP Photo/Alan Radecki, Northrop Grumman, Navy)

"Lethal autonomous robotics (LARs), if added to the arsenals of States, would add a new dimension to this distancing, in that targeting decisions could be taken by the robots themselves. In addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill - and their execution," he wrote.

His report cites these examples, among others, of fully or semi-autonomous weapons that have been developed:

— The U.S. Phalanx system for Aegis-class cruisers, which automatically detects, tracks and engages anti-air warfare threats such as anti-ship missiles and aircraft.

— Israel's Harpy, a "Fire-and-Forget" autonomous weapon system designed to detect, attack and destroy radar emitters.

— Britain's Taranis jet-propelled combat drone prototype that can autonomously search, identify and locate enemies but can only engage with a target when authorized by mission command. It also can defend itself against enemy aircraft.

— The Samsung Techwin surveillance and security guard robots deployed in the demilitarized zone between North and South Korea, which detect targets through infrared sensors. They are currently operated by humans but have an "automatic mode."

Current weapons systems are supposed to have some degree of human oversight. But Heyns notes that "the power to override may in reality be limited because the decision-making processes of robots are often measured in nanoseconds and the informational basis of those decisions may not be practically accessible to the supervisor. In such circumstances humans are de facto out of the loop and the machines thus effectively constitute LARs," or killer robots.

Separately, another U.N. expert, British lawyer Ben Emmerson, is preparing a special investigation for the U.N. General Assembly this year on drone warfare and targeted killings.

His probe was requested by Pakistan, which officially opposes the use of U.S. drones on its territory as an infringement on its sovereignty but is believed to have tacitly approved some strikes in the past. Pakistani officials say the drone strikes kill many innocent civilians, a claim the U.S. rejects. The two other countries requesting the investigation were Russia and China, both permanent members of the U.N. Security Council.

In April, an alliance of activist and humanitarian groups led by Human Rights Watch launched the "Campaign to Stop Killer Robots" to push for a ban on fully autonomous weapons. The group applauded Heyns' draft report in a statement on its web site.


More information: The U.N. draft report: www.ohchr.org/Documents/HRBodi… 3/A-HRC-23-47_en.pdf




User comments: 27


axemaster
2.7 / 5 (9) May 02, 2013
I couldn't agree more. The use of LARs should be severely restricted except in special cases, for example anti-missile point defense systems such as the Phalanx.

The most critical issue here for me is the fact that LARs allow military officers to avoid responsibility for their actions. Someone might "accidentally" program LARs with settings that result in them killing hundreds of innocents, but how would you prosecute what could have been an honest mistake? There's simply no way to determine if it was intentional, or just a syntax error. Moreover, militaries could carry out slaughters on a large scale and absolve themselves of responsibility. I can just imagine them saying "the terrorists hacked us", or some other BS.

Robots can be made autonomous, but every time a weapon is used, it must be at the explicit direction of a human being. Clicking a button, pulling a trigger, whatever. There must be no ambiguity, or we are heading down an extremely dangerous path.
Claudius
1.7 / 5 (11) May 02, 2013
Drones do have human oversight.


Even with human oversight, there are reports of numerous civilian casualties, even reports of deliberately targeting rescue efforts.

Time to ban drones, killer robots, land mines, poison gas, etc. War crimes, all of them.
Frostiken
2.2 / 5 (10) May 02, 2013
Even with human oversight, there are reports of numerous civilian casualties, even reports of deliberately targeting rescue efforts.


I forgot bombs didn't kill civilians before the advent of robotics and remote control systems.
powerup1
3.5 / 5 (8) May 02, 2013
Much of this talk about "killer robots" is driven by fear inspired by sci-fi books and films and has very little to do with the reality of the situation.
Gawad
1.7 / 5 (6) May 02, 2013
@powerup1: Really? And what then is the "reality" of the situation? And while you're at it, what will the reality of the situation be in 20 years, and how are this report's findings erroneous?
Jeddy_Mctedder
1.4 / 5 (8) May 02, 2013
robotic armies are the ONLY way mankind can hope to stop requiring excessive population growth as the means toward one society's dominance over other societies.

in the future, the societies that age slowly, reproduce slowly, and build massive giant robot armies will overcome the need to reproduce fast and support the massive demands for all sorts of things that large populations require to be sustained in order to produce generations of young men capable of fighting war.

one expects that world war 3, unlike world wars 1 and 2, will not in any way be related to how many people, and how 'large' a manufacturing population, the sovereign's war machine possesses. world war 3 will also not be the caricature of nuclear winter that simpletons believe it to be. it will be a robot war, and it is already under way; we just haven't seen the fireworks quite yet. thank god for robot armies, they are civilization's only hope for successfully getting past our next singularity (world war 3).
Noodle_Naut
2.5 / 5 (11) May 02, 2013
I love sci-fi, absolutely love sci-fi, but using it to substitute for reality and affect policy decisions is utter lunacy. This is one of the strange unforeseen effects of democracy. It is little different than making decisions based on cowboy movies, Tom & Jerry cartoons or, in the past, fables.

It started with Frankenstein in 1818, and shows no signs of slowing. There were fears of technology taking jobs far earlier, but fear of it as a physical or mortal danger that the public is sold on as a genuine possibility? No. The fear of technology has crippled the advance of technology that could enhance our lives.

Fear of radiation has led to tens of thousands of deaths from food poisoning and degraded health from preservatives. It has also led to food shortages. Gamma rays could easily have been used widely to kill bacteria in our food but public fear... It is just energetic light but...

That is just one example, but there are hundreds.

The references to The Terminator are all over the web...loons.
Noodle_Naut
1.2 / 5 (6) May 02, 2013
The sanitizing of war by the removal of certain weapons just empowers politicians to exert their will over others using violence with far less risk to themselves. They want war to be clean and easy for the strong to stomp the weak.

War is ugly and should be avoided, not prettied up to enable it as an option to control others.

We need to realize that war must be removed from the options of politicians. We are too advanced and must grow beyond this; the dangers are simply too great.

We have to tell our politicians not to spend even 10% of what we are spending on the military. They were complaining that cuts might make it impossible to fight three far-flung wars at the same time. Give them the tools and they will find a use. "Defense" beyond what is needed to defend our land is Offense by default. We need to cut "Offense" spending.

Just talking about the US primarily.

Skepticus
1 / 5 (2) May 02, 2013
...that death-dealing autonomous weapons...would not act out of revenge, panic, anger, spite, prejudice or fear... Robots also do not rape."

Let's ignore the ethics for the moment and concentrate on robotics capabilities.
Just over 30 years ago, the PC was invented. Then the Internet. Today, almost all of us can't live without them. Our lives, ways of doing business, communication, and our privacy have changed forever. And yet the descendants of the unimaginative ilk of the '80s are willfully ignoring the precedents, objecting to looking ahead, and now using both to sneer at future dangerous possibilities?
Human judgments are "programmed" by acquired ethics, conventions and codes of conduct. Given an autonomous robot advanced enough, it can be sordidly programmed to do all of the above.
Imagine an endowed Boston Dynamics BigDog Ver. 6.9X or such knocking down female "enemy combatants" and doggying them. It won't be a pretty sight!
TheKnowItAll
1 / 5 (2) May 03, 2013
Ooof, I'm glad I didn't press [ENTER] on the comment I just wrote lol
Anyway, just like guns don't kill people, robots don't kill people, whether autonomous or not.
Whatever I say here has very little weight, but still, for the few that might be interested: the real problem is that the masses have become a big kindergarten mob that just whines and cries through channels that lead nowhere and never actually grows the balls to do anything serious. So what do I think about autonomous robots? I think nothing, because that is not the root of any problem we're actually facing. The "I" in iPhone is the problem, and until that is rectified we will continue to live in a selfish world where "I" dictates, which means big corporations telling you what you can and cannot do by influencing you and the law. Yes, that makes "big corporations" your adulthood mommy and the government corp. your daddy.
TheKnowItAll
1 / 5 (2) May 03, 2013
I should have probably not pressed [ENTER] on that one either :D
Aloken
1 / 5 (1) May 03, 2013
Why should we ban the one thing that can turn warfare into zero-casualty events? Not at first, but as killer robots advance, other nations will develop and build them too, and then we can have robots fight wars instead of our youth. What's so bad about that?
VendicarE
2 / 5 (8) May 03, 2013
The carrying capacity of quadrocopters is such that they can deliver a stick of dynamite to any American Republican Senator you like.

If they want war, bring them war.

The time for Revolution grows near.

antialias_physorg
3.7 / 5 (3) May 03, 2013
Not at first, but as killer robots advance other nations will develop and build them then we can have robots fight wars instead of our youth.

Because robots will certainly not balk at killing humans (soldiers or civilians). And if they're made to avoid hurting humans, then someone fighting a war would use humans to attack robots.

Also, with robots being hardier, it means the destructiveness of the weapons will have to go up. They're robots - so there's less of a hurdle to use nukes, or materially destructive agents (rubber eating bacteria, corrosive agents, etc.) which damage the ecosphere. The fallout for innocent bystanders (in terms of direct deaths or indirect casualties) will certainly increase.

Remember: most humans killed in war aren't soldiers. (E.g. in Iraq the ratio was about 30 civilians killed per dead soldier)
Protecting soldiers at the cost of getting more civilian casualties isn't worth it.

Let those die who are paid to fight/die.
Noumenon
1.5 / 5 (26) May 04, 2013
Improved military technology has led to more precise and thus more clinical war operations, not less. It makes as much sense "banning" instruments of war as it does simply banning war itself.

Obviously, the above is in reference to the USA's use of drones, not sci-fi. No matter what the USA does, the UN will find a way of being against it. They're nothing more than a group of useless morons.
Noumenon
1.3 / 5 (27) May 04, 2013
The carrying capacity of quadrocopters is such that they can deliver a stick of dynamite to any American Republican Senator you like.

If they want war, bring them war.

The time for Revolution grows near. - VendicarE


LOL, isn't delivering a stick of dynamite to those with whom you disagree a form of war? So magnify your knee-jerk response to the scale of countries or large organized forces, and you're right back to the cause of war... it's just more complicated and less understandable to PETA-esque bedwetting "anti-war" liberals. It makes me laugh that people are so "against war"... no shit, war is bad, obviously.

And as far as "revolution" goes, clowns like you have zero power. Like the OWS dolts, with zero point and zero effect on anything. That was pure comedy to guys like me.
geokstr
2.3 / 5 (9) May 04, 2013
...even reports of deliberately targeting rescue efforts.


Since the US is the only country currently with the technology to deploy drones, this is obviously a statement critical of the US.

But who are these "reports" coming from? The same Islamists who are famous for making up huge numbers of dead civilians, especially children, and staging videos to prove all the "casualties"? Some have even blamed the US or the Israelis for dead children killed by the Islamists themselves. When these "reports" get thoroughly debunked, the "unbiased", "objective" "news" media ignore them, because it doesn't fit the narrative anymore.
Humpty
1 / 5 (4) May 04, 2013
A copy-paste comment...

"We already have (single use) technologies that can automatically select their own targets. Have had for years.

I'm thinking of the spectrum of missiles, (e.g. anti radar-installation, anti tank, anti-aircraft, anti-missile, etc), mines (sea and land), and so on.

About the only difference is that "killer robots" presumably have a shorter loiter time than your average WWII mine, and if we're lucky they might struggle if there are stairs / pylons in their way."
powerup1
not rated yet May 06, 2013
@powerup1: Really? And what then is the "reality" of the situation? And while you're at it, what will the reality of the situation be in 20 years, and how are this report's findings erroneous?


If you mention "killer robots" to most people, they will think of something they have seen in the movies (it is almost a guarantee that if you see an article about robots like Petman or BigDog from Boston Dynamics, someone will make a comment about "Skynet" or some such thing) or read in a sci-fi book.

The use of the phrase "killer robots" is cheap and used to incite an emotional response, not to elicit a rational discussion about the issue.
antialias_physorg
5 / 5 (1) May 06, 2013
Improved military technology has led to more precise and thus more clinical war operations, not less.

I dunno. The numbers I find look somewhat different.

WWII: ratio of soldiers to civilians killed: about 1:3
Korea: ratio of soldiers to civilians killed: about 1:3 (hard to estimate since Chinese figures are a bit ambiguous)
Vietnam: ratio of soldiers to civilians killed: somewhere between 1:2 and 1:3 (because it is hard to distinguish combatants from non-combatants)
Iraq: ratio of soldiers to civilians killed: about 1:30

Doesn't sound all that much more 'clinical' to me the further along the timeline you go.
Gawad
not rated yet May 07, 2013
@powerup1: O.k....but you're basically just repeating yourself. Yes, writing "killer robots" is a cheap way to grab attention for your headline. Yes, it evokes a number of sci-fi images that (for now) are exactly just that: sci-fi. But saying so says little about the "reality of the situation", which you still skirt (unless that was just a cheap way to do a drive-by post). So what is the "reality of the situation"? Are you just objecting to this article's title? That we're not there yet? To the report the article is based on? Why?
Gawad
not rated yet May 07, 2013
I dunno. The numbers I find look somewhat different.


Indeed. And the fact is that you and Numie could argue for a week or more about the exact numbers, and get nowhere because of who's counting and why. Fact is, it just looks like a wash, and drones haven't demonstrated a particular ability to spare "non-combatants". And I don't think how this would impact the number of non-combatant casualties once in a conflict is even the main concern. I can imagine lots of reasons why one wouldn't want to be up against an autonomous machine opponent. They can all be summed up as involving the gross disadvantage of having an opponent that does not care about its own defeat, injury or death. But the deeper problem, IMO, is that "war is hell" on both combatants and civilians on all sides alike. And (much like the use of airstrikes) the replacement of soldiers with robots (especially if on only one side) removes a deep disincentive against engaging in war itself.
Dennis_Nilsson
not rated yet May 07, 2013
When the autonomous robot hunts you and your relatives, it's too late to change your mind.
Aloken
not rated yet May 17, 2013
Because robots will certainly not balk at killing humans (...). And if they're made to avoid hurting humans, then someone fighting a war would use humans to attack robots.

Disguising soldiers as civilians is how guerrilla warfare works and it's a huge cause of civilian casualties. I don't think anything would change there.

They're robots - so there's less of a hurdle to use nukes, or materially destructive agents (...) which damage the ecosphere. The fallout for innocent bystanders (...) will certainly increase.

Let's not forget that for now we're talking about flying drones, not bipedal Terminator-style robotic soldiers. It would be quite silly to attempt to use bacteria or corrosive agents against flying targets. There is also no need to increase the destructive potential of weapons, since all we're doing is replacing the human in the cockpit with a controller board. Then there's poor accuracy, which causes friendly fire and accidental deaths; who's more likely to miss, drones?
Manhar
not rated yet May 21, 2013
There are 1,000 times more human killers in the world than mechanical killer robots. Why is the UN human rights group not making an agenda to stop them?
antialias_physorg
5 / 5 (2) May 21, 2013
There are 1,000 times more human killers in the world than mechanical killer robots. Why is the UN human rights group not making an agenda to stop them?

Erm...because human rights groups do nothing BUT try to stop these human killers? You might want to check out a few before making such an unfounded statement.
TheGhostofOtto1923
1.7 / 5 (6) May 21, 2013
Why is the UN human rights group not making an agenda to stop them?

Erm...because human rights groups do nothing BUT try to stop these human killers? You might want to check out a few before making such an unfounded statement.
Except that usually the only way to stop them is to kill them. Before they kill you. Because that is exactly what they will do unless they are stopped.

What do they have to lose? Their families are starving.

I started reading the latest book from Dan Brown, Inferno
http://en.wikiped...n_novel)

-Interestingly, it is about the disease of overpopulation and one very clever way of treating it. But abortion already serves this purpose, no? Sadly it is just not enough.

ONE BILLION ABORTIONS over the last 100 years and their descendants to 3 and 4 generations, is not ENOUGH.

The 100s of millions of pregnancies prevented through contraception is NOT ENOUGH.