Ban 'killer robots,' rights group urges

Nov 19, 2012 by Sebastian Smith
The US Navy's Northrop Grumman X47B, a demonstration unmanned combat air vehicle (UCAV), is seen on display in July 2012 at Naval Air Station Patuxent River, Maryland.

Hollywood-style robots able to shoot people without permission from their human handlers are a real possibility and must be banned before governments start deploying them, campaigners warned Monday.

The report "Losing Humanity"—issued by Human Rights Watch and Harvard Law School's International Human Rights Clinic—raised the alarm over the ethics of the looming technology.

Calling them "killer robots," the report urged "an international treaty that would absolutely prohibit the development, production, and use of fully autonomous weapons."

The US military already leads the way in battlefield robotics, notably with the unmanned aircraft, or drones, used for surveillance and attacks over Pakistan, Afghanistan, Yemen and elsewhere. But these are controlled by human operators at ground bases and cannot kill without authorization.

Fully autonomous robots that decide for themselves when to fire could be developed within 20 to 30 years, or "even sooner," the 50-page report said, adding that weapon systems that require little human intervention already exist.

Raytheon's Phalanx gun system, deployed on US Navy ships, can search for enemy fire and destroy incoming projectiles all by itself. The X47B is a plane-sized drone able to take off and land on aircraft carriers without a pilot and even refuel in the air.

Perhaps closest to the Terminator-type killing machine portrayed in Arnold Schwarzenegger's action films is a Samsung sentry robot already being used in South Korea, with the ability to spot unusual activity, talk to intruders and, when authorized by a human controller, shoot them.

Fully autonomous fighting machines would spare human troops from dangerous situations. The downside, though, is that robots would then be left to make nuanced decisions on their own, the most fraught being the need to distinguish between civilians and combatants in a war zone.

"A number of governments, including the United States, are very excited about moving in this direction, very excited about taking the soldier off the battlefield and putting machines on the battlefield and thereby lowering casualties," said Steve Goose, arms division director at Human Rights Watch.

While Goose said "killer robots" do not exist as yet, he warned of precursors and added that the best way to forestall an ethical nightmare is a "preemptive, comprehensive prohibition on the development or production of these systems."

Jody Williams, the 1997 Nobel Peace Prize laureate, said in Washington that the prospect of killer robots "totally freaked me out."

"I had visions of the Terminator," she said. "The thought that this development was proceeding without any public discussion I found more reprehensible than most military R&D because I really believe that this would... totally transform the face of warfare."

The problem with handing over decision-making power to even the most sophisticated robots is that there would be no clear way of making anyone answer for the inevitable mistakes, said Noel Sharkey, professor of robotics at the University of Sheffield.

"If a robot goes wrong, who's accountable? It certainly won't be the robot," he said.

"The robot could take a bullet in its computer and go berserk. So there's no way of really determining who's accountable and that's very important for the laws of war."

User comments : 53


antialias_physorg
2.7 / 5 (13) Nov 19, 2012
The report "Losing Humanity"—co-produced by Harvard Law School's International Human Rights Clinic—also raises the alarm over the ethics of the looming technology.

I hate to be picky: but humanity was lost way before robots could take a shot on their own.

Humanity was already lost the moment someone thought about killing someone else.
Just_some_guy
1.6 / 5 (7) Nov 19, 2012
Humanity was already lost the moment someone thought about killing someone else.

Sad, but true!
LariAnn
2.4 / 5 (10) Nov 19, 2012
IMO, the military won't be able to resist the desire to develop autonomous robots, regardless of whatever laws are passed. Many laws protecting civilians are violated routinely in wartime so what's the big deal about violating just one more? The real trouble will start when autonomous robots begin to reprogram themselves based on real-time learning algorithms and then reevaluate what an "enemy" is. Once that happens, we may be on the way to the "Terminator" scenario.

Then again, I'm afraid that Human Rights Watch may be a day late and a dollar short. Whatever the military is showing us now is about 40 years behind what they have actually developed. So by that reasoning, they already have autonomous robots being field-tested in clandestine locations. By the time we know about them publicly, the military will have thousands of them ready to deploy.
lengould100
2.2 / 5 (13) Nov 19, 2012
One additional factor not discussed: the only limiting factor on governments' use of military power to achieve ends is their population's refusal to accept military casualties (the US in the Vietnam War is a good example). Give governments robotic soldiers, and what might then happen?
Blakut
4.2 / 5 (6) Nov 19, 2012
Oh really, antialias, this means humanity never existed...
HTK
2.1 / 5 (15) Nov 19, 2012
It is the FUTURE.

It cannot, should not, will not be stopped.

It is a natural part of mankind's advancement into the future and beyond.
moebiex
2.9 / 5 (8) Nov 19, 2012
How about if they were tasked only to destroy weapons and not harm the fighters - would that not avoid most of the ethical issues? Rapid removal of guns, rockets, RPGs and even transports from both sides would allow for fast, methodical de-escalation of many conflict situations and allow legal types access to enforce rule of law.
Strap
2.4 / 5 (9) Nov 19, 2012
As advancement goes, in 20-30 years, I would trust a robot to make a quick judgment call on the battlefield before most humans. Good programming and engineering safeguards will be to blame when the technology makes mistakes. As it stands now, we let people with damage to their brains go berserk and kill a village of women and kids. No programming or engineering can safeguard against that.

I would go as far as to argue that autonomous Doctors, Police, Soldiers, and even Teachers or Nannies will be better than their human counterparts. Autonomous robotics may very well save many innocent lives from unstable or unable humans.
Joker23
2.4 / 5 (15) Nov 19, 2012
That the recent Nobel Peace Prize recipient was "freaked out" is a clue to the mentality of these people. Yasser Arafat, if you recall, was a recipient of the "Peace Prize"... great choice... spare me the platitudes of a group of ultra left wing academics that haven't a clue about the world other than through a book funded by some other left wing, anti-war group. What, pray tell, does this kind of nonsense have to do with physics? Don't go the route of "Scientific American", hijacked by a group of whiney anti-everything editors reducing a good magazine to little more than Popular Science... not popular... not science...
TheKnowItAll
1.6 / 5 (7) Nov 20, 2012
Whether we lost humanity or never had it is a matter of personal opinion and I am sure that it does not reflect the majority's opinions. It certainly doesn't mean we shouldn't strive for it as a whole. Evolution is more than just advancement in software and robotics. We must evolve in every aspect as equally as possible as to not end up diminishing ourselves. I think we should be concerned about this possible invention in progress and we should question it and understand its ramifications as fully as possible. We must question the source and discover what they are really up to. If it proves to be true then we can thank the ones who brought it to our attention and from there we can start thinking about ethics. We do all have a say in what happens to us, it's up to every single one of us to claim that right by saying something constructive. You are all curious and intelligent enough to be reading such a journal so I have no doubt that your final inferences will be noble. :)
Caliban
3 / 5 (6) Nov 20, 2012
Fully autonomous fighting machines would spare human troops from dangerous situations. The downside, though, is that robots would then be left to make nuanced decisions on their own, the most fraught being the need to distinguish between civilians and combatants in a war zone.
"A number of governments, including the United States, are very excited about moving in this direction, very excited about taking the soldier off the battlefield and putting machines on the battlefield and thereby lowering casualties," said Steve Goose, arms division director at Human Rights Watch.


Bollocks.

The real reason for the development of these machines is two-fold:

First, they represent a very large and ongoing investment of Taxpayer dollars to the military contractors producing them (Military-Industrial Complex, anyone?). To this could easily be added the Financial Industry (Raytheon shares, e.g.) to complete the more nuanced picture, or the Military-Financial-Industrial Complex.
contd.
Caliban
3 / 5 (6) Nov 20, 2012
contd.

Second, there is no chance that these machines will be made completely autonomous, except as part of a "MAD" type end game final instruction.
To the contrary, these machines will be made to operate with commands that will include explicit orders to eradicate ALL humans in any given area --be they armed combatants or grannies in rocking chairs.

Why?

So that they may be turned upon the citizenry of the very countries they were built to "defend", of course.

In case you haven't noticed, we are way past the point of fighting any "Good War".

Wars are now fought to control access to resources. Unfortunately, these resources are very inequitably distributed.

Thus the need for conscienceless "combat" robots that kill on command.



antialias_physorg
3 / 5 (6) Nov 20, 2012
Oh really, antialias, this means humanity never existed...

Not in some people. No.

Whether we lost humanity or never had it is of personal opinion and I am sure that it does not reflect the majority's opinions.

Since the majority are those who suffer in war, I'd bet you're wrong.

It cannot, should not, will not be stopped.

I don't know about the 'should not'. But on the rest I, unfortunately, have to agree. There's no way that this will NOT become a reality. I just hope someone hacks the first nuclear armed drones and returns them to the sender (preferably the seat of government). THAT would engender a ban in no time.
Oysteroid
1.4 / 5 (9) Nov 20, 2012
But we have actually had autonomous killing machines (robots) ever since the first traps were invented. Take a crossbow, hide it in the bushes, make it fire by trip wire across the path - bingo! A fully autonomous killing machine. Low tech, sure. But as a matter of principle - we are just talking technology and tools here. Replace crossbow with a gatling gun and trip wire with a laser detector/sights. Put it on wheels for easy maneuverability... there you go.
EBENEZR
1 / 5 (5) Nov 20, 2012
I hate to be picky: but humanity was lost way before robots could take a shot on their own.

Humanity was already lost the moment someone thought about killing someone else.


Why stop at humans? I'm sure we're not the only animals to have premeditated a lethal attack.
TheKnowItAll
1.8 / 5 (9) Nov 20, 2012
Come on now. Have more pride in yourself and don't compare yourself to savages of the past or even worse to animals. Our goal is to rise above that and we shall, so don't be the one left behind. We did not evolve with a more complex brain for no good reason and we don't rule this Earth for no good reason either. Let's not regress but rather progress. What people did in the past is what they did and shall not be what we do now as we have evolved. Be a person of this era and say what you are and don't compare yourself with your primitive ancestors. Be proud of what we've become and fantasize about what we will become.
philw1776
1 / 5 (4) Nov 20, 2012
Jody Williams, the 1997 Nobel Peace Prize laureate, said in Washington that the prospect of killer robots "totally freaked me out."

Apparently the US Nobel Peace prize winner likes his robotic drones much more than even his predecessor did.
Szkeptik
3.3 / 5 (6) Nov 20, 2012
I think the Human Rights Watch has it all backwards.
Robots don't have feelings, so most warcrimes could be prevented by employing them instead of soldiers.
A robot can't feel anger, doesn't want revenge and certainly won't kill unarmed civilians just for the heck of it.

If anything Human rights groups should be funding these developments.
88HUX88
2.3 / 5 (3) Nov 20, 2012
ED-209's behaviour in Paul Verhoeven's documentary on this topic should be a salutary lesson; it's bad enough they are flying remote drones, which dehumanizes the act of killing, without actually dehumanizing the act of killing.
Ventilator
1 / 5 (2) Nov 20, 2012
The most effective way for a shoot-don't-shoot situation to play out safely is to be as well informed as possible. Warfighters seem to have the training to deal with this issue.

Robots, once they become trainable, are sentient. At that point, their individual choices become a source of concern for us.

Star Wars R2-D2 compared with IG-88, in essence. One is more likely than the other, long term, if we go this route.

R2 seems nice enough, but wow did he curse a great deal.
omatwankr
1 / 5 (2) Nov 20, 2012
http://librivox.o...heckley/
In Watchbird, the question "can machines solve human problems?" is answered with a resounding YES! But there may be a few unforeseen glitches.
antialias_physorg
3 / 5 (2) Nov 21, 2012
A robot can't feel anger, doesn't want revenge and certainly won't kill unarmed civilians just for the heck of it.

But it has a hard time telling a civilian from a non-civilian. Soldiers don't tend to advertise themselves.

So either it's programmed to "not shoot when in doubt" (which makes it easily fooled and basically useless on a field of battle where a disguised combatant can just walk up to it and disable it)... or "shoot when in doubt", which will rack up civilian casualties in no time.

The problem isn't the clear-cut cases of "IFF on target is active or not". The problem is the borderline cases. And war/combat consists of nothing but borderline cases (except in computer games).
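The trade-off described here reduces to where you set a confidence threshold on an imperfect classifier. A toy sketch (purely illustrative; the function and thresholds are hypothetical, not from any real weapon system):

```python
# Hypothetical illustration of the "shoot when in doubt" vs.
# "don't shoot when in doubt" policy choice: both are just
# different thresholds applied to the same uncertain estimate.

def engagement_decision(combatant_confidence: float, threshold: float) -> str:
    """Return 'engage' or 'hold' given a classifier confidence in [0, 1]."""
    return "engage" if combatant_confidence >= threshold else "hold"

# Three ambiguous-to-clear targets, scored by an imperfect classifier.
scores = (0.2, 0.6, 0.97)

# A cautious policy (high threshold) holds fire on ambiguous targets;
# an aggressive policy (low threshold) engages them.
cautious = [engagement_decision(c, threshold=0.95) for c in scores]
aggressive = [engagement_decision(c, threshold=0.40) for c in scores]
```

With these made-up numbers the cautious policy engages only the 0.97 target while the aggressive one also engages the ambiguous 0.6 case, which is the commenter's point: neither threshold escapes the borderline cases, it only chooses which kind of error to make.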
TheGhostofOtto1923
3.2 / 5 (18) Nov 21, 2012
I hate to be picky: but humanity was lost way before robots could take a shot on their own.

Humanity was already lost the moment someone thought about killing someone else.
AA you're a human-hater -? No wait, since all animals kill to survive, you are against all of life. A Dalek. Saberhagen's Berserker.

Humans are the only species with a chance of enduring without killing one another. All we have to do is learn to live within our means. Restrict growth. Prevent the damage to brains and bodies which compels us to want more than we need. Science can do this for us. We can do this for ourselves.

This is a much healthier outlook dont you think?

A US drone was recently shot at by Iran. If this plane was manned and had been shot down, we could now be at war. Robots offer to save lives and REDUCE the chance of conflict.
antialias_physorg
2.7 / 5 (3) Nov 21, 2012
AA you're a human-hater -?

Only those humans who don't have any humanity in them (e.g. those who think that killing is a viable option in anything but desparate self-defence).

Humans are the only species with a chance of enduring without killing one another.

I'm not sure if you're aware of the hundreds of millions of years of history in which a lot of species (all non-predator species) have managed to not kill members of their own species - and still survive? Growth/sustainability is not the reason humans have gone to war for the past thousand years or so (only as PR, like 'Lebensraum' - but that was never the real reason)

This is a much healthier outlook dont you think?

I agree. But it has no impact on the issues of war/killing (and hasn't had for over a millennium).

If this plane was manned and had been shot down, we could now be at war.

War is decided by vested interests. If such interests don't exist then there would be no war.
dacarls
not rated yet Nov 21, 2012
Yes, autonomous robots are great- as long as you control where it works. Since it does not care, when it is after YOU, your wife and kids, then that's DIFFERENT, right? If you are the subject Mexican wetback, or a pregnant female slave carrying drugs across the border, then that's ok? But when the Mexican drug cartels control autonomous flying killer bots on the border targeting US citizens, then what? I say "Ban the Terminators".
TheGhostofOtto1923
3.3 / 5 (19) Nov 21, 2012
Only those humans who don't have any humanity in them (e.g. those who think that killing is a viable option in anything but desparate self-defence).
We learned how to eliminate those attritive elements which naturally kept our numbers in check. As a result we consistently exceeded the carrying capacity of our habitats. This made man the enemy of man, tribe the enemy of tribe. Life was constant desparate [sic] self-preservation.

We killed, and kill still, to keep our children from starving. How many times do I have to repeat the obvious before you acknowledge it?? We had no choice. We STILL have no choice. But we're close.
Lurker2358
1.7 / 5 (6) Nov 22, 2012
Wars are now fought to control access to resources. Unfortunately, these resources are very inequitably distributed.


That's laughable.

If we actually fight wars to "control resources" then we've done a horrible job of it in my life time, as the U.S. typically does nothing with the "resources" and then hands full control and sovereignty right back to the same groups who caused trouble in the first place.

If modern wars were actually fought to control resources, the battles and long term strategies would look a LOT more like a game of Starcraft than the idiotic CRAP we do with our weapons and soldiers now when we watch wars live on television.

if you want to control resources you have to:

1, eradicate opposition

2, Occupy land

We've done neither of those things in my life time, in any of the "policing" campaigns or other wars we've been involved in.

In fact, to this day, the west gives aid money to Pakistan and other nations that funnel it directly to terrorists.
Lurker2358
2 / 5 (8) Nov 22, 2012
We had no choice. We STILL have no choice. But we're close.


Completely ridiculous.

There's almost never been a war fought over a legitimate food need in the past 2500 to 3000 years.

Almost every war that's ever been fought was over some stupidity such as just plain greed of some noble or king, genocide just for the sake of it, or religion.

The two primary causes of war in contemporary times continues to be genocide (especially anti-semitism) and religion (especially Islam).

Up until the past 200 years, the amount of arable land, hunting grounds, and fisheries vastly outstripped the needs of humans. People could just step into their back yard and shoot a deer (or buffalo, etc.) and voila, meat. Of course they had cattle, obviously, but you didn't always "need" to take a cow for meat because you could take the deer that was stealing your veggies.

In some ways, people actually work harder to live in the modern world than they did 200 years ago. In others, the reverse is true.
Eric_B
5 / 5 (2) Nov 22, 2012
I got a 1979 Pinto with GPS and 4G and solid rubber tires and side-mounted street sweeper automatic shotguns and rear mounted flame throwers and auto destruct AmPho packs and armor plating i welded myself and it has a caltrop dropper and even an oil slick when the engine blows and

are there any coders on here? i need help with the navigation and targeting software so i can get set to let her loose.
Eric_B
5 / 5 (3) Nov 22, 2012
and, i for one, welcome our robot overlords.
Mako10
not rated yet Nov 22, 2012
When I watched a Russian gun fanatic on YouTube attach a machine gun to his personal UAV I was horrified. I couldn't help but think that someday people will be murdered by psychos flying "killer UAVs" from miles away.
FrankHerbert
1.7 / 5 (6) Nov 23, 2012
I'm okay with killer robots, if and only if liberals are in control of them.
antialias_physorg
3.7 / 5 (3) Nov 23, 2012
The two primary causes of war in contemporary times continues to be genocide (especially anti-semitism) and religion (especially Islam).

I'm not sure you have been following the news lately (i.e. any time in the last century). But wars have always been fought because of greed (e.g. cheap access to resources). Even in the 1400's religion (or nowadays religion or antisemitism) is just a cover to get the people behind the 'cause'. Even the crusades were fought for money - not religion.

However, it's hard to get people into a warring mood when you truly tell them: "I want you to go suffer and die so that my bank account gets fat"... but that's what it has always boiled down to. So you have to appeal to something else (and unthinking knee-jerk emotions, like religion, are perfect for that)
TheGhostofOtto1923
3.4 / 5 (17) Nov 23, 2012
greed of some noble or king, genocide just for the sake of it, or religion.
Well of course you would fall for that. Interesting though that as a religionist you would acknowledge that religion is at least as efficient an excuse for war as anything else, while we would expect that as the source of morality they would be least capable of it.
I'm not sure you have been following the news lately (i.e. any time in the last century)
The EQUATION - pops grow faster than the resources they need - is still VALID. You think that because it is never cited as the reason to fight, means it doesn't apply??

Look at Gaza, Somalia, Afghanistan. Pops double every 16yrs. Look at the throngs in the streets. They are ANGRY because there are TOO MANY of them.

These conditions have ALWAYS been inevitable. Predictable. Dependable. And each tech breakthrough exacerbates the problem. Given this, Leaders would have to be suicidal to allow wars to happen by themselves.
TheGhostofOtto1923
3.4 / 5 (18) Nov 23, 2012
1400's religion (or nowadays religion or antisemitism) is just a cover to get the people behind the 'cause'. Even the crusades were fought for money - not religion.
You acknowledge the unique facility of religion to compel the people to fight, and still not see that wars are Planned and Staged to protect and not imperil?

The People who conduct these Constructive wars know full well that wealth - gold, property, resources - would be worthless if the system which guaranteed their value and knew how to utilize them, were to collapse.

They consider Knowledge as wealth. The most valuable thing that civilization owns is its vast trove of knowledge. This has been lost in the past, and wildfire wars fueled by exploding populations are its greatest threat.

And so wars must be waged in order to PROTECT it and ENHANCE it. And you and lurker and the throngs will be told whatever is necessary for you to accept it, and participate. Religions have always been most useful in doing this.
mountain_team_guy
1 / 5 (5) Nov 23, 2012
Obviously, war has pushed the evolution of man's technology for thousands of years. Whether or not you believe technology has benefited man, we spend much less effort on the basic necessities. Along with increasing the lethality of weapons, technology has changed the shape of combat forces from massive dense infantry formations clashing hand to hand to vastly complex systems of organizations, platforms, and weapons. The forces engaging in actual combat are now dwarfed by the support networks stretching around the world. Civilians on the battlefield are a growing issue as the battlefield itself spreads with the increasing range of platforms and weapons systems. But one thing has held true since the advent of artillery, armor, air power, and naval forces. Everyone else's job is to support the infantryman. All the technology does nothing unless there is a soldier able to occupy territory. Therefore a guerrilla hiding within a civilian populace wins by denying that ability to his enemy.
mountain_team_guy
1 / 5 (6) Nov 23, 2012
The mission of robotic fighting systems on the battlefield will be to support the infantryman. The most powerful weapon on the battlefield is the human mind. Until robotic intelligence reaches par with human, robots may save lives performing routine tasks in support of human missions, but will lack survivability if tasked to occupy or patrol terrain independently. Even an uneducated human can thwart the best trained and equipped human with regularity on the modern battlefield. Regardless of how robots kill humans, humans will certainly fall under the guns of robotic systems because they will increase the survivability of the human operators. War is Hell, but any battle in history looked the same for its losers. Trying to emasculate thinking fighting robots with international conventions is a joke. You might as well try to outlaw airplanes that drop bombs.
tadchem
not rated yet Nov 23, 2012
We should be at least as concerned about the availability of killer robots to NGOs such as Al Qaeda and the Aryan Nation.
zz6549
1.5 / 5 (2) Nov 23, 2012
The robot could take a bullet in its computer and go berserk


And they let this guy teach robotics? The whole idea of robots "going berserk" as a result of being damaged is solely in the domain of Hollywood. A complex system that gets damaged will stop working, not start killing humans.

Anyways, it's important to be careful with giving robots the ability to autonomously attack human beings, but we shouldn't exclude the possibility.
Vendicar Dickarian
2 / 5 (4) Nov 23, 2012
Yes, autonomous robots are great- as long as you control where it works. Since it does not care, when it is after YOU, your wife and kids, then that's DIFFERENT, right? If you are the subject Mexican wetback, or a pregnant female slave carrying drugs across the border, then that's ok? But when the Mexican drug cartels control autonomous flying killer bots on the border targeting US citizens, then what? I say "Ban the Terminators".


Ah, yes....because "Banning...." would stop the cartels from building them. Where do they find you morons in the "ban it" crowd? The fact is, banning will solve literally nothing. Ban or don't ban -- one group or other that you don't like will make the damn things. You can't legislate morality, decent behavior, etc. On the other hand, I'm confident I can put my faith more readily in some groups than others, and the U.S. military would be right at the top of the "trusted" list.
Vendicar Dickarian
1 / 5 (3) Nov 23, 2012
When I watched a Russian gun fanatic on YouTube attach a machine gun to his personal UAV I was horrified. I couldn't help but think that someday people will be murdered by psychos flying "killer UAVs" from miles away.


So? Millions are murdered daily by psychos ten feet away. What difference could it possibly make to the deceased?
ValeriaT
1.8 / 5 (5) Nov 23, 2012
Is someone here, who just needs the war conflicts for something useful? Why not to ban all weapons with exception of weapons, which should serve for enforcing this law? It would eliminate the war conflicts soon.
Vendicar Dickarian
2 / 5 (4) Nov 23, 2012
...Why not to ban all weapons with exception of weapons....


Not sure I follow...

which should serve for enforcing this law? It would eliminate the war conflicts soon.


As I pointed out earlier, "banning" is a nonsensical waste of time and anyone with a modicum of understanding of human history can point out how it has utterly failed throughout human history. Banning does not, will not, and simply can not work with human beings. Those that would live peacefully may agree to your terms, but your enemies will not ever do so.

Additionally, if you were to ban every advanced weapon on earth this afternoon, and had some method of enforcing this option, the kings, sultans and presidents of the world would have their armies sharpening axes and spears by tomorrow morning. All of which is to say, quit wasting time with this utopian claptrap.
ValeriaT
1.7 / 5 (6) Nov 23, 2012
.."banning" is a nonsensical waste of time ...quit wasting time with this utopian claptrap
So do you recommend banning the disarmament proposals instead?
mountain_team_guy
1.5 / 5 (8) Nov 23, 2012
As I remember, we banned weapons prior to WWII, such as battleships over a certain length and displacement. We didn't ban chemical weapons however, but kept stockpiles as a deterrent. Sure enough, the banned weapons were deployed against us at the start of the war. Chemical weapons remained shelved. Do you think human nature has magically evolved for the better?
kochevnik
1.6 / 5 (7) Nov 23, 2012
the U.S. military would be right at the top of the "trusted" list.
That is comedy. Everyone in Europe knows the Americans come in raining death and destruction in their fucking wars. Then the Europeans have to clean up and restore community relations because they're not all savage animals.
We should be at least as concerned about the availability of killer robots to NGOs such as Al Qaeda and the Aryan Nation.
Don't worry, I'm sure the CIA is supplying them right now to give you something to do in 2022. Remember Hillary said that Al Qaeda are terrorists everywhere but freedom fighters in Syria. Fucking utilitarian.
mondoblu
1 / 5 (2) Nov 24, 2012
Nuclear weapons, landmines and killer bots shall be banned forever, and people building such engines shall be prosecuted for crimes against humanity.
TheGhostofOtto1923
3.4 / 5 (17) Nov 24, 2012
That is comedy. Everyone in Europe knows the Americans come in raining death and destruction in their fucking wars.
Yeah and I for one was pretty pissed off when Russia invaded Afghanistan (no I wasn't)
Then the Europeans have to clean up and restore community relations because they're not all savage animals.
Yes they have had 2 or 3 whole gens to evolve after those wars of theirs where millions died.

You are naive and enjoy posturing a little too much.
grondilu
not rated yet Nov 24, 2012
I've always thought there is something inherently weird in the idea of "regulating war". Somehow, they manage to do it but I wonder how serious this is. When people start to kill one another, it's not a game anymore, so there can't be any "rule". Everything goes. Pretending you can regulate this is just hypocritical, imho.
TheGhostofOtto1923
3.2 / 5 (18) Nov 24, 2012
I've always thought there is something inherently weird in the idea of "regulating war". Somehow, they manage to do it but I wonder how serious this is. When people start to kill one another, it's not a game anymore, so there can't be any "rule". Everything goes. Pretending you can regulate this is just hypocrite, imho.
It's an illusion. Wars always happen on Time and according to Plan. If it is possible to Manage war and predetermine the outcome then to refrain from doing so would be immoral and suicidal. But it IS possible and so that is exactly what is done.
Eric_B
5 / 5 (1) Nov 24, 2012
"fas789 Nov 23, 2012 Rank: 1 / 5 (2) as Adam implied I'm shocked that you able to earn $7328 in one month on the internet. did you look at this page Cloud68.com"

bots that KILL SPAMMERS...PLEASE?
agdido
not rated yet Dec 27, 2012
Simple and easy: The consequences of this will be catastrophic but why care now if we have been doing the same since somebody decided that in name of efficiency replacing humans in every possible activity was cool...or tell me why are so many competitions/games developing robots...who is behind... what's the interest...at the end will be fair for a race of idiots that plan their own obsolescence and replacement...Bye to the carbon computer...
TheGhostofOtto1923
3.7 / 5 (15) Dec 27, 2012
Simple and easy: The consequences of this will be catastrophic but why care now if we have been doing the same since somebody decided that in name of efficiency replacing humans in every possible activity was cool...or tell me why are so many competitions/games developing robots...who is behind... what's the interest...at the end will be fair for a race of idiots that plan their own obsolescence and replacement...Bye to the carbon computer...
In the future only robots will have guns because people by that time will trust them much more than their fellow humans... with just about everything. You know because they will be DESIGNED to be more dependable and trustworthy.

So humans can just sit back and devolve like the Eloi.