Kubrick's AI nightmare, 50 years later

April 13, 2018, University of Western Ontario

As David Bowman – the surviving crew member aboard the Discovery One spacecraft in Stanley Kubrick's 2001: A Space Odyssey – disassembles HAL 9000, the sentient computer pleads in an affectless, monotone voice:

"I'm afraid, Dave."

"Dave, my mind is going. I can feel it."

As HAL's consciousness – or rather, his logic – fades, he dies singing Daisy Bell, the first song 'sung' by a real-world computer. With the threat removed, all is seemingly right again.

Celebrating its 50th anniversary this month, Kubrick's masterpiece has cast a shadow over the genre since its premiere. Its influence extends beyond depictions of space and space travel, and beyond obvious heirs such as Star Wars, Alien and Blade Runner.

For example, its effect on our vision of artificial intelligence (AI) is palpable.

Think of Amazon's Alexa, which, like HAL, listens to whatever you say.

But now, five decades later, have we evolved past Kubrick's nightmare of a sentient, threatening machine? How has our understanding of, and relationship to, AI changed? Do we have a reason to fear the machines we program?

For Catherine Stinson, who recently completed a postdoctoral fellowship at Western's Rotman Institute of Philosophy, Kubrick's vision, while much different from the present state of AI, is still a looming threat. The threat, however, is not the machine.

"People thought about AI a lot differently back then, the danger being it was going to be an agent who would act differently than us, with different priorities than what we have," she said.

"That is less the worry now. It's not going to be the one-on-one interactions (with a sentient machine) that we don't know how to deal with. It's going to be something we've put all our evil into, and now it's off doing things that are an extension of the problems of humans – but on a grander scale we couldn't have imagined. It's not so much machines are different from us – it's they are reproducing the problems of humans."

Part of the issue, Stinson explained, is that humans are the ones programming AI. How can humans program ethical machines? Can machines be ethical at all? We see ourselves as competent ethical decision-makers because we decide between right and wrong on a regular basis, she said, relying on an instinct that we know right from wrong in day-to-day situations.

"But in more complicated situations that come up – like self-driving cars – it's really difficult, even for someone who does have training in ethics, to design what the right thing to build into it is," Stinson noted.

For instance, should the car avoid crashing into a pedestrian, even if it is going to lead to the death of the driver? How do you weigh the two different lives at risk? Do you program the car to save the occupants of the vehicle or those with whom it might collide?

"I don't know how to make that kind of decision. I don't know that that decision is something the average person knows how to make. It needs a more careful approach and someone with more expertise needs to be involved. But it's hard to see that there is that need, because everyone thinks they are an expert," Stinson added.
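The weighing problem Stinson describes can be made concrete: any crash-avoidance policy forces a programmer to assign explicit numerical weights to lives before the code can run. A minimal hypothetical sketch in Python (the maneuver names, probabilities and weights are all invented for illustration, not drawn from any real system):

```python
# Toy illustration of the self-driving-car dilemma: a crash-avoidance
# policy must assign explicit weights to lives. All names and numbers
# here are invented for illustration only.

def choose_maneuver(options, occupant_weight=1.0, pedestrian_weight=1.0):
    """Pick the maneuver with the lowest weighted expected harm.

    options: list of (name, expected_occupant_deaths, expected_pedestrian_deaths)
    """
    def weighted_harm(option):
        _, occupants, pedestrians = option
        return occupant_weight * occupants + pedestrian_weight * pedestrians

    return min(options, key=weighted_harm)[0]

options = [
    ("swerve_off_road", 0.9, 0.0),   # likely kills the driver
    ("brake_straight", 0.1, 0.6),    # risks the pedestrian
]

# The "right" answer flips with the weights -- choosing them IS the
# ethical decision, and it must be made explicitly, in advance.
print(choose_maneuver(options))                         # equal weights
print(choose_maneuver(options, pedestrian_weight=2.0))  # pedestrians count double
```

The arithmetic is trivial; the hard part, as Stinson notes, is that someone has to justify the weights.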

Individuals taking engineering and technology courses should be trained in ethics, she added. Barring that, companies working in AI could benefit from an in-house ethicist. Academic institutions are increasingly requiring engineers and computer scientists to take courses that touch on the subject. Although the question of 'ethical machines' is up for debate, the simple fact we can program them to perform acts that are right or wrong involves them in an "ethical game," Stinson said.

"Maybe we could program a machine to do the right thing more often than we would. But is there reason to fear? Sure. There are machines being used in the justice system in the United States, making decisions that maybe aren't the right ones. We're not sure how they are making those decisions and there's no accountability to whose fault it is if they make the wrong decision," she noted.

For sentencing in particular, there are AI programs that help judges decide what the right sentence should be for someone convicted of a crime. The algorithm is designed to make sentencing less biased by taking into account factors from the person's past: the kind of neighbourhood they grew up in, the people they knew, prior arrests, the age of first involvement with police, and so on.

None of those things are neutral pieces of information, Stinson said. Such AI programs have been criticized for reinforcing the stereotypes they were designed to avoid.
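Stinson's point about non-neutral inputs can be illustrated with a toy risk score: if arrest history itself reflects where policing was heaviest, a score built on it inherits that bias no matter how objective the arithmetic looks. A hypothetical sketch (the factors and weights are invented, not taken from any real sentencing tool):

```python
# Toy illustration: a "neutral" risk score built from non-neutral inputs.
# Factors and weights are invented; real tools are far more complex, but
# the structural problem is the same.

def risk_score(prior_arrests, age_first_police_contact, neighbourhood_arrest_rate):
    # Each input looks like a plain number, but all three are shaped by
    # how heavily an area was policed in the first place.
    score = 0.0
    score += 0.5 * prior_arrests
    score += 2.0 if age_first_police_contact < 18 else 0.0
    score += 10.0 * neighbourhood_arrest_rate
    return score

# Two people with identical behaviour, but one grew up in a heavily
# policed neighbourhood: more recorded arrests, earlier police contact.
low_policing = risk_score(prior_arrests=1, age_first_police_contact=25,
                          neighbourhood_arrest_rate=0.05)
high_policing = risk_score(prior_arrests=3, age_first_police_contact=16,
                           neighbourhood_arrest_rate=0.30)

# The score ranks the second person as higher risk purely through
# inputs that encode past enforcement patterns, not behaviour.
```

The formula never mentions race or class, yet the feedback loop between enforcement and data smuggles those patterns in anyway.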

"We don't know what the dangers are. Part of worrying about the dangers is trying to predict what those might be, and to decide on what we value, and what kind of things we want to have happen, for the sake of convenience," Stinson said.

Tim Blackmore, a professor in the Faculty of Information and Media Studies, has taught 2001: A Space Odyssey to students for more than a decade. He echoed Stinson, noting the dangers of AI lie in the human element at play. For him, whatever form it takes in films or books, AI has always been an extension of the human.

"Thinking machines are often portrayed as cognisant of their own existence and aware of existential issues. They are one of the many mirrors humans use to reflect what it is to be human," Blackmore said.

And that's the nightmare.

"Until now, it's been a 'machine that rules the world' kind of nightmare. That comes out of the 1960s and is shaped very much by Vietnam, as well as the idea these mainframes, these big machines, were part of a worldview that was running us into an inhuman, determinist way of living that would lead to genocides," he explained.

But the threat today lies not in our vision of AI as some machine from the future that can outperform or conquer us.

"We much less imagine WALL-E – the helper machine. But that's much more it. It's not the machines that are a problem; it's the humans. People do bad things," Blackmore noted, adding he is nervous about the "helper" we blindly embrace.

"I'm worried about these disks and cylinders or whatever Amazon, Google or Facebook want to jam into our home next. People want this; it's a gadget and it's cool because it's so hard to pick up your mobile phone and type something into it or speak into it. We're going into the trough and we suck that stuff up, and then we're going to have terabytes of data flying into pools where they could be scrubbed for everything. That data can be manipulated by AI agents who will be better and better at looking for how to game beings," he continued.

"How this technology will develop so people can push people around – that is what tends to be bad news for us. The robot uprising is lower on my list."




38 comments


Mark Thomas
5 / 5 (4) Apr 13, 2018
How can humans program ethical machines?


First, humans have to want that outcome. Profit is the motivator for building these machines, so ethics may be at best an afterthought, and at worst, counterproductive. An effective terminator would be worth a hell of a lot of money. There are plenty of people more than willing to turn an army of them loose on their enemies. Ever weakening technological barriers, especially power sources, are why we don't have terminators now. Watch the Boston Dynamics videos if you think it is impossible.

"How this technology will develop so people can push people around – that is what tends to be bad news for us. The robot uprising is lower on my list."

Exactly. The logical approach would be to ban weaponizing these systems, but that is not where the money is. Like so many other things, we will probably have to wait for the deaths to rise to an unacceptable level before any real agreements limiting their use can be forged.
Macrocompassion
1 / 5 (2) Apr 15, 2018
We know right from wrong due to eating from the fruit of the tree of right and wrong knowledge. If we can do it so can our AI devices, and they will not need to be banished from having a free lunch in the Garden of Eden.
TheGhostofOtto1923
3 / 5 (3) Apr 15, 2018
First, humans have to want that outcome. Profit is the motivator for building these machines, so ethics may be at best an afterthought
Right, as if profit is evil and capitalism is evil by extension and communism is the only moral alternative?

Is spacex evil? Are railroads and computers and insulin evil?
we will probably have to wait for the deaths to rise to an unacceptable level
Sure. And we can expect evil capitalists to invent products that routinely kill millions of consumers.

Liberal sloganeer thinking is so painfully 2 dimensional.
jimmybobber
4.5 / 5 (8) Apr 15, 2018
TheGhostOtto1923 you were pretty rude to Mark Thomas. I agree with his post. Mark doesn't come off as anti-capitalist or even liberal. He comes off as educated and logical. You brought politics into the discussion which derails it.
TheGhostofOtto1923
2.3 / 5 (3) Apr 15, 2018
TheGhostOtto1923 you were pretty rude to Mark Thomas. I agree with his post. Mark doesn't come off as anti-capitalist or even liberal. He comes off as educated and logical. You brought politics into the discussion which derails it
1) I know marky mark. You don't. 2) This:
Profit is the motivator for building these machines, so ethics may be at best an afterthought
-is distinctly anticapitalist and liberal. The fact that you dont recognize it, and mistake his veiled invective for informed logic, means you're a typical liberal dupe.

Sloganeering is not informed logic. It is apeshit.
KBK
5 / 5 (3) Apr 15, 2018
Almost as if Otto is being purposely 2 dimensional and black and white, so he can derail arguments and take them down into insult wars. In that way nothing gets done, nothing gets discussed.

Those kinds of people, you pick them up by the scruff of the neck and toss them out the nearest airlock.

Otto, bring your best balanced game to the table, or get lost. As in: you are not acting as if you are worth the time - anyone's time.

Mark is being more right than he is being wrong, and btw, I'm not the arbiter of that view, the given reader is.

History has taught us that the Machiavellian oligarchical are the key long term force over time in human society and culture. They occupy positions from the military, to the religious, to the political and most definitely occupy many positions in the capitalist world as it is, via its nature--which is that of a near perfect fit.

Power and profit in a world with money makes psychopaths and capitalism a perfect fit. As evidenced.
jimmybobber
4.5 / 5 (8) Apr 15, 2018
Exactly KBK. Otto brought insult and politics into a scientific discussion and quickly ended it. I want to respond to Otto but I cannot.
Furthermore Otto. You seem really angry.
Mark Thomas
5 / 5 (4) Apr 15, 2018
1) I know marky mark. You don't.


LOL, not a chance! It is readily apparent to me that you are unable or unwilling to think through issues on your own, so you fall back on the deceptive liberal/conservative dichotomy to mislabel and insult.
TheGhostofOtto1923
not rated yet Apr 16, 2018
otto is being purposely 2 dimensional and black and white, so he can derail arguments and take them down into insult wars
I dunno. I think that categorizing scientists who are working on this tech as bloodthirsty profiteers is pretty 2 dimensional and insulting, while I know that 2D libs like marky mark think it's objectively the high moral ground.

AI robots = terminators is the juvenile mindset. Ooh big dog is so scary! We must pass laws banning them and the guns they must carry!

In truth, robot soldiers can be imbued with the highest of human morals and can be expected to emulate them consistently and dependably, unlike the typical human grunt who may be terrified, confused, furious, and wounded. For this reason robots are the only moral alternative.

But 2D libs see Schwarzenegger with a metallic endoskeleton and wet their pants. And conclude that anyone who would want to develop these things must be evil.
Otto is angry OMG
Naw just disgusted as usual.
TheGhostofOtto1923
not rated yet Apr 16, 2018
History has taught us that the Machiavellian oligarchical are the key long term force over time in human society and culture. They occupy positions from the military, to the religious, to the political and most definitely occupy many positions in the capitalist world as it is, via it's nature--which is that of a near perfect fit
I see... so capitalists have caused all the major wars of history, not fascists and religionists and despots. I didn't know.

Yeah I do feel the urge to insult this pompous ass but I shall restrain myself. 'Macheovellical oligeoarchism'... powerful stuff. Who can argue with big words like dat?
TheGhostofOtto1923
not rated yet Apr 16, 2018
BTW psychopaths are born, not made. They are the true terminators of society.
https://www.cassi...path.htm

https://youtu.be/ZKbZMIP4XUE
"Part man, part machine... Does not feel pity or remorse or fear.. And absolutely will not stop..."

-This is actually the best description of the psychopath yet.

Thanks.
antialias_physorg
5 / 5 (3) Apr 16, 2018
we will probably have to wait for the deaths to rise to an unacceptable level before any real agreements limiting their use can be forged.

The problem I see is that those who would profit from such machines (and those who they put in power) do no longer have a threshold for "number of acceptable deaths". As long as it's not their tropical island they couldn't care less if entire nations get snuffed out (and looking at Syria this seems to be rather demonstrable already - from both sides).

Killer-bots will only be outlawed once they start killing the rich. And with the rich having increasingly control over the wealth of the world they can insulate themselves for a long time until that happens.

(Maybe we should rethink how we wage wars. Let's not have soldiers/tanks/ships/planes/bombs but just dedicated infiltration/hit-teams and bots targeted exclusively at the rich and powerful. I bet we'd have a lot less conflict then.)
Mark Thomas
5 / 5 (1) Apr 16, 2018
those who would profit from such machines (and those who they put in power) do no longer have a threshold for "number of acceptable deaths"


Assuming for the moment they ever had such a threshold, and that is a mighty big assumption, then yes, I agree. However, I was thinking more along the lines of voters in various democracies demanding action.

I am becoming more worried that we are in deep trouble. We seem to be unable to control ourselves and let the morally bankrupt like Trump and Koch Brothers control the U.S. government. The U.S. parallels with the rise and fall of ancient Rome are scary. The Republican-controlled media has done a great job brainwashing the weak-minded like Otto into believing all problems have their source in those darn libs and not corruption.
Mark Thomas
5 / 5 (1) Apr 16, 2018
Otto: "robots are the only moral alternative."

My guess/hope is that you are just trying to be provocative because you are getting lonely in your parents' basement.

Here is one place where your logic breaks down. Just because it MIGHT be possible to make a moral terminator (that would be stretch), that does not mean it is "the only moral alternative."

"But 2D libs see Schwarzenegger with a metallic endoskeleton and wet their pants. And conclude that anyone who would want to develop these things must be evil."

Only a brain-washed Republican like Otto would sympathize with the Terminator and not its victims. Otto, you are going to find this shocking, but the director of those movies intended you to "wet your pants" and conclude the developer (Skynet) is actually evil. Perhaps you should re-watch the entire series with your eyes open this time. You might notice Skynet's goal always includes genocide of the entire human race.
434a
5 / 5 (2) Apr 16, 2018

Let's not have soldiers/tanks/ships/planes/bombs but just dedicated infiltration/hit-teams and bots targeted exclusively at the rich and powerful. I bet we'd have a lot less conflict then.


It all went to shit when we stopped expecting kings to lead armies into battles. Once they worked out they could do it by proxy, war became a whole lot more palatable proposition to them. I jest but only slightly.

I think the terminator scenario is inevitable. As history has proved there is no such thing as a secret when it comes to weapons technology. The most closely guarded of states secrets, the nuclear weapon, is now in the hands of what are deemed to be rogue states (though ask not who does the deeming). Russia knew in real time what the US was doing at Los Alamos. The same will go for AI and anyone that builds one capable of independent killing will have shared that knowledge with all the major (and some minor) powers unwittingly or no.
Mark Thomas
5 / 5 (1) Apr 16, 2018
I believe the solution is laughably naive and simple to state, but totally impossible to implement today in most places. The solution is to redesign all government to reflect the best interests of the governed, not the desires of the rich and powerful. For example, corporations should NOT be allowed to donate to political campaigns because they will naturally promote their own interests over the best interests of the governed. Global warming should be treated seriously to minimize its impact on the best interests of the governed. Teenagers should not be permitted to have assault weapons because that is not in the best interests of the governed. See, it is so simple, but most politicians would laugh their ass off if they heard it because that is NOT how the system works. The rich donors control the Republican Party and the further you stray from their mandates, the more likely you will fall from their favor. The public is easily deceived, just blame libs.
Mark Thomas
5 / 5 (1) Apr 16, 2018
Michio Kaku speculated that if aliens were sophisticated enough to reach us they would have worked to optimize all their social systems. If we assume for the moment he is right, shouldn't we do this too? What exactly is holding us back? I suggest it is the same two things that have always held us back, i.e., corruption and stupidity.

The more subtle question is whether it is possible to become an interstellar-traveling species without this optimization. How you answer that will impact what you believe you are likely to find out there. The galaxy may be a sparse junkyard of failed and stunted civilizations who could not overcome their own corruption and stupidity. Yet another reason to keep working to do better.
TheGhostofOtto1923
not rated yet Apr 16, 2018
The solution is to redesign all government to reflect the best interests of the governed, not the desires of the rich and powerful
Marky mark should keep in mind that while he and his starry-eyed buds may constitute the majority here, they are a distinct minority in the real world. And that the vast majority of us who voted resent being referred to as rich and powerful.
Teenagers should not be permitted to have assault weapons because that is not in the best interests of the governed
If someone is old enough to get married and start a family, he should be legally capable of protecting it. And what your smarter libs won't tell you is that pistols are functionally equivalent to 'assault rifles' shudder
See, it is so simple
Well that's because YOU are so simple Marky. That's obvious.
TheGhostofOtto1923
not rated yet Apr 16, 2018
brain-washed Republican like Otto would sympathize with the Terminator and not its victims
Markey obviously missed the sequel where Arnold came back as the good terminator, and the only thing capable of defeating the bad terminator. That's the way an arms race works Markey. If you don't win you die.
the director of those movies intended you to "wet your pants" and conclude the developer (Skynet) is actually evil
And you would genuinely be distressed to learn that 1) it was just a movie and 2) it was made by ultraliberals with political agendas.

Let me know if you have trouble with either of those 2 realities.
The most closely guarded of states secrets, the nuclear weapon, is now in the hands of what are deemed to be rogue states
But they are in imminent danger if they build them. And they will disappear if they ever use them.

Which is why the superpowers conspired to develop their overwhelming nuclear potential. Same will happen with AI.
Mark Thomas
5 / 5 (1) Apr 16, 2018
Markey obviously missed the sequal where Arnold came back as the good terminator, and the only thing capable of defeating the bad terminator.


OttoBotto, your brain rejects any information that does not fit with your political thinking. If you watched those movies at all you might have noticed that regular humans were dying in droves throughout. The Chinese have an expression, when whales fight, fish get hurt. Yes, they are just movies, but they are also shared culture and perhaps a cautionary tale. Are you trying to argue killer robots could not be built? Watch this video and imagine this robot with a machine gun and terminator head.

https://youtu.be/rVlhMGQgDkY

BTW, the first autonomous aircraft landing on an aircraft carrier was almost 5 years ago and that UAV could carry ~5,000 pounds of munitions. If it can't be built today, give it 10 years.
Mark Thomas
5 / 5 (1) Apr 16, 2018
434a: "I think the terminator scenario is inevitable."

If nothing is done to prevent it, then I think you are probably right. Once the technological barriers are overcome, we can only hope the rich who control the governments will conclude it is not in their self-interest to use them, but that is a pretty flimsy hope. It is going to be very hard for us to develop these weapons and refrain from using them if they could save our own military's lives in a hot war.
Mark Thomas
5 / 5 (1) Apr 16, 2018
Otto: "the vast majority of us who voted resent being referred to as rich and powerful."

LOL! You are not rich and powerful. You have one vote and probably squander it as Hannity and Fox News convinced you to. Did you ever notice that every position they take always ultimately favors the rich, especially their rich donors? Did you ever notice they never make a fairness argument for the middle class, only for some giant corporations?
Da Schneib
5 / 5 (1) Apr 16, 2018
The combination of propaganda and high technology is a deadly one, and may well lead to our extinction. Once propaganda can be used to allow bad actors to co-opt government, it is inevitable that high technology will be used by those bad actors to perpetuate their power.
snoosebaum
5 / 5 (1) Apr 16, 2018
" AI '' is an advertisement for stuff they want to sell us
Da Schneib
5 / 5 (1) Apr 16, 2018
Hidden racism and other types of cultural and religious bigotry guarantee that there will always be a fertile field for propaganda.

Democracy is a wonderful idea except that it assumes that everyone who votes can and will identify and reject propaganda even when it validates their feelings of bigotry and jingoism. There are two ways to do this:
1. Ban propaganda.
2. Get most people to identify and reject propaganda.
Neither has worked so far.
TheGhostofOtto1923
not rated yet Apr 16, 2018
LOL! You are not rich and powerful. You have one vote and probably squander it as Hannity and Fox News convinced you to
And like I say the rich and powerful did not vote this administration in. WE did.
Did you ever notice that every position they take always ultimately favors the rich, especially their rich donors? Did you ever notice they never make a fairness argument for the middle class, only for some giant corporations?
- Funny I thought the tax cuts favored the 90% of us poor plebes who are now paying less. I thought that ousting millions of cheap laborers benefited us poor plebes who need the work.

Funny I thought the libs always favored their rich and powerful backers like Hollywood, the Clintons, and mayor Bloomberg ($50 billion) by overtaxing us plebes and taking our guns away to make them feel safer with their armed guards and all. And also justify an enormous police state bureaucracy paid for with all our money.
TheGhostofOtto1923
not rated yet Apr 16, 2018
The middle class thrives under free market capitalism. It disappears under communism.
1. Ban propaganda.
Yeah I agree. That comey/Papadopoulos interview should never have been aired.
Are you trying to argue killer robots could not be built?
No I'm saying they WILL be built. Cheaply and easily. Swarms of drones and herds of armored bots. And so it behooves the west to build them first and best, in overwhelming numbers.

Taliban soldiers by our perspective are vicious and evil. So would be the bots they field. Our soldiers are by and large ethical and restrained by the rule of law. We can build bots on the same template with the confidence that they would consistently adhere to those laws.

And we can consistently increase this dependability based on field experience, rather than try to do the same with human soldiers through technological widgets and enhancements.
TheGhostofOtto1923
not rated yet Apr 16, 2018
BTW, the first autonomous aircraft landing on an aircraft carrier was almost 5 years ago and that UAV could carry ~5,000 pounds of munitions. If it can't be built today, give it 10 years
"...more than 30 countries have or are developing armed drones, with at least eight countries known to have used them in combat, including the U.S., Israel, U.K., Pakistan, Nigeria, Iran, Iraq and Turkey... Iran – which reportedly has flown drones such as the Shahed-129 over Iraq and Syria – also has been known to export its drone technology, albeit to non-state proxy actors such as Hamas and Hezbollah, and perhaps even a maritime and aerial drones to the Houthis in Yemen.

"Other non-state actors, such as ISIS, have turned to small commercial drones for ISR and even explosives delivery – flying IEDs that all but halted the advance of Iraqi troops into the city of Mosul..."

Flying IEDs... prompting a brand new gen of countermeasures, top-armored vehicles, command facilities...
Da Schneib
5 / 5 (1) Apr 16, 2018
@GOO, the definition of propaganda is not "something you don't want to hear."
Mark Thomas
5 / 5 (1) Apr 16, 2018
No I'm saying they WILL be built. Cheaply and easily. Swarms of drones and herds of armored bots.


Hey, we seem to be mostly in agreement on something. However, I think you are too cavalier about playing with this fire. It is far better we don't go down this path at all if we can avoid it. Even if the U.S. were to program "moral" terminators, you can bet our opponents won't bother to even try. It seems like you don't think there is any weapons technology we should not develop. Attitudes like that will probably mean the end of the human race as technology keeps improving century after century.

Why not focus on colonizing the solar system together instead of developing new ways to kill each other?
TheGhostofOtto1923
1 / 5 (1) Apr 17, 2018
Even if the U.S. were to program "moral" terminators, you can bet our opponents won't bother to even try
So how do we stop north Korea from building millions of IED drones? Nuke them?
Why not focus on colonizing the solar system together instead of developing new ways to kill each other?
Hey why don't we just make bad guys illegal?

You're an idiot.

Hey why don't we just make idiots illegal? I think we're onto something-
@GOO, the definition of propaganda is not "something you don't want to hear."
Of course not. That's YOUR definition. It's why liberal arts universities won't let conservatives give talks. Or if they do the students shout them down.

Just look at how all the libtards here gang up on poor Otto as if he's a monster from hell for pointing out the obvious. It's because your weltanschauung is so fragile and really can't survive a little logic, is why.
antialias_physorg
5 / 5 (2) Apr 17, 2018
Once propaganda can be used to allow bad actors to co-opt government

What do you mean by 'once'? Is three out of three superpowers not enough?
Mark Thomas
5 / 5 (1) Apr 17, 2018
OttoBotto, you are the idiot for not realizing that sooner or later, business as usual will get us all killed. You act like there is no real risk to creating killer robots or any other weapons technology that the human race can dream up, and you are WRONG.

"Hey why don't we just make bad guys illegal?"

Otto, there have always been "bad guys." The key is to resist and contain them as best as we can, not give up and let them run amok. That is why our criminal justice system exists. That is supposedly why the U.S. is armed to the teeth. If our $700B annual defense budget is not enough to contain the bad guys, I want to know why the heck not?!

We have treaties limiting biological and chemical warfare and they have mostly prevented the use of those weapons. Yes, no agreement is perfect and yes, scum-bags like Assad in Syria will ignore them, but most people are not in fear of biological or chemical weapon attack. Similarly, we should consider a treaty banning killer robots.
Da Schneib
not rated yet Apr 17, 2018
Errr, @anti, that's a standard figure of speech in English that you're taking out of context and misrepresenting. I see that a lot from science trolls, but expect better from you. It doesn't matter whether it's already happened or not.
Mark Thomas
5 / 5 (1) Apr 17, 2018
Again, I ask why not focus on colonizing the solar system together instead of developing new ways to kill each other?

I am a lowly member of the governed, not the ruling elite, but I still have the power to vote. As a voter, I want to be inspired by seeing my brothers and sisters on Mars, not disgusted watching them get terrorized and slaughtered in third world countries by killer robots. I want to be a member of advanced race capable of at least interplanetary travel, not a foolish race that destroyed itself. It is not too much to ask.
TheGhostofOtto1923
not rated yet Apr 17, 2018
I am a lowly member of the governed, not the ruling elite, but I still have the power to vote. As a voter, I want to be inspired by seeing my brothers and sisters on Mars
So you want the taliban or north korea waiting for them when they get there? You want to let ISIS develop the tech to rain asteroids down on us?

What makes you think you can just ignore enemies and they will go away?
antialias_physorg
5 / 5 (1) Apr 18, 2018
Errr, @anti, that's a standard figure of speech in English that you're taking out of context and misrepresenting. I see that a lot from science trolls, but expect better from you. It doesn't matter whether it's already happened or not.

I was just being flippant. Don't worry ;)

I see the move towards killer-robots and a complete overwatch state as inevitable. There are no longer any checks and balances (least of all the popular vote).

Mark Thomas
5 / 5 (1) Apr 18, 2018
I see the move towards killer-robots and a complete overwatch state as inevitable.


I am less pessimistic, but I certainly understand the viewpoint. For example, I suspect the dire need for killer robot countermeasures will help drive the development of EMPs and HERFs as both offensive and defensive weapons, leading to even more problems. Again, this ever-escalating cycle of weapons development shows all signs of becoming a serious problem over time. If we don't learn to control ourselves better, we are going to be in huge trouble. We have some positive experience regulating ABC weapons (Atomic, Biological and Chemical), so now it is time to add D (drones) and E (electromagnetic) to the alphabetical bad list.
