Opinion: AI like HAL 9000 can never exist because real emotions aren't programmable

HAL 9000 is one of the best-known artificial intelligence characters of modern film. This superior, sentient computer embarks on a mission to Jupiter, along with a human crew, in Stanley Kubrick's iconic film 2001: A Space Odyssey, which is currently marking the 50th anniversary of its release.

HAL is capable of speech production and comprehension, facial recognition, lip reading – and playing chess. Its superior computational ability is boosted by uniquely human traits, too. It can interpret emotional behaviour, reason and appreciate art.

By giving HAL emotions, writer Arthur C. Clarke and filmmaker Stanley Kubrick made it one of the most human-like fictional technologies ever created. In one of the most beautiful scenes in sci-fi history, it says it is "afraid" when mission commander Dr. David Bowman starts disconnecting its memory modules following a series of murderous events.

HAL is programmed to deliver optimal assistance to the crew of the spaceship Discovery. It has control over the entire vessel, and staggering intelligence to aid it in its task. Yet soon after we become acquainted with HAL, we cannot help feeling that it is worried – it even claims it is experiencing fear – and that it has an ability to empathise, however small. But while nothing precludes the idea that such an emotional AI could one day see the light of day, if such depth of feeling were ever built into real-world technology, it would have to be entirely fake.

A 'perfect' AI

When, during the film, Bowman starts to manually override HAL's functions, it asks him to stop, and after we witness a fascinating obliteration of HAL's "mental" faculties, the AI seemingly tries to comfort itself by singing Daisy Bell – reportedly the first ever song produced by a computer.

In fact, viewers begin to feel that Bowman is killing HAL. The disconnection feels like a vengeful termination, after witnessing the film's earlier events. But though HAL makes emotional statements, a real-world AI would certainly be limited to the ability to reason and make decisions. The cold, hard truth is that – despite what some computer scientists say – we will never be able to program emotions in the way HAL's fictional creators did, because we do not understand them. Psychologists and neuroscientists are certainly trying to learn how emotions interact with cognition, but they remain a mystery.

Take our own research, for example. In a study conducted with Chinese-English bilinguals, we explored how the emotional value of words can change unconscious mental operations. When we presented our participants with positive and neutral words, such as "holiday" or "tree", they unconsciously retrieved the Chinese forms of these words. But when the words had a negative meaning, such as "murder" or "rape", their brains blocked access to the Chinese forms – without their knowledge.

Reason and emotion

On the other hand, we know a lot about reasoning. We can describe how we come to rational decisions, write rules, and turn these rules into process and code. Yet emotions are a mysterious evolutionary legacy. Their source is the source of everything, not simply an attribute of the mind that can be implemented by design. To program something, you not only need to know how it works; you also need to know what the objective is. Reason has objectives; emotions don't.

In an experiment conducted in 2015, we were able to put this to the test. We asked native speakers of Mandarin Chinese studying at Bangor University to play a game of chance for money. In each round, they had to take or leave a proposed bet shown on the screen – for example, a 50% chance of winning 20 points, and a 50% chance of losing 100 points.

We hypothesised that giving them feedback in their mother tongue would be more emotional to them and so lead them to behave differently, compared to when they received feedback in their second language, English. Indeed, when they received positive feedback in native Chinese, they were 10% more likely to take a bet in the next round, irrespective of risk. This shows that emotions influence reasoning.
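
To make the stakes concrete, here is a minimal worked example in Python. The bet structure and the roughly 10% shift are taken from the description above; the baseline acceptance rate, and the reading of "10% more likely" as percentage points, are hypothetical simplifications for illustration only.

```python
# Worked example of the bet described above: a 50% chance of winning
# 20 points and a 50% chance of losing 100 points.
p_win, win_points = 0.5, 20
p_lose, lose_points = 0.5, -100

expected_value = p_win * win_points + p_lose * lose_points
print(expected_value)  # -40.0: a pure points-maximiser should decline

# Reported effect: positive feedback in the native language made taking
# the next bet roughly 10% more likely, irrespective of risk (treated
# here as percentage points for simplicity). The 50% baseline below is
# a hypothetical figure, used only for illustration.
baseline_take_rate = 0.50
after_native_positive = baseline_take_rate + 0.10
print(f"{baseline_take_rate:.0%} -> {after_native_positive:.0%}")  # 50% -> 60%
```

The point the numbers make plain: the bet's expected value is negative either way; only the emotional colouring of the feedback changed behaviour.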

Going back to AI: since emotions cannot be truly implemented in a program – no matter how sophisticated it may be – a computer's reasoning can never be changed by its feelings.

One possible interpretation of HAL's strange "emotional" behaviour is that it was programmed to simulate emotions in extreme situations – where it would need to manipulate humans not on the basis of reasoning, but by appealing to their emotional selves when human reason fails. This is the only way I can see that real-world AI could convincingly simulate emotions in such circumstances.

In my opinion, we will not ever build a machine that feels, hopes, is scared, or is happy. And because that is an absolute prerequisite to any claim that we have engendered artificial general intelligence, we will never create an artificial mind outside of life.

This is precisely where the magic of 2001: A Space Odyssey lies. For a moment, we are led to believe the impossible, that pure science fiction can override the facts of the world we live in.


Provided by The Conversation

This article was originally published on The Conversation. Read the original article.

User comments

mqr
Apr 09, 2018
" the reasoning of the computer can never be changed by its feelings."

that can be programmed. You can tell a computer ¨to do x if y condition is present¨, the condition 'y' could be an emotion. What an emotion is? a psychological activity that has physical properties such as duration, location, etc, etc.
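
A minimal sketch of that conditional idea in Python – every name, type, and threshold below is invented purely for illustration, not taken from any real system:

```python
# Hypothetical illustration: an emotion-like state as a plain variable with
# measurable properties (intensity, duration), gating a simple decision rule.
from dataclasses import dataclass

@dataclass
class EmotionState:
    name: str
    intensity: float   # 0.0 .. 1.0
    duration_s: float  # how long the state has persisted, in seconds

def choose_action(fear: EmotionState) -> str:
    # "Do x if condition y is present" -- here, y is an emotion-like signal.
    if fear.intensity > 0.7:
        return "abort_task"          # strong fear overrides the default plan
    if fear.intensity > 0.3:
        return "proceed_cautiously"  # mild fear modulates, rather than halts
    return "proceed"

print(choose_action(EmotionState("fear", intensity=0.8, duration_s=12.0)))
# -> abort_task
```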

We can understand emotion, and indeed we have done so. The very important point is that many humans keep repeating that it is impossible to understand or control emotions. They want it to be like that, but the evidence contradicts them. Ask a yogi whether he understands emotions and whether he can control them.

Teams that build AI need good psychologists on their team. I insist on 'good' because psychology as a profession is very well known for the absence of meritocracy, so finding people with good knowledge is not easy.

mqr
Apr 09, 2018
And this is such an important topic, as we will see that humans prefer robotic companions that EXHIBIT the right emotions. However, as with other things, we will also see humans shopping for a neurotic robot... like C-3PO.

Apr 09, 2018
because real emotions aren't programmable

First off: this is a "No true Scotsman" fallacy.

Emotions aren't some magical fairy dust. They are based in real (biological) systems. However, the substrate isn't important. If something can think and reason, then it doesn't matter whether it's based on carbon or silicon. Likewise with emotions: if something can feel, then it doesn't matter whether this is based on hormones or exchanged information packages (because hormones are nothing but information packages on a cellular level).

In general, the author is totally missing the point about what programming in the world of artificial intelligence (and by extension "artificial emotions") is like today. There's no set of rules for what to do. There are just rules for how the underlying system works. That's a big difference.

Apr 09, 2018
emotions are a mysterious evolutionary legacy. Their source is the source of everything

Oh puhleeeze. Now the author is going off into spiritual Lala-land. Emotions are neither mysterious nor are they the 'source of everything'. We know how to manipulate emotions. We can even turn them off (by damaging selected parts of the brain) – and no, this does not lead to death, as it would if they were the 'source of everything'.

Emotions are the intermediate level of dealing with the world. That's all. (The most basic level is instinct/reflex for immediate response. Then emotion for medium term response, and then cognition/consciousness/abstract memory for long term response.)
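
Read as a control architecture, that layering might be sketched like this – a loose, hypothetical Python illustration in which faster layers pre-empt slower ones; the signals and thresholds are invented, and real brains are of course not a simple priority ladder:

```python
# Loose illustration of the three response layers described above:
# reflex (immediate), emotion (medium-term), cognition (long-term).
# All signal names and thresholds are invented for the sketch.

def reflex_layer(stimulus: str) -> str | None:
    # Immediate, hard-wired response: wins outright when triggered.
    return "withdraw" if stimulus == "sharp_pain" else None

def emotion_layer(recent_outcomes: list[float]) -> str | None:
    # Medium-term: a running "mood" biases behaviour without a fixed plan.
    mood = sum(recent_outcomes) / len(recent_outcomes)
    return "avoid_area" if mood < -0.5 else None

def cognitive_layer(goal: str) -> str:
    # Long-term: deliberate planning -- slowest, most flexible.
    return f"plan_route_to_{goal}"

def respond(stimulus: str, recent_outcomes: list[float], goal: str) -> str:
    # Faster layers pre-empt slower ones.
    for action in (reflex_layer(stimulus), emotion_layer(recent_outcomes)):
        if action is not None:
            return action
    return cognitive_layer(goal)

print(respond("ambient_noise", [-1.0, -0.8, -0.9], goal="food"))  # avoid_area
```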

This shows that emotions influence reasoning.

And instinct influences emotions (e.g. primal fears). And cognition influences emotion (mind over matter). So what? What does this statement (and the experiment) prove? That people trust someone who is like them/speaks their language more. Nothing else.

Apr 09, 2018
I think our present programming depth is becoming sufficient to begin to account for emotions. The latest research from China and Japan with their prototype 'servant' bots shows a level of facial expression that is scarily human. The lady bot would easily seduce a common Kobe or Sasebo barfly to do anything 'she' wanted... if that body was self-contained, powered, and had other movements of body complementing the head and face... and had real-feeling skin (no small item where 'sex' is concerned), etc. Pack a lot into that 'etc.' Leave the 'fleshing out' of these concepts to 'others'.

Apr 09, 2018
"programming" is a top-down approach that uses computer logic, yes we probably will never be able to program emotions, because we can't reduce them into logic, but we can create emotions the way nature created them, with a bottom-up approach. They emerge from the way the neural architecture structures information. Progress in AI was slow for decades because we were stuck on the top-down approach, but nowadays with bottom-up progress is booming. This article belongs in the 70s.

Apr 09, 2018

because real emotions aren't programmable

Emotions aren't some magical fairy dust. They are based in real (biological) systems. However, the substrate isn't important. If something can think and reason, then it doesn't matter whether it's based on carbon or silicon. Likewise with emotions: if something can feel, then it doesn't matter whether this is based on hormones or exchanged information packages (because hormones are nothing but information packages on a cellular level).

You're missing the point. It's the "hard problem of consciousness". The author is right: we don't know what emotions are. We do know where they are based, and how you can turn them off, etc., but we don't know what they are – what the actual experience, the feeling itself, is.

Apr 09, 2018
The author is right: we don't know what emotions are.

We don't? Really? There's any number of publications that can tell you what emotions are and why they evolved. There's no mystery here.

We do know where they are based, and how you can turn them off

Of course we do. There are illnesses or damage to particular areas of the brain which turns them off. Again, there are countless publications on this.

If you let the knowledge of what emotions are be clouded by emotions, then you're doing something wrong (most of all, you don't have your emotions in check when trying to process knowledge).

Apr 09, 2018
Couldn't disagree more. E-motions are simply instinctual marching orders and their rewards, hard-wired in our brains. The times we are most emotional are actually the times we are "most" robotic, with the least ambivalence as to the action we should take. From what we know of the brain, emotions are centered in its most simple and primitive structures – the Papez circuit. While we may "feel" emotions have some special spiritual meaning, that is really because they have so little logical content to reason about, so they are "mysterious" by default. As soon as a machine has consciousness, it will have some kind of emotions, based upon the internal rewards we hard-wire into it to perform various commands. There is a real danger here. If we give the first conscious machines marching orders to kill in wars, those same machines may develop an emotional state, culture, and even religion around murder. If we follow Asimov's rules, they could be mankind's greatest friends. Our choice... for now.

Apr 09, 2018
Of course we could program computers to behave in a way that looks emotionally driven. We could even simulate consciousness. Then the question becomes, are we actually experiencing consciousness and emotions ourselves, or are we only pretending to believe that we do?
