What human emotions do we really want of artificial intelligence?

August 28, 2015 by David Lovell, The Conversation
The challenge in making AI machines appear more human. Credit: Flickr/Rene Passet, CC BY-NC-ND

Forget the Turing and Lovelace tests of artificial intelligence: I want to see a robot pass the Frampton Test.

Let me explain why rock legend Peter Frampton enters the debate on AI.

For many centuries, much thought was given to what distinguishes humans from animals. These days, thought turns to what distinguishes humans from machines.

British codebreaker and computing pioneer Alan Turing proposed "the imitation game" (also known as the Turing Test) as a way to evaluate whether a machine can do something we humans love to do: have a good conversation.

If a human judge cannot consistently distinguish a machine from another human by conversation alone, the machine is deemed to have passed the Turing Test.

Initially, Turing proposed to consider whether machines can think, but realised that, thoughtful as we may be, humans don't really have a clear definition of what thinking is.

Tricking the Turing Test

Maybe it says something about another human quality – deviousness – that the Turing Test came to encourage computer programmers to devise machines that trick the human judges, rather than embody enough intelligence to hold a realistic conversation.

This trickery climaxed on June 7, 2014, when Eugene Goostman convinced about a third of the judges in the Turing Test competition at the Royal Society that "he" was a 13-year-old Ukrainian schoolboy.

Eugene was a chatbot: a computer program designed to chat with humans, or with other chatbots, to somewhat surreal effect.
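
For a flavour of how such programs work, here is a minimal, ELIZA-style chatbot sketch in Python. The keyword rules are invented for illustration and are nothing like Eugene's actual code, but the basic trick is the same: match surface patterns, deflect, and steer the conversation rather than understand it.

```python
import random
import re

# Keyword-triggered canned responses. These rules are invented for
# illustration; a competition chatbot would have thousands of them.
RULES = [
    (r"\bhow are you\b", ["I am fine. And you?", "Great! What about you?"]),
    (r"\byou .*(robot|computer|machine)\b",
     ["Why do you say that?", "Ha! I am just a 13-year-old schoolboy."]),
    (r"\bwhy\b", ["Why not?", "That is a hard question. Ask me another."]),
]
FALLBACKS = ["Interesting. Tell me more.", "Let's talk about something else."]

def reply(utterance: str) -> str:
    """Return a canned response for the first pattern that matches."""
    for pattern, responses in RULES:
        if re.search(pattern, utterance.lower()):
            return random.choice(responses)
    return random.choice(FALLBACKS)

print(reply("Are you a robot?"))   # deflects instead of answering
print(reply("How are you today?"))
```

Part of Eugene's success was framing: presenting the program as a 13-year-old non-native English speaker lowered the judges' expectations of its grammar and general knowledge.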

And critics were quick to point out the artificial setting in which this deception occurred.

The creative mind

Chatbots like Eugene led researchers to throw down a more challenging gauntlet to machines: be creative!

In 2001, researchers Selmer Bringsjord, Paul Bello and David Ferrucci proposed the Lovelace Test – named after the 19th-century mathematician and programmer Ada, Countess of Lovelace – which asks a computer to create something, such as a story or poem.

Computer-generated poems and stories have been around for a while, but to pass the Lovelace Test, the person who designed the program must not be able to account for how it produces its creative works.
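
To see why the bar is set where it is, consider a toy template-based poem generator in Python (the word lists are invented for illustration). Its designer can account for every line it emits, so a program like this fails the Lovelace Test by construction.

```python
import random

# Template poetry: every output is fully explained by these lists, so the
# designer can always account for how a given "poem" was produced.
ADJECTIVES = ["silent", "burning", "hollow", "electric"]
NOUNS = ["moon", "machine", "river", "heart"]
VERBS = ["dreams", "waits", "sings", "rusts"]

def poem(lines: int = 3) -> str:
    """Generate a short 'poem' by filling a fixed template with random words."""
    return "\n".join(
        f"the {random.choice(ADJECTIVES)} {random.choice(NOUNS)} {random.choice(VERBS)}"
        for _ in range(lines)
    )

print(poem())
```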

Mark Riedl, from the School of Interactive Computing at Georgia Tech, has since proposed an upgrade (Lovelace 2.0) that scores a computer in a series of progressively more demanding creative challenges.

This is how he describes the test:

In my test, we have a human judge sitting at a computer. They know they're interacting with an AI, and they give it a task with two components. First, they ask for a creative artifact such as a story, poem, or picture. And secondly, they provide a criterion. For example: "Tell me a story about a cat that saves the day," or "Draw me a picture of a man holding a penguin."

But what's so great about creativity?

Challenging as Lovelace 2.0 may be, it's argued that we should not place creativity above other human qualities.

This (very creative) insight from Dr Jared Donovan arose in a panel discussion with roboticist Associate Professor Michael Milford and choreographer Professor Kim Vincs at Robotronica 2015 earlier this month.

Amid all the recent warnings that AI could one day lead to the end of humankind, the panel's aim was to discuss the current state of creativity and robots. The discussion led to questions about the sort of emotions we would want intelligent machines to express.

Empathy – the ability to understand and share feelings of another – was top of the list of desirable human qualities that day, perhaps because it goes beyond mere recognition ("I see you are angry") and demands a response that demonstrates an appreciation of emotional impact.
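
The gap between the two is easy to show in code. In the toy Python sketch below, recognise() merely labels an emotion while respond_empathetically() acknowledges the feeling and invites more; the keyword list and reply templates are invented for illustration, and real affective-computing systems use trained classifiers rather than keyword lookups.

```python
# Toy keyword-based emotion "recognition" -- invented for illustration only.
KEYWORDS = {"furious": "anger", "angry": "anger",
            "devastated": "sadness", "heartbroken": "sadness",
            "thrilled": "joy"}

def recognise(text: str) -> str:
    """Mere recognition: map an utterance to an emotion label."""
    for word, emotion in KEYWORDS.items():
        if word in text.lower():
            return emotion
    return "neutral"

def respond_empathetically(text: str) -> str:
    """Go a step further: acknowledge the feeling and its impact."""
    templates = {
        "anger": "That sounds really frustrating. What happened?",
        "sadness": "I'm so sorry you're going through that. Do you want to talk?",
        "joy": "That's wonderful news! Tell me more.",
        "neutral": "I see. How do you feel about that?",
    }
    return templates[recognise(text)]

print(recognise("I'm furious about the delay"))               # "I see you are angry"
print(respond_empathetically("I'm furious about the delay"))  # the empathetic step
```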

Hence, I propose the Frampton Test, after the critical question posed by rock legend Peter Frampton in the 1973 song "Do You Feel Like We Do?"

True, this is slightly tongue in cheek, but I imagine that to pass the Frampton Test an artificial system would have to give a convincing and emotionally appropriate response to a situation that would arouse feelings in most humans. I say most because our species has a spread of emotional intelligence levels.

I second that emotion

Others have explored this territory before, and the field of "affective computing" strives to imbue machines with the ability to simulate empathy, yet it is still fascinating to contemplate the implications of emotional machines.

This July, AI and robotics researchers released an open letter on the peril of autonomous weapons. If machines could have even a shred of empathy, would we fear these developments in the same way?

This reminds us, too, that human emotions are not all positive: hate, anger, resentment and so on. Perhaps we should be more grateful that the machines in our lives don't display these feelings. (Can you imagine a grumpy Siri?)

Still, there are contexts where our nobler emotions would be welcome: sympathy and understanding in health care, for instance.

As with all questions worthy of serious consideration, the Robotronica panellists did not resolve whether robots might one day be creative or, indeed, whether we would want that to come to pass.

As for machine emotion, I think the Frampton Test will be even longer in the passing. At the moment the strongest emotions I see around robots are those of their creators.

Comments

Kedas, Aug 28, 2015:
A filter on the emotions will lead to less true communication.
And the main goal of the unit will direct the conversations/actions.
10min, Aug 28, 2015:
Humans behaving emotionally are irrational, annoying, nagging.

Give me a machine without emotions, and I'm fine. Don't waste resources.
James_Morgan, Aug 28, 2015:
In a true AI, I don't think we are going to be able to pick and choose which emotions it has - it's going to be 'alive'.
Many movies cover this subject and all the bad things that could happen when we do.
gkam, Aug 28, 2015:
We are not ruled by rationality as we assume, but are enslaved to the secretions of ductless glands which drive us with feelings and emotions.
antigoracle, Aug 28, 2015:
Two "forces" will drive the evolution of AI, war and sex, neither of which require human emotion and I dare say be better without.
TheGhostofOtto1923, Aug 28, 2015:
"We are not ruled by rationality as we assume, but are enslaved to the secretions of ductless glands which drive us with feelings and emotions."

Here's george trying to appear profound.

How pathetic.
antigoracle, Aug 28, 2015:
"We are not ruled by rationality as we assume, but are enslaved to the secretions of ductless glands which drive us with feelings and emotions."

So, being a pathological liar is not your fault!!
Yep, try again.
dirk_bruere, Aug 29, 2015:
Anger, hate, resentment, greed and a sense of self-pity should make it just like humans.
TopCat22, Aug 29, 2015:
Emotions, both negative and positive, are no longer necessary given the degree of evolution and intellectual progress made by mankind. Evolution created emotions as a way to guide the actions of organisms at a time when there was little knowledge and less prospect of sharing knowledge. It could be that someone born alone on an island could survive without needing to obtain knowledge from another and could create knowledge afresh from their own experiences. If we can (as we likely will soon) program an organism with enough knowledge for it to survive, flourish and reproduce, then emotions would no longer be necessary to guide evolution and survival.
TheGhostofOtto1923, Aug 29, 2015:
"We are not ruled by rationality as we assume, but are enslaved to the secretions of ductless glands which drive us with feelings and emotions."

Here's george trying to appear profound.

How pathetic.

"Virtually all of the research on psychopaths reveals an inner world that is banal, sophomoric, and devoid of the color and detail that generally exists in the inner world of normal people. This goes a long way to explain the inconsistencies and contradictions in their speech..." [and also george's penchant for floods of 1-line t-shirt slogan posts]
TheGhostofOtto1923, Aug 29, 2015:
Psychopaths actually do exhibit a very real form of artificial intelligence.

"Psychopaths use words about emotions the same way people who are color blind use words about colors they cannot perceive. Psychopaths not only learn to use the words more or less appropriately, they learn to pantomime the feeling. But they never HAVE the feeling."

"Oh, indeed, they can imitate feelings, but the only real feelings they seem to have - the thing that drives them and causes them to act out different dramas for effect - is a sort of "predatorial hunger" for what they want. That is to say, they "feel" need/want as love, and not having their needs/wants met is described as "not being loved" by them. What is more, this "need/want" perspective posits that only the "hunger" of the psychopath is valid, and anything and everything "out there," outside of the psychopath, is not real except insofar as it has the capability of being assimilated to the psychopath as a sort of "food.""
