Your smartphone is looking at you – but can it read your emotions?

March 12, 2014 by Lisa A Williams and Eliza Bliss-Moreau, The Conversation
What can a smartphone tell by looking at your face? Credit: Flickr/Peter Ras, CC BY-NC-SA

Smartphones can already understand your voice commands but imagine if they tried to read your emotions as well.

What if you asked your phone for details of movies showing at your local cinema and it replied:

Well, you look a bit sad. A rom-com should cheer you up! I'll find local listings now.

Such emotion-aware technology may seem a long way off – but not if Apple or any number of other recent start-ups (Affectiva, Emotient, nViso and Realeyes) have anything to say about it.

Your face is a window to your emotion - so why not capitalise on it (and commercialise it)?

What your face says

The premise is simple – a smartphone will encode a user's facial expressions using the built-in camera. The device will then infer the emotional state of the user and modify its response accordingly:

  • feeling sad? Maybe it's a good time to show the funny advertisement
  • feeling angry? Perhaps the difficulty level of a game should be lowered
  • feeling happy? Show a shopper the products he bought the last time he was happy.
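The response logic described above could be sketched as a simple lookup from an inferred emotional state to an adjusted behaviour. This is purely illustrative – the emotion labels and actions below are taken from the examples in this article, not from any real product's API:

```python
def respond_to_emotion(inferred_emotion: str) -> str:
    """Return a hypothetical device action for an inferred emotional state."""
    actions = {
        "sad": "show a funny advertisement",
        "angry": "lower the game's difficulty level",
        "happy": "show products bought while previously happy",
    }
    # Fall back to no adjustment when the inferred state is unrecognised
    return actions.get(inferred_emotion, "make no adjustment")

print(respond_to_emotion("sad"))
```

Of course, the hard part is not this mapping but the inference step that produces `inferred_emotion` in the first place – which, as the research below suggests, is where the premise runs into trouble.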

The logic of such technology draws support from a widespread belief that we can "read" the emotional states of others by looking at their faces (test yourself on this body language quiz). With just a glance we are often quick to judge whether someone is afraid or disgusted, happy or surprised.

This belief is supported by studies conducted in the mid-20th century by US psychologist Paul Ekman demonstrating that people are able to match faces with emotional content (words that describe emotions, stories about emotions).

While this idea has permeated the scientific literature and popular belief, there is new research that calls these findings, and the general ability to "read" emotion faces, into question.

What is she feeling? The answer isn’t so simple as just reading her face. Credit: Flickr/Andrew Imanaka, CC BY

Ekman-style emotion recognition findings (or the lack thereof) don't actually speak to whether someone's internal state is accurately reflected on the face.

Are you lying to me?

It is possible that people might be able to see happiness in a smile, anger in scowling eyebrows, but that those faces are not consistently made when people experience those emotions. On that front the evidence is exceedingly weak despite the idea – popularised in the television drama Lie to Me – that truth and deception are leaked on to the face.

Take smiling. It's common sense that we smile when we are happy. But we also smile when we are embarrassed or frustrated.

We are less likely to smile when we are alone, even if we are truly happy. And we sometimes make other faces entirely when we are purportedly happy, as Olympic medallists and other elite athletes often do.

Even when people are making very extreme faces, it can be difficult to tell emotional states apart based on the face alone. Facial displays don't appear to correspond to emotions in specific or unique ways as robustly as previously thought.

All of this research stands in the face (pun intended) of what most of us think that we know about perceiving emotion in others. So how is it that we are capable of knowing what someone else feels?

Current thinking in emotion research suggests that multiple sources of information are used when we name the emotion we see in a face.

It's not just in the face

In addition to the cues on the face, information from posture, the voice, the context, and our own past experiences are incorporated into our judgements of others' emotions.

While it may seem like we are simply reading our friend's face when we deem that she is happy, we're doing much, much more than that. And, to check, we would want to ask her how she is feeling.

So is it possible for a smartphone to accurately know when we're filled with pride or trepidation by snapping a picture (or even recording video) of our faces?

Emotion science says "no" – at least for now. Given the evidence, new technologies attempting to use facial displays to infer the emotional states of users are based on a rather shaky premise. On this front, smart devices are not very smart.

Apple appears to be hedging its bets. The US Patent Office is considering an application from Apple detailing an algorithm that integrates device usage patterns (such as switching between applications, websites browsed) and psychophysiological measures (such as heart rate and blood pressure) with facial displays to infer a user's emotional states.
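One way to picture the multi-signal idea in that patent application is as a weighted combination of evidence from the face, from usage patterns and from physiological measures, rather than a judgement from the face alone. The sketch below is entirely hypothetical – the weights, scores and function names are invented for illustration and do not come from Apple's filing:

```python
def fuse_signals(face: float, usage: float, physio: float,
                 weights: tuple = (0.3, 0.4, 0.3)) -> float:
    """Combine per-signal confidence scores (each in [0, 1]) for some
    candidate emotional state into a single weighted score."""
    scores = (face, usage, physio)
    return sum(w * s for w, s in zip(weights, scores))

# The face alone gives only weak evidence of stress, but rapid app
# switching and an elevated heart rate push the combined score higher
confidence = fuse_signals(face=0.4, usage=0.8, physio=0.7)
```

Even granting this design, the fused score is only as good as the evidence feeding it – and the facial component, as argued above, rests on shaky ground.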

Such an algorithm would likely do a better job than the camera alone, but we are a long way from producing unequivocal scientific evidence that supports that ability.

But we need to ask ourselves: do we want our devices attempting to perceive our emotions, even if the scientific evidence suggests they won't be able to do so accurately any time soon?

Do we ever want devices "reading" our emotions or should emotion perception be left to beings that can themselves feel emotions?
