Your Next Computer May Know How You Feel

April 14, 2010

(PhysOrg.com) -- Friends, loved ones and pets can sense your mood almost instantly - and one day your computer may be able to do so nearly as fast.

UT Dallas scientist Yang Liu has received a three-year, $350,000 grant from the highly competitive Air Force Office of Scientific Research's Young Investigator Research Program to explore emotion recognition and modeling in speech processing.

“The next-generation human-computer interaction interfaces will be more human-centered and socially intelligent,” Liu said. “They’ll have the ability to detect changes in the user’s affective behavior and thus initiate interactions accordingly. Automatic recognition of emotion plays an important role in developing future intelligent systems.”

Emotion is associated with various physical indicators, including facial expression, tone of voice, word usage and movement. Liu and a team of graduate students will focus primarily on emotion recognition and modeling in speech.

They’ll study features such as pitch, intonation patterns and word usage, and then associate those with emotions such as anger, sadness, happiness, surprise and frustration. Other efforts to gauge emotion from speech have achieved accuracy rates of 60 to 80 percent, and Liu hopes to improve upon those numbers.
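
To make the idea concrete, here is a minimal sketch of that kind of pipeline, pairing one simple acoustic feature (pitch statistics) with an off-the-shelf classifier. It is an illustration only, not Liu's actual system; the librosa and scikit-learn libraries and the labeled corpus `clips` are assumptions introduced for the example.

```python
# A minimal sketch, NOT the researchers' actual pipeline. Assumes the librosa and
# scikit-learn libraries and a hypothetical labeled corpus `clips`: a list of
# (wav_path, emotion_label) pairs, e.g. [("clip_001.wav", "anger"), ...].
import numpy as np
import librosa
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

def pitch_features(wav_path):
    """Summarize one utterance's pitch contour as a small feature vector."""
    audio, sr = librosa.load(wav_path, sr=16000)
    f0 = librosa.yin(audio, fmin=50, fmax=400, sr=sr)  # frame-level pitch estimates in Hz
    f0 = f0[np.isfinite(f0)]
    return np.array([f0.mean(), f0.std(), f0.min(), f0.max()])

def train_emotion_classifier(clips):
    """Train and score a simple classifier on pitch statistics alone."""
    X = np.vstack([pitch_features(path) for path, _ in clips])
    y = np.array([label for _, label in clips])
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25)
    clf = SVC().fit(X_train, y_train)  # off-the-shelf support vector classifier
    print("held-out accuracy:", clf.score(X_test, y_test))
    return clf
```

A working system would combine many more acoustic and lexical features, and far larger labeled corpora, than this toy example uses.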

“Automatic recognition of emotions with high accuracy still remains an elusive goal,” she said.

But her research adds a cultural component.

“We’re interested in studying the cross-lingual aspects of emotion in English and other languages, such as Chinese,” Liu said. “This way we can look for the influence of culture and language in emotions.”

She’s doing the research in collaboration with several other UT Dallas faculty who are working in similar areas.

The research could drive a wide range of applications. A tutoring system, for example, could detect frustration or boredom in a student - a sign the student is not learning and a different approach is needed - perhaps triggering the application to slow down the lesson or load a different one. An interactive voice-response system that detects anger or frustration in a customer might transfer that person to a human operator. An emotion component could be added to a polygraph, or lie-detector, system used by law enforcement. And such technology could assist in non-pharmacological treatment of social anxiety disorders.
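
The voice-response example boils down to simple routing logic once a recognizer is available. The sketch below is purely hypothetical: `predict_emotion` stands in for a trained recognizer, and the confidence threshold is arbitrary.

```python
# Hypothetical sketch of the interactive voice-response example above.
# `predict_emotion` stands in for a trained recognizer that returns an
# emotion label and a confidence score; the 0.7 threshold is arbitrary.
def route_call(audio_segment, predict_emotion, transfer_to_agent, continue_menu):
    label, confidence = predict_emotion(audio_segment)
    if label in ("anger", "frustration") and confidence > 0.7:
        transfer_to_agent()   # escalate to a human operator
    else:
        continue_menu()       # keep the caller in the automated menu
```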

Liu first became interested in speech and language processing as an electrical engineering undergrad at Tsinghua University in Beijing. She joined UT Dallas in 2005 as an assistant professor in the Erik Jonsson School of Engineering and Computer Science after completing postdoctoral work at the International Computer Science Institute (where she also conducted most of her PhD research) in Berkeley, Calif. She received her PhD in electrical and computer engineering from Purdue University in 2004. Her other research interests include speech summarization of meetings, spoken dialogue systems, natural language processing, and machine learning and data mining.

Related Stories

Can't Make it to a Meeting? Send a Computer Instead

August 6, 2009

(PhysOrg.com) -- If you’ve ever wished you had an assistant to attend meetings with you, take notes and produce a concise summary, then you’ll be pleased to know that UT Dallas computer scientist Yang Liu hopes to one-up ...

Smart lighting within reach

January 4, 2007

It's not often that an engineer finds inspiration for their research at the ballet. But for University of Queensland graduate Aaron Tan, the theatre was the perfect place to start his search for smarter lighting design.

Google developing a translator for smartphones

February 9, 2010

(PhysOrg.com) -- Google is developing a translator for its Android smartphones that aims to almost instantly translate from one spoken language to another during phone calls.

Can you see the emotions I hear? Study says yes

May 14, 2009

By observing the pattern of activity in the brain, scientists have discovered they can "read" whether a person just heard words spoken in anger, joy, relief, or sadness. The discovery, reported online on May 14th in Current ...

1 comment

danielz
Apr 14, 2010
I really love what technology can do, but sometimes I really wonder if it's wise...

Daniel.
www.scrabblecheat.org
