'Emotionsense' determines emotions by phone

Sep 29, 2010

A system which enables psychologists to track people's emotional behaviour through their mobile phones has been successfully road-tested by researchers.

"EmotionSense" uses speech-recognition software and phone sensors in standard to assess how people's emotions are influenced by factors such as their surroundings, the time of day, or their relationships with others.

It was developed by a University of Cambridge-led team of academics, including both psychologists and computer scientists. They will report the first successful trial of the system today at the Association for Computing Machinery's conference on Ubiquitous Computing in Copenhagen.

Early results suggest that the technology could provide psychologists with a much deeper insight into how our emotional peaks - such as periods of happiness, anger or stress - are related to where we are, what we are doing or who we are with.

"Everyone has a mobile phone, so potentially they are a perfect tool if you want to track the behaviour or emotional condition of large numbers of people," Dr. Cecilia Mascolo, from the University of Cambridge's Computer Laboratory, who led the research, said.

"What we are trying to produce is a completely non-intrusive means of achieving that which also respects privacy. In time, it could have an enormous impact on the way in which we study and give psychologists a deeper insight into what it is that makes different types of people tick."

EmotionSense uses the recording devices which already exist in many mobile phones to analyse audio samples of the user speaking. The samples are compared with an existing speech library (known as the "Emotional Prosody Speech and Transcripts Library") which is widely used in emotion and speech research. The library consists of recordings of actors reading a series of dates and numbers in tones representing 14 different emotional categories.

From here, the samples are grouped into five broader categories: "Happy" emotions (such as elation or interest); "Sadness"; "Fear"; "Anger" (which includes related emotions such as disgust); and "Neutral" emotions (such as boredom or passivity).
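In code, the grouping step described above amounts to a simple lookup from fine-grained labels to the five broad categories. The article does not list all 14 fine-grained labels used by EmotionSense, so the specific label names below are illustrative assumptions; only the five broad groups and the examples given (elation, interest, disgust, boredom, passivity) come from the text.

```python
# Illustrative mapping from assumed fine-grained emotion labels to the
# five broad categories named in the article. Label names other than
# those quoted in the article are hypothetical.
BROAD_CATEGORY = {
    "elation": "Happy",
    "interest": "Happy",
    "sadness": "Sadness",
    "panic": "Fear",
    "anxiety": "Fear",
    "hot_anger": "Anger",
    "disgust": "Anger",      # disgust is folded into "Anger" per the article
    "boredom": "Neutral",
    "passivity": "Neutral",
}

def broaden(fine_label: str) -> str:
    """Map a fine-grained emotion label to one of the five broad groups."""
    return BROAD_CATEGORY[fine_label]
```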

The data can then be compared with other information which is also picked up by the phone. Built-in GPS software enables researchers to cross-refer the audio samples with the user's location, Bluetooth technology can be used to identify who they were with and the phone also records data about who they were talking to and at what time the conversation took place.
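The cross-referencing described above can be sketched as a timestamp join: each emotion reading is paired with the nearest-in-time GPS fix and Bluetooth scan. This is a minimal illustration of the idea, not the EmotionSense implementation; the record formats and field names are assumptions.

```python
from bisect import bisect_left

def nearest(records, t):
    """Return the (timestamp, value) record closest in time to t.

    `records` must be a list of (timestamp, value) tuples sorted by timestamp.
    """
    times = [ts for ts, _ in records]
    i = bisect_left(times, t)
    candidates = records[max(0, i - 1):i + 1]
    return min(candidates, key=lambda r: abs(r[0] - t))

def annotate(emotion_events, gps_log, bt_log):
    """Attach the closest GPS fix and Bluetooth scan to each emotion event."""
    out = []
    for t, emotion in emotion_events:
        _, location = nearest(gps_log, t)
        _, nearby = nearest(bt_log, t)
        out.append({"time": t, "emotion": emotion,
                    "location": location, "nearby": nearby})
    return out
```

For example, an emotion reading taken shortly after a GPS fix at home would be annotated with that location and with whichever devices the phone last saw over Bluetooth.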

The software is also set up so that the analysis is carried out on the phone itself. This means that data does not need to be transmitted elsewhere and can be easily discarded after analysis to maintain user privacy.

As reported in their conference paper, the research team tested the effectiveness of the system on a group of 18 volunteers at the University of Cambridge earlier this year.

Each subject was given a modified Nokia 6210 Navigator phone for a period of 10 days. They were also asked to keep a diary in which they recorded their emotional state according to a standard set of questions already used by social and behavioural scientists.

The results showed that in about 70% of cases, the emotional analysis offered by the phone system agreed with the results of the survey, suggesting that with further modification this type of mobile phone technology could be a very accurate means of tracking the factors influencing people's emotions.

The pilot study also threw up some interesting suggestions about how circumstances may affect our emotional state. Location appeared to have a pronounced effect on the users' state of mind. "Happy" emotions dominated the data when they were in residential locations (45% of all emotions recorded), but in workplaces "sad" emotions became the norm (54%).
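A per-location breakdown like the one reported here (45% "Happy" at home, 54% "Sadness" at work) can be produced with a simple aggregation over the annotated records. The record format below is an assumption for illustration.

```python
from collections import Counter, defaultdict

def emotion_shares(records):
    """Compute the fraction of each broad emotion observed per location.

    `records` is an iterable of (location, emotion) pairs.
    """
    counts = defaultdict(Counter)
    for location, emotion in records:
        counts[location][emotion] += 1
    return {
        loc: {emo: n / sum(c.values()) for emo, n in c.items()}
        for loc, c in counts.items()
    }
```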

The researchers also found that users exhibited more intense emotions in the evening than in the morning and that people tended to express their emotions far more in smaller groups than in larger crowds.

The research team is now working to refine the system further, by improving its emotion classification and its response to background noise.

Dr. Jason Rentfrow, a social psychologist at the University of Cambridge who also took part in the research, said: "This technology has the potential to transform the ways in which scientists study psychological states and social behaviour. The methods most often used rely on self-reports, which are subject to a number of limitations - people forget certain details and are sometimes inaccurate at reporting how often they engaged in particular tasks. Mobile sensing technology can overcome those limitations, providing unobtrusive and objective information about social behaviours and activities."

More information: www.cl.cam.ac.uk/research/srg/netos/emotionsense/
