Virtual humans, programmed to feel

Professor Stacy Marsella, who develops computer programs that simulate human emotion across a variety of applications, has joint appointments in the College of Science and the College of Computer and Information Science. Credit: Mariah Tauger.

A clenched fist thumps the air to emphasize a point; a sweeping hand signals the array of possibilities; furrowed eyebrows question the veracity of a politician's remarks. These are all examples of the ways we express our emotions while we converse. They're strategies we may spend a lifetime learning, based on our particular cultures and backgrounds. But that doesn't mean they can't be programmed.

Newly appointed Northeastern professor Stacy Marsella is doing just that. His program, called Cerebella, gives virtual humans the same ability to convey emotion through facial expressions, gaze, and gestures as they communicate with other virtual—or even real—humans.

"Normally these virtual human architectures have some sort of perception, seeing the world, forming some understanding of it, and then deciding how to behave," said Marsella, who holds joint appointments in the College of Computer and Information Science and the College of Science. "The trouble is some of these things are very hard to model, so sometimes you cheat."

One way to cheat, Marsella explained, is to infer connections between given utterances and appropriate responses. Once the program knows what words a virtual human will use to respond, it can form a library of associated facial expressions, gaze patterns, and gestures that make sense in conjunction with those words.
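To make that lookup concrete, here is a minimal Python sketch of such a library, assuming invented keyword cues, behavior labels, and a nonverbal_behaviors helper; it illustrates the association idea described above, not Cerebella's actual code.

```python
# A minimal sketch of the "library" approach described above. Every name,
# keyword cue, and behavior label here is invented for illustration; this
# is not Cerebella's implementation, just the lookup idea it describes.

# Nonverbal behavior bundles keyed by communicative function.
BEHAVIOR_LIBRARY = {
    "emphasis":    {"gesture": "fist_thump",    "gaze": "at_listener", "brows": "raised"},
    "possibility": {"gesture": "sweeping_hand", "gaze": "upward",      "brows": "neutral"},
    "doubt":       {"gesture": "none",          "gaze": "at_listener", "brows": "furrowed"},
}

# Hypothetical word cues that signal each communicative function.
KEYWORD_CUES = {
    "must": "emphasis", "never": "emphasis", "always": "emphasis",
    "could": "possibility", "might": "possibility", "perhaps": "possibility",
    "supposedly": "doubt", "allegedly": "doubt",
}

def nonverbal_behaviors(utterance: str) -> list[dict]:
    """Look up facial expressions, gaze, and gestures to pair with the words."""
    behaviors = []
    for word in utterance.lower().split():
        function = KEYWORD_CUES.get(word.strip(",.?!"))
        if function:
            behaviors.append(BEHAVIOR_LIBRARY[function])
    return behaviors

print(nonverbal_behaviors("You must act now, but perhaps we could wait."))
```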

In one version of the program, Cerebella goes further, inferring the deeper meaning behind the spoken words so the virtual human can interpret what an utterance is really doing and respond appropriately.
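As a rough illustration of that kind of inference, the sketch below maps crude surface cues in an utterance to a guessed intent, which then selects a nonverbal response stance; the rules, labels, and infer_intent helper are all hypothetical, far simpler than any real analysis.

```python
# A rough, assumed-for-illustration sketch of inferring an utterance's
# deeper meaning: crude surface rules map text to an intent label, which
# then selects a response stance. Real analysis is far richer than this.

def infer_intent(utterance: str) -> str:
    """Guess the communicative intent behind the words (toy rules)."""
    text = utterance.lower()
    if text.rstrip().endswith("?"):
        return "question"
    if any(cue in text for cue in ("unfortunately", "sadly", "bad news")):
        return "bad_news"
    if any(cue in text for cue in ("great", "wonderful", "good news")):
        return "good_news"
    return "statement"

# Hypothetical nonverbal stances the listener adopts for each intent.
RESPONSE_STANCE = {
    "question":  {"gaze": "at_speaker", "expression": "attentive", "nod": False},
    "bad_news":  {"gaze": "lowered",    "expression": "concerned", "nod": True},
    "good_news": {"gaze": "at_speaker", "expression": "smiling",   "nod": True},
    "statement": {"gaze": "at_speaker", "expression": "neutral",   "nod": True},
}

print(RESPONSE_STANCE[infer_intent("Unfortunately, the tests came back positive.")])
```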

In addition to Cerebella, Marsella's work touches on a broad spectrum of applications at the intersection of emotion and technology. For instance, UrbanSim uses similar techniques to generate large-scale models of human populations. Here, the virtual models of people aren't engaged in the same kind of detailed face-to-face conversation, but they still interact with one another and determine follow-up behaviors based on a theory of mind, a model that allows them to reason about how others in the virtual world will act.

"They're abstract social interactions, where agents are either assisting or blocking each other," Marsella explained. The result gives his program the capacity to simulate whole cities for purposes ranging from city planning to military training.

At Northeastern, Marsella is eager to apply his methods to a range of multidisciplinary collaborative projects. In particular, he's interested in working with the personal health informatics team. "The interactive health interventions are the applications that really interest me," he said.

For another project, he designed a training tool for medical students to develop their patient interaction skills, in which they must navigate difficult conversations with a virtual human embedded with the emotional personality of a real human. One task requires the students to inform the virtual human of his cancer diagnosis.

"We want these interactions to be natural," Marsella said, summing up the underlying goal of almost all his programs.
