Software enables avatars to reproduce our emotions in real time

Nov 19, 2012 by Cécilia Carron
Credit: 2012 Alain Herzog

(Phys.org)—You move, he moves. You smile, he smiles. You get angry, he gets angry. "He" is the avatar you chose. Faceshift, from EPFL's Computer Graphics and Geometry Laboratory, now offers a software program that could save time for the designers of animated films or video games. Thibaut Weise, founder of the start-up, smiles and nods. On the screen his avatar, a fantasy creature, directly reproduces his gestures. This system could enhance the future of video games or even make video chats more fun.

Only one tool is required: a camera with motion and depth sensors, in the style of the Microsoft Kinect or Asus Xtion, well known to gamers. During its first use, the software needs only ten minutes to recognize the user's face. The user reproduces several basic expressions requested by the program: smile, raise the eyebrows, and so on. "The more movement is incorporated into the program's 50 positions, the more realistic the results," explains Thibaut Weise, creator of the start-up, currently based at the Technopark in Zurich. Then you can slip into the skin of your character and animate it simply by moving. "It's almost like leaving your body to enter that of your avatar," jokes the young entrepreneur.
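
To make the calibration step concrete, here is a minimal Python sketch of the general idea: each expression the user performs is captured as a face scan, and its difference from the neutral face is stored as a "blendshape" offset that the software can later blend. The arrays below are synthetic stand-ins for real depth-camera scans, and none of the names correspond to the actual Faceshift API.

```python
import numpy as np

N_VERTICES = 1000  # toy face mesh size

def build_blendshape_model(neutral, expression_scans):
    """Return (neutral, deltas), where deltas[i] = scan_i - neutral."""
    deltas = np.stack([scan - neutral for scan in expression_scans])
    return neutral, deltas

# Synthetic "scans" of the user performing the requested expressions.
rng = np.random.default_rng(0)
neutral_face = rng.normal(size=(N_VERTICES, 3))
scans = [neutral_face + 0.01 * rng.normal(size=(N_VERTICES, 3)) for _ in range(4)]

neutral_face, deltas = build_blendshape_model(neutral_face, scans)
print(deltas.shape)  # (4, 1000, 3): one offset per captured expression
```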

A virtual character reproduces the same facial expressions as its user, making a video game, a chat, or an animated film both fun and fast to create. Faceshift, an EPFL spin-off, launches its software on the market today.

Saving time for animated films

The challenge for the research team in the Computer Graphics and Geometry Laboratory was to find an algorithm that superimposes the depth data from the camera onto the color image and maps the result to the avatar in a single step. They demonstrated that 3D facial expressions could be reconstructed in real time without using facial markers or complex scanning hardware.
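
One common way to achieve this kind of marker-free, real-time fitting is to express each incoming depth frame as the neutral face plus a weighted sum of the calibrated blendshape offsets, and to solve for those weights every frame. The sketch below illustrates that least-squares step on synthetic data; real systems add regularization, temporal smoothing, and robust correspondence search, so read this as an illustration of the principle rather than the lab's actual algorithm.

```python
import numpy as np

def fit_weights(observed, neutral, deltas):
    """Solve min_w || neutral + sum_i w_i * deltas[i] - observed ||^2."""
    n_shapes = deltas.shape[0]
    A = deltas.reshape(n_shapes, -1).T      # (3V, n_shapes)
    b = (observed - neutral).reshape(-1)    # (3V,)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.clip(w, 0.0, 1.0)             # blendshape weights usually live in [0, 1]

# Example: a frame that is mostly "expression 1" should recover a large w[1].
rng = np.random.default_rng(1)
neutral = rng.normal(size=(1000, 3))
deltas = rng.normal(size=(4, 1000, 3))
frame = neutral + 0.8 * deltas[1] + 0.001 * rng.normal(size=(1000, 3))
print(fit_weights(frame, neutral, deltas).round(2))  # roughly [0, 0.8, 0, 0]
```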

In an animated film or a video game, the facial expressions of characters are defined with a program that permits the movement of different parts of the face, step by step. To simulate anger, for example, it's necessary to knit each eyebrow in two or three clicks, then stretch the mouth down, and so on. With the Faceshift software, the movements and "emotions" of the avatar follow those of the actor, rendering the work more fun and certainly faster. "This new tool can reduce the time needed to make a film by up to 30%," asserts Weise.
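
The tracked weights can then be retargeted: the avatar has its own set of blendshapes (its own smile, frown, and so on), and applying the actor's weights to them deforms the creature's face frame by frame, which is what replaces the click-by-click keyframing described above. The meshes here are again synthetic placeholders, not Faceshift data.

```python
import numpy as np

def animate_avatar(weights, avatar_neutral, avatar_deltas):
    """Return the avatar mesh deformed by one frame of tracked weights."""
    return avatar_neutral + np.tensordot(weights, avatar_deltas, axes=1)

rng = np.random.default_rng(2)
avatar_neutral = rng.normal(size=(500, 3))    # the fantasy creature's neutral face
avatar_deltas = rng.normal(size=(4, 500, 3))  # its own smile, frown, raised brows...
weights = np.array([0.0, 0.8, 0.1, 0.0])      # output of the per-frame tracking step
frame_mesh = animate_avatar(weights, avatar_neutral, avatar_deltas)
print(frame_mesh.shape)  # (500, 3)
```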

It is easy to imagine using this to directly animate the face of one's avatar in a video game. Already in contact with major video game designers, Thibaut Weise believes that the next generation of 3D cameras will enable his company to take off. In the meantime, it offers versions for the general public, integrated into applications such as Skype or online games.
