Face science meets robot science

July 5, 2011

Your brain processes many tiny and subtle cues about faces whenever you interact with other people, and now scientists from Queen Mary, University of London and UCL (University College London) are investigating whether robots and computers can learn to do the same thing.

The team will showcase their work as part of the annual exhibition, which runs from 5 to 10 July 2011. Visitors will be able to see how the brain understands faces, find out what their own faces would look like if they switched gender, watch facial motions being transferred from one person's face to another, and see state-of-the-art computer vision systems that can recognise facial expressions.

Professor Peter McOwan, from the School of Electronic Engineering and Computer Science at Queen Mary, University of London, explains: "We will be showing some of the latest research from the EU-funded LIREC project, which aims to create socially aware companion robots and graphical characters. There will be the opportunity for those attending to see if our computer vision system can detect their smiles, watch the most recent videos of our robots in action and talk to us about the project."
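The article does not describe how the project's smile detector actually works, but the general idea can be conveyed with a minimal sketch using OpenCV's stock Haar cascades. This is an illustrative assumption, not the LIREC system itself; the cascade files ship with OpenCV, while the input image name is hypothetical.

```python
# Minimal smile-detection sketch using OpenCV's bundled Haar cascades.
# This is NOT the LIREC project's vision system; it only illustrates the
# general idea of finding a face and then looking for a smile inside it.
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

def detect_smiles(image_path):
    """Return bounding boxes (x, y, w, h) of smiles found inside detected faces."""
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    results = []
    # Find faces first, then search for a smile only within each face region.
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        roi = gray[y:y + h, x:x + w]
        for (sx, sy, sw, sh) in smile_cascade.detectMultiScale(roi, scaleFactor=1.7, minNeighbors=20):
            results.append((x + sx, y + sy, sw, sh))
    return results

if __name__ == "__main__":
    print(detect_smiles("face.jpg"))  # "face.jpg" is a hypothetical input image
```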

Understanding how facial movement breaks down into elementary facial actions, and how those actions vary between people, would help computer scientists both to analyse facial movement and to build realistic motion into avatars, making avatars more acceptable to people as channels of communication.
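As a hypothetical illustration of what "breaking facial movement into elementary actions" can mean computationally, the sketch below expresses one frame of landmark motion as a weighted mix of a small set of basis actions and then rebuilds it, as one might to drive an avatar. The basis vectors and landmark counts are invented stand-ins, not measured data or the researchers' actual method.

```python
# Toy decomposition of facial motion into elementary actions.
# A frame's landmark displacements are modelled as a weighted combination of
# a few "basis" actions (e.g. brow raise, smile, jaw drop). The basis here is
# random placeholder data purely for illustration.
import numpy as np

N_LANDMARKS = 68   # a common face-landmark count (assumption)
N_ACTIONS = 5      # number of elementary actions in this toy basis

rng = np.random.default_rng(0)
# Each column is one elementary action: the displacement it produces at
# every landmark (x and y coordinates stacked into one vector).
action_basis = rng.normal(size=(2 * N_LANDMARKS, N_ACTIONS))

def decompose(frame_displacement):
    """Least-squares fit: how strongly is each elementary action active?"""
    weights, *_ = np.linalg.lstsq(action_basis, frame_displacement, rcond=None)
    return weights

def resynthesise(weights):
    """Rebuild the facial motion from action weights, e.g. to drive an avatar."""
    return action_basis @ weights

# Toy usage: a frame that is mostly action 1 plus a little of action 3.
true_weights = np.array([0.0, 1.0, 0.0, 0.3, 0.0])
frame = action_basis @ true_weights
print(np.round(decompose(frame), 2))  # recovers approximately [0. 1. 0. 0.3 0.]
```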

Professor McOwan adds: "Robots are going to increasingly form part of our daily lives – for instance, robotic aids used in hospitals or, much further down the road, sophisticated machines that we will have working in our homes. Our research aims to develop software, based on biology, that will allow robots to interact with humans in the most natural way possible: understanding the things we take for granted, like personal space, or reacting to an overt emotion such as happiness."

Co-researcher Professor Alan Johnston, from the UCL Division of Psychology and Language Sciences, added: "A picture of a face is just a frozen sample drawn from a highly dynamic sequence of movements. Facial motion transfer onto other faces or average avatars provides an extremely important tool for studying dynamic face perception in humans as it allows experimenters to study facial motion in isolation from the form of the face."
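A minimal sketch of the motion-transfer idea Professor Johnston describes, under the assumption of landmark-based tracking: subtract the source face's neutral pose to isolate its motion, then add that motion onto a different face or an average avatar. The landmark arrays below are invented placeholders, not data from the study.

```python
# Hypothetical facial motion transfer: the movement of a source face (its
# displacement from a neutral pose) is re-applied to a different target face,
# so the motion can be studied separately from facial form.
import numpy as np

def transfer_motion(source_frames, source_neutral, target_neutral):
    """Apply the source face's motion to the target face.

    source_frames : (T, L, 2) tracked landmark positions over T frames
    source_neutral: (L, 2) the source face at rest
    target_neutral: (L, 2) the target face (or average avatar) at rest
    """
    displacements = source_frames - source_neutral  # motion only, form removed
    return target_neutral + displacements           # same motion on the new face

# Toy usage with 3 frames of 4 landmarks (random stand-in data).
rng = np.random.default_rng(1)
src_neutral = rng.uniform(size=(4, 2))
tgt_neutral = rng.uniform(size=(4, 2))
src_frames = src_neutral + 0.01 * rng.normal(size=(3, 4, 2))  # small jitter as "motion"
print(transfer_motion(src_frames, src_neutral, tgt_neutral).shape)  # (3, 4, 2)
```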

Co-researcher Professor Cecilia Heyes, from All Souls College, University of Oxford, points out: "This technology has all kinds of great spin-offs. We're using it to find out how people imitate facial expressions, which is very important for rapport and cooperation, and why people are better at recognizing their own facial movements than those of their friends – even though they see their friends much more often than their own."
