Meet Nexi, MIT Media Lab's latest robot and Internet star

Apr 10, 2008

A new experimental robot from the MIT Media Lab can slant its eyebrows in anger or raise them in surprise, and it displays a wide assortment of facial expressions to communicate with people in human-centric terms. Called Nexi, it is aimed at a range of applications for personal robots and human-robot teamwork.

Nexi has become something of an Internet celebrity after a preliminary video demonstration of its facial expressions using pre-scripted movements was posted this month on YouTube. The spot has been accessed more than 70,000 times, and viewers have reacted with comments ranging from awe and bemusement ("This robot seems more humane then most humans") to shock and alarm ("Creepy. Very creepy").

Created by a group headed by the Media Lab's Cynthia Breazeal, who is known for earlier expressive robots such as Kismet, the new robot is known as an MDS (mobile, dexterous, social) robot. Unlike Kismet, which consisted only of a robotic head, the Nexi MDS is a complete mobile manipulator robot augmented with rich expressive abilities.

It is designed ultimately to ride on self-balancing wheels like those of the Segway transporter, but at this early stage of development it uses an additional set of supporting wheels to operate as a statically stable platform. It has hands to manipulate objects, eyes (video cameras), ears (an array of microphones), and a 3-D infrared camera and laser rangefinder to support real-time tracking of objects, people and voices, as well as indoor navigation.

The development of Nexi was led by the MIT Media Lab's Personal Robots Group in collaboration with Prof. Rod Grupen at the University of Massachusetts-Amherst and two MIT robotics spin-off companies. The project was originally funded by an Office of Naval Research Defense University Research Instrumentation Program (DURIP) award to develop a novel class of robots that can engage in sophisticated forms of peer-to-peer teamwork with humans in uncertain environments.


A recent ONR Multidisciplinary University Research Initiative (MURI) award, for which Breazeal is the principal investigator, aims to develop technologies and demonstrations for teams of humans and autonomous aerial robots, in addition to the MDS robots. Several MIT faculty members are part of the MURI effort (Nick Roy and Jon How in Aeronautics and Astronautics, and Deb Roy at the Media Lab), along with collaborators at Stanford, Vanderbilt, UMass-Amherst and the University of Washington.

Source: MIT
