MIT's Huggable Robot Teddy Enhances Human Relationships

December 17, 2008 by Lisa Zyga
The Huggable robot, developed by researchers at the MIT Media Lab, is designed to enrich long-distance communication between users. Image credit: MIT Media Lab.

(PhysOrg.com) -- It's probably the most sophisticated teddy bear ever designed, but that doesn't stop MIT's companion robot, called "the Huggable," from being pretty adorable as well. The Huggable is the latest project to come from the MIT Media Lab, and could one day be used in healthcare, education, and social communication applications.

As the lab explains, the Huggable is designed to be more than a fun robotic companion. Its main purpose is to enhance human relationships by functioning as a visual tool for long-distance communication. Grandparents who want to talk to young grandchildren, teachers instructing students, or healthcare providers communicating with patients could all enrich their interactions using the robot.

The Huggable features more than 1500 sensors on its skin, along with quiet actuators, video cameras in its eyes, microphones in its ears, a speaker in its mouth, and an embedded PC with 802.11g wireless networking.

"The movements, gestures and expressions of the bear convey a personality-rich character, not a robotic artifact," the MIT Media Lab's Web site explains. "A soft silicone-based skin covers the entire bear to give it a more lifelike feel and heft, so you do not feel the technology underneath. Holding the Huggable feels more like holding a puppy, rather than a pillow-like plush doll."

The Huggable connects to a Web interface that lets the remote person view the person on the other end through the bear's eyes, and also monitor the robot's behaviors, via streaming audio and video. The remote person can control the robot through several features: a grandparent, for instance, can enter text for the robot to speak via speech synthesis, or command the robot to make various sounds, such as giggling. The grandparent can then watch the child's facial reaction on the screen and listen to the response, as well as watch a 3D virtual model of the robot and an animated cartoon that indicates gestures, such as when the robot is being bounced or rocked. Overall, the robot lets the grandparent see and hear the child through the eyes and ears of the Huggable.
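
As a rough illustration of that control path, the sketch below shows the kind of JSON command a remote operator's interface might send to the bear. It is hypothetical, not MIT's actual API: the HuggableCommand fields, the /command endpoint, and the hostname are all invented.

```python
# A hypothetical sketch of the remote-control path: one JSON command
# POSTed to a web endpoint on the bear. Not MIT's actual interface.
import json
from dataclasses import dataclass, asdict
from urllib import request

@dataclass
class HuggableCommand:
    kind: str     # "say" -> speech synthesis, "sound" -> canned audio clip
    payload: str  # the text to speak, or a sound name such as "giggle"

def send_command(host: str, cmd: HuggableCommand) -> None:
    """POST one JSON-encoded command to the robot's web interface."""
    body = json.dumps(asdict(cmd)).encode("utf-8")
    req = request.Request(
        f"http://{host}/command",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        resp.read()  # a real UI would surface errors to the operator

if __name__ == "__main__":
    # A grandparent types a sentence for the bear to speak, then a giggle.
    send_command("huggable.local", HuggableCommand("say", "Hello there!"))
    send_command("huggable.local", HuggableCommand("sound", "giggle"))
```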

The robot can operate in either a fully autonomous or a semi-autonomous mode. The Huggable can be programmed to remember the faces of specific people, and can then track those faces as they move without external control. In semi-autonomous mode, a user steers the robot's head vertically and horizontally with a joystick.
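
A minimal sketch of what such autonomous face tracking could look like, assuming a simple proportional controller; detect_face and set_head are invented placeholders standing in for the bear's camera pipeline and head actuators, not MIT's code:

```python
# A minimal face-tracking sketch: a proportional controller that nudges
# the head so a detected face stays centered in the eye camera's frame.

FRAME_W, FRAME_H = 320, 240  # assumed eye-camera resolution, in pixels
GAIN = 0.05                  # degrees of head motion per pixel of error

def detect_face(frame):
    """Placeholder: return the (x, y) pixel center of a remembered face,
    or None if no known face is in view."""
    return None

def set_head(pan_deg: float, tilt_deg: float) -> None:
    """Placeholder: command the head's pan and tilt actuators."""
    pass

def track_step(frame, pan: float, tilt: float) -> tuple[float, float]:
    """One control step: move the head toward the face, return new angles."""
    face = detect_face(frame)
    if face is not None:
        err_x = face[0] - FRAME_W / 2  # positive: face is right of center
        err_y = face[1] - FRAME_H / 2  # positive: face is below center
        pan += GAIN * err_x
        tilt -= GAIN * err_y           # image y grows downward
        set_head(pan, tilt)
    return pan, tilt
```

In semi-autonomous mode, the joystick would simply replace detect_face as the source of the pan and tilt targets.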

The Huggable was originally based on the concept of therapeutic companion animals, and has important touch-based features. The robot's neural network can recognize nine different classes of touch, such as tickling, poking, and scratching, and each class is further divided into six response types, such as "teasing pleasant" and "punishment light." Based on the response type, the robot interprets the intent of the touch and decides how to respond.
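
That two-stage interpretation can be pictured as a small lookup from an inferred (touch class, response type) pair to a behavior. The sketch below is purely illustrative: the class and response-type labels echo the article, but the behavior strings are invented, and producing the labels is the job of the neural network over the skin's sensor readings.

```python
# Illustrative dispatch from (touch class, response type) to a behavior.
# In the real robot, a neural network over the skin sensors produces the
# labels; the behavior strings here are invented for illustration.

TOUCH_CLASSES = ("tickle", "poke", "scratch")  # three of the nine classes

RESPONSE_BEHAVIOR = {
    ("tickle", "teasing pleasant"): "giggle and wiggle ears",
    ("poke", "punishment light"): "flinch and look toward the touch",
    ("scratch", "teasing pleasant"): "lean into the scratch",
}

def respond(touch_class: str, response_type: str) -> str:
    """Map an inferred (touch class, response type) pair to a behavior."""
    return RESPONSE_BEHAVIOR.get((touch_class, response_type), "idle")

print(respond("tickle", "teasing pleasant"))  # -> "giggle and wiggle ears"
```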

Currently, the MIT Media Lab is working to create a series of Huggables for real-world trials. The Huggable was created using Microsoft Robotics Studio, and the project is supported in part by a Microsoft iCampus grant.

More information: MIT Media Lab

© 2008 PhysOrg.com

7 comments

DGBEACH
1 / 5 (1) Dec 17, 2008
I LIKE THIS! Keep the mechanicals separate from the processing, using 802.11g as the connection between the two. This way your PC handles all of the sensor processing, which is really what takes up most of the processor time, and relays the required "responses" to the smaller on-board processor, which is what makes the bear move.

The next question is..."Windows or Linux?" :)
LariAnn
1 / 5 (1) Dec 17, 2008
Wow, that's a lot like the "supertoy" Teddy in the movie A.I. Who'd have thought such a development would come along so quickly?
MrGrynch
1 / 5 (1) Dec 17, 2008
Still wondering why they have not incorporated synthetic muscles to replace the actuators. Even 'quiet' actuators make noise. There are many electroactive polymers that have been researched as synthetic muscle for robotics.

I have to admit, it's a cute robot!
RFC
1.5 / 5 (2) Dec 17, 2008
Yes, cute.... until put to the nefarious purposes for which it is truly intended...

Muhahahahahahhaha!!!

"Hi, I'm Teddy! Wanna play!"
Under_Educated
4 / 5 (1) Dec 17, 2008
That picture of the technology in the robot is a little scary, it looks like someone took a knife to its skin. I wonder how it reacted to that. ;)
freemind
not rated yet Dec 18, 2008
joys of technology
Wicked
not rated yet Dec 21, 2008
Creepy Uncle Larry got me one for my birthday.
