Teachable moments: Robots learn our humanistic ways

Mar 22, 2013 by Bill Steele
For easy calculation, a robot simplifies the image of a human to a skeleton of dots and lines. The volume around the hands is examined to see what object the human is using. Credit: Saxena Lab

(Phys.org) — Robots can observe human behavior and—like a human baby—deduce a reasonable approach to handling specific objects.

Using new algorithms developed by Cornell researchers, a robot saw a human eating a meal and then cleared the table without spilling liquids or leftover foods. In another test, the robot watched a human taking medicine and fetched a glass of water. After seeing a human pour milk into cereal, the robot decided—on its own—to return the milk to the refrigerator.

Ashutosh Saxena, assistant professor of computer science, graduate student Hema Koppula and colleagues describe their new methods in "Learning Human Activities and Object Affordances from RGB-D Videos," in a forthcoming issue of the International Journal of Robotics Research. Saxena's goal is to develop learning algorithms to enable robots to assist humans at home and in the workplace.

Saxena's team previously programmed robots to identify human activities and common objects, using a Microsoft Kinect 3-D camera. They have put those capabilities together to enable a robot to discover what to do with each object. For example, a water pitcher is "pourable," while a cup is "pour-to." Both are "movable."
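One way to picture these affordance labels is as a simple lookup from objects to the actions they support. The sketch below is a hypothetical Python representation for illustration only, not the lab's actual data structure; it uses just the labels named above.

# Hypothetical illustration of object affordances as a plain Python mapping.
# Only the labels mentioned in the article are used; the structure is assumed.
AFFORDANCES = {
    "pitcher": {"pourable", "movable"},
    "cup": {"pour-to", "movable"},
}

def objects_with(label, table=AFFORDANCES):
    """List the objects whose affordance set contains the given label."""
    return sorted(obj for obj, labels in table.items() if label in labels)

print(objects_with("movable"))   # ['cup', 'pitcher']
print(objects_with("pourable"))  # ['pitcher']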

After observing a person making cereal, left, a robot puts the milk away, right. Credit: Saxena Lab

A robot learns by analyzing video of a human handling an object. It treats the body as a skeleton, with the 3-D coordinates of joints and the angles made by limbs tracked from one frame to the next. The sequence is compared with a database of activities, using a probabilistic model that adds up all the probabilities of a match for each step. This allows for the possibility that part of the activity might be hidden from view as the person moves around.
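As a rough illustration of that scoring idea, a match probability can be accumulated frame by frame over the tracked joint coordinates, with occluded joints simply skipped. This is a simplified sketch under assumed conventions (joint names, a Gaussian per-frame score, the sigma value), not the published Cornell model.

# Simplified sketch of matching a tracked skeleton sequence against a stored
# activity template. The per-frame Gaussian scoring is an assumption for
# illustration; the real model is more sophisticated.

def frame_log_prob(observed, template, sigma=0.1):
    """Log-likelihood of one observed frame given a template frame.
    Each frame maps joint name -> (x, y, z); None marks a joint hidden from view."""
    logp = 0.0
    for joint, ref in template.items():
        obs = observed.get(joint)
        if obs is None:
            continue  # this part of the activity is hidden: no evidence either way
        dist2 = sum((o - r) ** 2 for o, r in zip(obs, ref))
        logp += -dist2 / (2.0 * sigma ** 2)
    return logp

def sequence_score(observed_frames, template_frames):
    """Add up the per-frame scores across the whole sequence."""
    return sum(frame_log_prob(o, t) for o, t in zip(observed_frames, template_frames))

# Toy example with a single "hand" joint over two frames, the second occluded:
observed = [{"hand": (0.0, 1.0, 0.5)}, {"hand": None}]
template = [{"hand": (0.0, 1.0, 0.5)}, {"hand": (0.1, 1.0, 0.5)}]
print(sequence_score(observed, template))  # 0.0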

The robot "trains" by observing several different people performing the same activity, breaking it down into sub-activities like reaching, pouring or lifting, and learns what the various versions have in common. Observing a new person, it computes the probability of a match with each of the activities it has learned about and selects the best match.

At the same time, the robot examines the part of the image around the skeleton's hands and compares it to a database of objects, to build a list of activities associated with each object. That's how it knows, for example, that cups and plates are usually held level to avoid spilling their contents.
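In the same spirit, the association between objects and activities can be imagined as a table that grows every time an object is spotted near the hands during a recognized sub-activity. The code below is an assumed, minimal bookkeeping sketch with invented activity labels, not the actual system.

from collections import defaultdict

# Assumed bookkeeping sketch: count which sub-activities co-occur with each
# object detected near the skeleton's hands.
object_activities = defaultdict(lambda: defaultdict(int))

def record(detected_object, sub_activity):
    """Note one observation of an object being used in a sub-activity."""
    object_activities[detected_object][sub_activity] += 1

# Toy observations of the kind the robot might accumulate while watching people:
for obj, act in [("cup", "pouring"), ("cup", "drinking"),
                 ("cup", "pouring"), ("plate", "moving")]:
    record(obj, act)

# The most commonly observed use of a cup so far:
cup_uses = object_activities["cup"]
print(max(cup_uses, key=cup_uses.get))  # 'pouring'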

"It is not about naming the object," Saxena said. "It is about the robot trying to figure out what to do with the object."

Neuroscientists, he noted, suggest that human babies learn in the same way. "As babies we learn to name things later in life, but first we learn how to use them," he explained. Eventually, he added, a robot might be able to learn how to perform the entire human activity.

In experiments, the robot can figure out the use of the object correctly 79 percent of the time. If the robot has observed humans performing different activities with the same object, it performs new tasks with that object with 100 percent accuracy. As assistive robots become common, Saxena said, they may be able to draw on a large common database of object uses and learn from each other.

