Beer-pouring robot programmed to anticipate human actions (w/ Video)

May 28, 2013

A robot in Cornell's Personal Robotics Lab has learned to foresee human action in order to step in and offer a helping hand, or more accurately, roll in and offer a helping claw.

Understanding when and where to pour a beer, or knowing when to offer assistance opening a refrigerator door, can be difficult for a robot because of the many variables it encounters while assessing the situation. A team from Cornell has created a solution.

Gazing intently with a Kinect 3-D camera and using a database of 3-D videos, the Cornell robot identifies the activities it sees, considers what uses are possible with the objects in the scene and determines how those uses fit with the activities. It then generates a set of possible continuations into the future – such as eating, drinking, cleaning or putting away – and finally chooses the most probable. As the action continues, the robot constantly updates and refines its predictions.
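
To make that pipeline concrete, here is a minimal Python sketch of the anticipate-then-refine idea. The affordance table, activity templates and scoring heuristic below are illustrative assumptions, not the Cornell system's actual model, which the study formulates probabilistically.

```python
# Illustrative sketch only: anticipate a likely "big" activity from observed
# sub-activities plus the affordances of objects in the scene. The tables and
# the scoring rule are assumptions for demonstration, not the Cornell model.

from collections import defaultdict

# Hypothetical affordances: what each object type can be used for.
AFFORDANCES = {
    "cup":    {"drinking", "pouring", "placing"},
    "bottle": {"pouring", "placing"},
    "cloth":  {"cleaning", "placing"},
}

# Hypothetical activity templates: sequences of (sub-activity, affordance used).
ACTIVITY_TEMPLATES = {
    "drinking":     [("reach", "drinking"), ("move", "drinking"), ("drink", "drinking")],
    "pouring":      [("reach", "pouring"), ("move", "pouring"), ("pour", "pouring")],
    "cleaning":     [("reach", "cleaning"), ("wipe", "cleaning")],
    "putting_away": [("reach", "placing"), ("move", "placing"), ("place", "placing")],
}

def anticipate(observed_subactivities, objects_in_scene):
    """Score each candidate continuation and return the most probable one."""
    usable = set()
    for obj in objects_in_scene:
        usable |= AFFORDANCES.get(obj, set())

    scores = defaultdict(float)
    for activity, template in ACTIVITY_TEMPLATES.items():
        # Rule out activities whose affordance no object in the scene supports.
        if template[0][1] not in usable:
            continue
        # Count how many observed sub-activities match the template prefix.
        matched = 0
        for seen, (step, _affordance) in zip(observed_subactivities, template):
            if seen != step:
                break
            matched += 1
        # Heuristic score: fraction of the template already observed.
        scores[activity] = matched / len(template)

    return max(scores.items(), key=lambda kv: kv[1]) if scores else (None, 0.0)

# After seeing a person reach for and move a cup, several activities are still
# plausible; re-running this as more sub-activities arrive refines the guess.
print(anticipate(["reach", "move"], ["cup", "bottle"]))
```

Re-running the scoring each time a new stretch of the Kinect stream is labeled is what "constantly updates and refines its predictions" would look like in this toy version.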

"We extract the general principles of how people behave," said Ashutosh Saxena, Cornell professor of and co-author of a new study tied to the research. "Drinking coffee is a big activity, but there are several parts to it." The robot builds a "" of such small parts that it can put together in various ways to recognize a variety of big activities, he explained.

Saxena and Cornell graduate student Hema S. Koppula will present their research at the International Conference on Machine Learning, June 18-21 in Atlanta, and at the Robotics: Science and Systems conference, June 24-28 in Berlin, Germany.

In tests, the robot made correct predictions 82 percent of the time when looking one second into the future, 71 percent correct for three seconds and 57 percent correct for 10 seconds.

"Even though humans are predictable, they are only predictable part of the time," Saxena said. "The future would be to figure out how the robot plans its action. Right now we are almost hard-coding the responses, but there should be a way for the robot to learn how to respond."

User comments: 2

grondilu
5 / 5 (1) May 28, 2013
I can already hear bar tenders shouting "they took our jobs".
ValeriaT
5 / 5 (1) May 28, 2013
What will happen when the robot doesn't anticipate the human action well?