Robots learn to take a proper handoff by following digitized human examples

May 20, 2013

A humanoid robot can receive an object handed to it by a person with something approaching natural, human-like motion thanks to a new method developed by scientists at Disney Research, Pittsburgh in a project partially funded by the International Center for Advanced Communication Technologies (interACT) at Carnegie Mellon University and Karlsruhe Institute of Technology (KIT).

Recognizing that a person is handing something over, and predicting where the human plans to make the handoff, is difficult for a robot. The researchers from Disney and KIT solved the problem by using motion capture data of two-person handoffs to create a database of human motion. By rapidly searching the database, a robot can recognize what the human is doing and make a reasonable estimate of where he is likely to extend his hand.

The researchers presented their findings at the IEEE International Conference on Robotics and Automation (ICRA) in Karlsruhe, Germany, where their paper was nominated for a Best Cognitive Robotics Paper Award.

People handing a coat, a package or a tool to a robot will become commonplace if robots are introduced to the workplace and the home, said Katsu Yamane, a senior research scientist at Disney Research, Pittsburgh. But the technique he developed with Marcel Revfi, an interACT exchange student from KIT, could apply to any number of situations where a robot needs to synchronize its motion with that of a human, such as in a dance.

In the case of accepting a handoff, it is not sufficient simply to develop a technique that enables the robot to efficiently find and grasp the object. "If a robot just sticks out its hand blindly, or uses motions that look more robotic than human, a person might feel uneasy working with that robot or might question whether it is up to the task," Yamane explained. "We assume human-like motions are more user-friendly because they are familiar."

Human-like motion is often achieved in robots by using data from people. But that's usually done in tightly scripted situations, based on a single person's movements. For the general passing scenarios envisioned by Yamane, a sampling of motion from at least two people would be necessary and the robot would have to access that database interactively, so it could adjust its motion as the person handing it a package progressively extended her arm.

To enable a robot to access a library of human-to-human passing motions with the speed necessary for robot-human interaction, the researchers developed a hierarchical data structure. Using principal component analysis, the researchers first developed a rough estimate of the distribution of various motion samples. They then grouped samples of similar poses and organized them into a binary-tree structure. With a series of "either/or" decisions, the robot can rapidly search this database, so it can recognize when the person initiates a handing motion and then refine its response as the person follows through.
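The hierarchical lookup described above can be sketched in a few lines of code. This is a minimal illustration, not the authors' implementation: the motion data here is random stand-in data, and the tree simply splits samples at the median of their projection onto the first principal component at each level, producing the "either/or" decisions the article describes.

```python
import numpy as np

# Hypothetical motion database: each row stands in for a flattened pose
# sample (e.g., joint angles) taken from human-human handoff recordings.
rng = np.random.default_rng(0)
samples = rng.normal(size=(256, 12))  # 256 sample poses, 12 features each

def build_tree(data, idx, leaf_size=8):
    """Recursively split samples along their first principal component,
    yielding a binary tree of either/or decisions over similar poses."""
    if len(idx) <= leaf_size:
        return {"leaf": idx}
    sub = data[idx]
    mean = sub.mean(axis=0)
    centered = sub - mean
    # First principal component via SVD (principal component analysis).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    pc = vt[0]
    proj = centered @ pc
    median = np.median(proj)
    left, right = idx[proj <= median], idx[proj > median]
    if len(left) == 0 or len(right) == 0:
        return {"leaf": idx}
    return {"pc": pc, "mean": mean, "median": median,
            "left": build_tree(data, left), "right": build_tree(data, right)}

def search(tree, data, query):
    """Descend the tree with either/or decisions, then scan the small
    leaf for the stored pose closest to the observed (query) pose."""
    node = tree
    while "leaf" not in node:
        go_left = (query - node["mean"]) @ node["pc"] <= node["median"]
        node = node["left"] if go_left else node["right"]
    leaf = node["leaf"]
    dists = np.linalg.norm(data[leaf] - query, axis=1)
    return leaf[int(np.argmin(dists))]

tree = build_tree(samples, np.arange(len(samples)))
best = search(tree, samples, samples[42])  # query with a known pose
```

Because each query needs only a handful of comparisons down the tree plus a scan of one small leaf, lookups stay fast enough to be repeated as the human's arm extends, which is what lets the robot refine its response interactively.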

The team tested their method using computer simulations and, because it is essential to include a human in the loop, with the upper body of a humanoid robot. They confirmed that the robot began moving its arm before the human's hand reached his desired passing location and that the robot's hand position roughly matched that of the human receivers from the database that it was attempting to mimic.

Yamane said further work is necessary to expand the database for a wider variety of passing motions and passing distances. As more capable hardware becomes available, the researchers hope to add finger motions and secondary behaviors that would make the robot's motion more engaging. They also plan to explore new applications for the method.


More information: Project: www.disneyresearch.com/project/objectreceivingrobots
