Robots learn how to arrange objects by 'hallucinating' humans into their environment (w/ video)

Jun 21, 2013 by Bob Yirka
Credit: Cornell University

(Phys.org) —A team of robotics engineers working in the Personal Robotics Lab at Cornell University (led by Ashutosh Saxena) has developed a new way to give robots a context-sensitive way to organize a room. Instead of providing the robot with a map of where things should go, the researchers have the robot "imagine" how a human being would use the objects in a room and then place them accordingly.

Traditional programming for robots has relied on providing clear instructions on what they are supposed to do—pick something up from one place and put it down in another, for example. To get a robot to engage in activities that require some degree of judgment, however, means giving it some way to exercise that judgment. One example would be asking a robot to enter a room, note the objects on a table, and arrange them on a desk for use by a person. Arranging objects in a way that makes sense to a human being requires some understanding of how people operate. To provide that, the Cornell team gave a test robot a means of imagining what a person would look like in the room while using a set of objects.

As an example, the researchers programmed a robot to pick up a coffee mug and a computer mouse from a table and place them on a desk in what would seem the most logical positions for a person using the desk. To do that, they gave the robot what they call "an ability to hallucinate" humans into the room—the robot's "brain" overlays images of stick-figure humans onto images of the room. Various poses are considered while the robot "imagines" how a human might make the best use of the mouse and mug. Based on this process, the robot then placed the mouse just to the right of the keyboard (because the imagined user is right-handed) and the mug a little way back—within reach, but not so close that it might get knocked over unintentionally. The approach mimics what a human would likely do given the same instructions, of course, and that is exactly the point.
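To give a flavor of the idea, here is a minimal, made-up sketch (not the Cornell team's actual model, which is described in the papers linked below): it samples hypothetical positions where a person might sit at a desk, then scores candidate spots for an object by how comfortably that imagined user could reach them. All function names, coordinates, and the preferred reach distance are illustrative assumptions.

```python
import math
import random

# Illustrative sketch only: choose where to put an object by "hallucinating"
# human positions and preferring spots within comfortable reach of them.
# Numbers and names here are assumptions for the example, not the real system.

def sample_human_poses(desk_center, n=50):
    """Sample hypothetical (x, y) positions where a person might sit at the desk."""
    cx, cy = desk_center
    return [(cx + random.gauss(0, 0.1), cy - 0.5 + random.gauss(0, 0.05))
            for _ in range(n)]

def reach_cost(pose, spot, preferred_dist):
    """Penalty for how far a spot deviates from the pose's comfortable reach."""
    return (math.dist(pose, spot) - preferred_dist) ** 2

def best_placement(candidate_spots, poses, preferred_dist=0.35):
    """Pick the candidate spot with the lowest average reach cost over all poses."""
    def expected_cost(spot):
        return sum(reach_cost(p, spot, preferred_dist) for p in poses) / len(poses)
    return min(candidate_spots, key=expected_cost)

if __name__ == "__main__":
    desk_center = (0.0, 0.0)                 # metres, in a desk-centred frame
    poses = sample_human_poses(desk_center)  # hallucinated user positions
    mug_spots = [(-0.4, 0.2), (0.1, 0.3), (0.5, 0.1)]
    print("Chosen mug spot:", best_placement(mug_spots, poses))
```

The real work models this far more richly, treating the hallucinated humans as latent variables in a conditional random field rather than scoring placements with a single hand-written cost.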

Video: Hallucinating humans for robotic scene understanding

The team will be outlining their research findings and progress at the Robotics: Science and Systems 2013 conference in Berlin next week.


More information: Project page: pr.cs.cornell.edu/hallucinatinghumans/

Research papers:
Infinite Latent Conditional Random Fields for Modeling Environments through Humans (PDF)
Hallucinated Humans as the Hidden Context for Labeling 3D Scenes (PDF)

via IEEE Spectrum


