A robot learns how to tidy up after you

May 22, 2012 By Bill Steele
A robot places an item in a refrigerator. (Saxena Lab)

(Phys.org) -- Sooner than you think, we may have robots to tidy up our homes.

Researchers in Cornell's Saxena Lab have trained a robot to survey a room, identify all the objects, figure out where each one belongs and put it away.

Their new algorithms -- the underlying methods a computer is programmed to follow -- for identifying and placing objects are described in the May online edition of the International Journal of Robotics Research, and some aspects of the work were presented at the International Conference on Robotics and Automation, May 14-18 in St. Paul, Minn.

Previous work has dealt with placing single objects on a flat surface, said Ashutosh Saxena, assistant professor of computer science. "Our major contribution is that we are now looking at a group of objects, and this is the first work that places objects in non-trivial places," he said.

The new algorithms allow the robot to consider the nature of an object in deciding what to do with it. "It learns not to put a shoe in the refrigerator," explained graduate student Yun Jiang. And while a shoe can be placed stably on any flat surface, it should go on the floor, not on a table.

The researchers tested placing dishes, books, clothing and toys on tables and in dish racks, refrigerators and closets. The robot was up to 98 percent successful in identifying and placing objects it had seen before. It was able to place objects it had never seen before, but success rates fell to an average of 80 percent. Ambiguously shaped objects, such as clothing and shoes, were most often misidentified.

How it looks to the robot. To place an object, the robot creates a graphic simulation of how the object will look in its final position, then plots a path to get it there. This robot made mistakes: A plate is positioned upright on a stack of plates, instead of flat, and one hanger is beyond the end of the closet rod. (Saxena Lab)

The robot begins by surveying the room with a Microsoft Kinect 3-D camera, originally made for video gaming but now widely used by robotics researchers. Many images are stitched together to create an overall view of the room, which the robot's computer divides into blocks based on discontinuities of color and shape. The robot has been shown several examples of each kind of object and learns what characteristics they have in common. For each block it computes the probability of a match with each object in its database and chooses the most likely match.
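The matching step described above can be sketched in a few lines. This is an illustrative toy, not the lab's actual method: the feature vectors, object labels and the distance-based probability are all hypothetical stand-ins for whatever learned model the robot uses.

```python
import numpy as np

# Hypothetical learned "models": one mean feature vector per object type.
# In practice the robot would learn these from many labeled examples.
OBJECT_MODELS = {
    "plate": np.array([0.9, 0.1, 0.2]),
    "shoe":  np.array([0.2, 0.8, 0.5]),
    "book":  np.array([0.5, 0.3, 0.9]),
}

def classify_block(block_features):
    """Pick the most likely object label for one segmented block.

    Scores each candidate by a softmax over negative feature distances,
    a simple stand-in for the per-object match probabilities the robot
    computes, then returns the most likely match.
    """
    labels = list(OBJECT_MODELS)
    dists = np.array([np.linalg.norm(block_features - OBJECT_MODELS[l])
                      for l in labels])
    probs = np.exp(-dists) / np.exp(-dists).sum()
    best = int(np.argmax(probs))
    return labels[best], float(probs[best])

# A block whose features sit close to the "plate" model.
label, p = classify_block(np.array([0.85, 0.15, 0.25]))
```

The key idea is only the last step: every block gets a probability against every object in the database, and the robot commits to the argmax.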

For each object the robot then examines the target area to decide on an appropriate and stable placement. Again it divides a 3-D image of the target space into small chunks and computes a series of features of each chunk, taking into account the shape of the object it's placing. The researchers train the robot for this task by feeding it graphic simulations in which placement sites are labeled as good and bad, and it builds a model of what good placement sites have in common. It chooses the chunk of space with the closest fit to that model.
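The placement step works the same way in miniature: score each chunk of the target space against a model trained on simulated placements labeled good or bad, then take the best chunk. The sketch below assumes a simple logistic model with made-up features (flatness, support area, clutter) and made-up weights; the paper's actual features and learner are not specified here.

```python
import numpy as np

# Hypothetical weights a learner might fit from simulated placements
# labeled good/bad; the features and values are illustrative only.
WEIGHTS = np.array([2.0, 1.5, -3.0])   # flatness, support area, clutter
BIAS = -1.0

def placement_score(chunk_features):
    """Logistic score in (0, 1): how closely this chunk of the target
    space matches the learned 'good placement site' model."""
    z = WEIGHTS @ np.asarray(chunk_features, dtype=float) + BIAS
    return 1.0 / (1.0 + np.exp(-z))

def best_placement(chunks):
    """Index of the chunk whose score is highest."""
    return max(range(len(chunks)), key=lambda i: placement_score(chunks[i]))

chunks = [
    [0.9, 0.8, 0.1],   # flat, well supported, uncluttered
    [0.2, 0.3, 0.9],   # tilted and cluttered
]
choice = best_placement(chunks)
```

Because the object's own shape feeds into the chunk features, the same shelf can score well for a book and poorly for a plate.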

Finally the robot creates a graphic simulation of how to move the object to its final location and carries out those movements. These are practical applications of computer graphics far removed from gaming and animating movie monsters, Saxena noted.
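A minimal stand-in for that rehearse-then-execute step is to interpolate a sequence of gripper poses from the current position to the chosen site. Real planners also check each waypoint for collisions; this toy does not.

```python
import numpy as np

def plan_straight_path(start, goal, steps=10):
    """Linearly interpolate gripper positions from start to goal: a toy
    version of the simulated motion the robot rehearses before acting.
    A real planner would also reject waypoints that collide with the scene."""
    start = np.asarray(start, dtype=float)
    goal = np.asarray(goal, dtype=float)
    ts = np.linspace(0.0, 1.0, steps)
    return [tuple((1.0 - t) * start + t * goal) for t in ts]

# Move from above the table (z = 0.5 m) down into the dish rack.
path = plan_straight_path((0.0, 0.0, 0.5), (0.3, 0.4, 0.1), steps=5)
```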

A robot with a success rate less than 100 percent would still break an occasional dish. Performance could be improved, the researchers say, with cameras that provide higher-resolution images, and by preprogramming the robot with 3-D models of the objects it is going to handle, rather than leaving it to create its own model from what it sees. The robot sees only part of a real object, Saxena explained, so a bowl could look the same as a globe. Tactile feedback from the robot's hand would also help it to know when the object is in a stable position and can be released.

In the future, Saxena says he'd like to add further "context," so the robot can respond to more subtle features of objects. For example, a computer mouse can be placed anywhere on a table, but ideally it should go beside the keyboard.

This work was supported by a Microsoft Faculty Fellowship.
