Cooperative robots that learn = less work for human handlers

Jun 28, 2011 By Miles O'Brien and Marsha Walton
Clifford I. Nass of Stanford University and Robin Murphy of Texas A&M University are exploring ways to make rescue robots more user-friendly by incorporating lessons learned from studies of how humans interact with technology. Rescue robots serve as a trapped disaster victim's lifeline to the outside world. But they are worthless if the victim finds them scary, bossy, out-of-control or just plain creepy. Credit: Texas A&M University

(PhysOrg.com) -- Learning a language can be difficult for some, but for babies it seems quite easy. With support from the National Science Foundation (NSF), linguist Jeffrey Heinz and mechanical engineer Bert Tanner have taken some cues from the way humans learn to put words and thoughts together, and are teaching language to robots.

This innovative collaboration began a few years ago at a meeting at the University of Delaware. The event, organized by the dean's office, brought together recently hired faculty from arts and sciences and from engineering; each gave a one-minute slide presentation about his or her research.

"That's how we became aware of what each other was doing," says Tanner. "We started discussing ideas about how we could collaborate, and the NSF project came as a result of that. Once we started seeing things aligning with each other and clicking together, we thought, 'Oh, maybe we really have something here that the world needs to know about.'"


One goal for this project is to design cooperative robots that can operate autonomously and communicate with each other in dangerous situations, such as in a fire or at a disaster site.

Robots at a building fire, for instance, could assess the situation and take the most appropriate action without a direct command from a human.

"We would like to make the robots adaptive--learn about their environment and reconfigure themselves based on the knowledge they acquire," explains Tanner.

He hopes one robot could follow another robot, watch what it was doing and infer it should be doing the same thing.

"The robots will be designed to do different tasks," says Heinz. "We have eyes that see, ears that hear and we have fingers that touch. We don’t have a 'universal sensory organ.' Likewise in the robotics world, we're not going to design a universal robot that's going to be able to do anything and everything."

Each robot will play to its talents and strengths. The robots need to be aware of their own capabilities, those of the other robots around them, and the overall goal of their mission.

"If the two robots are working together, then you can have one [ground robot] that can open the door and one [flying robot] that can fly through it," says Heinz. "In that sense, those two robots working together can do more than if they were working independently."

The researchers base their strategy for robot language on the structure of human languages.

Words and sentences in every language have complex rules and structures--do's and don'ts for what letters and sounds can be put in what order.

"So a robot's "sentence" in a sense, is just a sequence of actions that it is conducting," says Heinz. "And there will be constraints on the kinds of sequences of actions that a robot can do."

"For example," Heinz says, "In Latin, you can have Ls and Rs in words, but for the most part, you can't have two non-adjacent Ls unless you have an R in between them."

A robot's language follows a similar pattern. If a robot can do three things--move, grasp an object in its claw, and release that object--it can only do those things in certain orders. It cannot, for example, grasp an object twice in a row.

"If the robot has something in its claw, it cannot possibly hold another thing at the same time," says Tanner. "It has to lay it down before it picks up something else. We are trying to teach a these kinds of constraints, and this is where some of the techniques we find in linguistics come into play."

Heinz and Tanner want the robots to be able to answer questions about events happening around them. "Is the fire going to spread this way or that way? Is some unknown agent going to move this way or that way? These kinds of things," says Heinz.

And in designing solutions to the challenges in disaster situations, timing is crucial.

"There's always a tradeoff in how much you can compute, and how many solutions you can afford to lose," says Tanner. "For us, the priority is on being able to compute reasonable solutions quickly, rather than searching the whole possible space of solutions."

Heinz and Tanner's research is helping to prove that two heads--or in this case, two circuit boards--are better than one.
