Students' autonomous robot project could be a lifesaver

Jul 30, 2014 by Jim Steele
Mechanical and Aerospace Engineering graduate student Sai Susheel Praneeth Kode, left, and his research assistant, rising sophomore Computer Engineering major Tevon Walker, with the robot they use for research. The box atop the machine is stuffed with an onboard computer and an Inertial Measurement Unit navigation sensor. At left is the white Ethernet cable used to link the robot to a desktop computer for task programming. The robot runs 30-40 minutes on battery power. Credit: Michael Mercier | UAH

The building is on fire, but the firefighters are unsure what's fueling it or how hazardous the situation is. They place a robot at the entrance and program in a rudimentary set of directions using a building map or even an occupant's recollection.

The robot starts into the building but discovers a discrepancy between the mapped coordinates and the actual layout. Game over? Not if the current experimental work of two University of Alabama in Huntsville (UAH) students bears fruit.

Advised by Mechanical and Aerospace Engineering assistant professor Dr. Farbod Fahimi, whose guidance they say has been invaluable, graduate student Sai Susheel Praneeth Kode and his research assistant, undergraduate Computer Engineering major Tevon Walker, are exploring ways to create an electronic "neural network" between the robot and the operator. The network would empower both by freeing the robot to perform tasks assigned by the controller within certain parameters.

The envisioned system would be able to detect and act on anomalies between what the operator has told it is the path to its destination and the actual conditions it finds. It would relay that information to the operations base via the network and adjust to complete its mission. That goal is still on the research horizon for the pair.

Walker, a rising sophomore who says he is "very fortunate" to work on such high-level robotics at UAH, explains that right now the pair can control the robot with a set of directions.

"Right now, we can give it the inputs and it executes it, but it doesn't know how accurate it is," Walker says. "When we add the autonomous learning control, it will allow it to automatically adjust for errors."

The robot is equipped with an onboard computer, a solid-state drive and an Inertial Measurement Unit navigation sensor. The sensor can detect its heading, pitch and roll, GPS position, velocity and acceleration rate.
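The sensor suite described above could be represented in code along these lines; this is a hypothetical sketch, with field names and sample values that are illustrative only, not taken from the students' actual software:

```python
from dataclasses import dataclass

# Hypothetical container for one navigation-sensor reading.
# Field names are illustrative, not from the UAH team's code.
@dataclass
class NavReading:
    heading_deg: float   # compass heading
    pitch_deg: float     # nose up/down attitude
    roll_deg: float      # side-to-side attitude
    lat: float           # GPS latitude
    lon: float           # GPS longitude
    velocity_mps: float  # ground speed, meters per second
    accel_mps2: float    # acceleration rate, meters per second squared

# Example reading; the coordinates are placeholders, not measured data.
sample = NavReading(heading_deg=90.0, pitch_deg=0.5, roll_deg=-0.2,
                    lat=34.724, lon=-86.640,
                    velocity_mps=1.2, accel_mps2=0.05)
```

Packaging the heading, attitude, position, and motion fields into one record like this is a common way for an onboard computer to pass each sensor update to a control loop.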

"We program the paths into it using an Ethernet cord - straight line, circle or whatever pathway we want it to take," Kode says. "Once we set it down, the GPS figures out its position and the computer and the learning algorithm execute the path."

A neural network would delink the cause-and-effect relationship between the operator's exact coordinates and the robot, in essence untethering the robot so it can work within certain parameters and learn to move on its own while still remaining under the operator's control.

"There's no direct mathematical correlation between the robot and the control computer in what we are trying to do," says Kode, who for his thesis is working on the mathematical computations necessary for the software needed so the robot can "decide" about discrepancies between its directions and what it actually finds. "We are doing a number of tests to determine what this relationship is. Whatever we program to this robot, we then take it outside so that we can analyze how it is doing relative to what we want. When we match the robot and the computer through the neural network, then it will be possible."

For the robot, the neural network will "bridge the gap between the input and output signals," Walker says. Using differential equations, the programming will allow the robot to "tell you what it is doing."

"It tries to figure out what you are telling it," says Kode. "And since we are working on a learning mathematical model, it will work with any number of robotic configurations."

Untethering a remotely controlled robot from direct operational cause-and-effect could open new worlds when it comes to accessibility for operators and allow the robot to do its task even when faced with unforeseen obstacles, the students say. In a perfected system, the device could move out of the laboratory and the hands of specialized researchers and into the hands of rescuers like that firefighter, who could operate it with rudimentary training. Once there, it could save lives.

