Students' autonomous robot project could be a lifesaver

Jul 30, 2014 by Jim Steele
Mechanical and Aerospace Engineering graduate student Sai Susheel Praneeth Kode, left, and his research assistant, rising sophomore Computer Engineering major Tevon Walker, with the robot they use for research. The box atop the machine holds an onboard computer and an Inertial Measurement Unit navigation sensor. At left is the white Ethernet cable used to link the robot to a desktop computer for task programming. The robot runs 30-40 minutes on battery power. Credit: Michael Mercier | UAH

The building is on fire, but the firefighters are unsure what's fueling it or how hazardous the situation is. They place a robot at the entrance and program in a rudimentary set of directions using a building map or even an occupant's recollection.

The robot starts into the building but discovers a discrepancy between the mapped coordinates and the actual layout. Game over? Not if the current experimental work of two University of Alabama in Huntsville (UAH) students bears fruit.

Advised by Mechanical and Aerospace Engineering assistant professor Dr. Farbod Fahimi, whose guidance they say has been invaluable, graduate student Sai Susheel Praneeth Kode and his research assistant, undergraduate Computer Engineering major Tevon Walker, are exploring ways to create an electronic "neural network" between the robot and the operator. The network would empower both by freeing the robot to perform operator-assigned tasks within certain parameters.

The neural network, as they call it, would be able to detect and act on anomalies between the path the operator has described and the actual conditions it finds. It would relay that information to the operations base via the network and adjust to complete its mission. That goal is still on the research horizon for the pair.

Walker, a rising sophomore who says he is "very fortunate" to work on such high-level robotics at UAH, says the pair can currently control the robot with a set of directions.

"Right now, we can give it the inputs and it executes it, but it doesn't know how accurate it is," Walker says. "When we add the autonomous learning control, it will allow it to automatically adjust for errors."

The robot is equipped with an onboard computer, a solid-state drive and an Inertial Measurement Unit navigation sensor. The sensor can detect its heading, pitch and roll, GPS position, velocity and acceleration rate.
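The article doesn't specify the sensor's software interface, but the quantities it lists can be pictured as one data record per reading. A minimal sketch, with entirely hypothetical field names:

```python
from dataclasses import dataclass

@dataclass
class ImuReading:
    """One snapshot from an IMU/GPS navigation sensor (hypothetical field names)."""
    heading_deg: float    # compass heading
    pitch_deg: float      # nose up/down attitude
    roll_deg: float       # side-to-side tilt
    lat: float            # GPS latitude
    lon: float            # GPS longitude
    velocity_mps: float   # ground speed, meters per second
    accel_mps2: float     # acceleration, meters per second squared

# Example reading, with made-up values near the UAH campus:
reading = ImuReading(heading_deg=90.0, pitch_deg=1.5, roll_deg=-0.3,
                     lat=34.73, lon=-86.64, velocity_mps=0.8, accel_mps2=0.05)
print(reading.heading_deg)  # → 90.0
```

A structure like this is what a path-following loop would consume on each control cycle.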

"We program the paths into it using an Ethernet cord - straight line, circle or whatever pathway we want it to take," Kode says. "Once we set it down, the GPS figures out its position and the computer and the learning algorithm execute the path."

A neural network would loosen the rigid cause-and-effect link between the operator's exact coordinates and the robot's motion, in essence untethering the robot so it can learn to move on its own within set parameters while still remaining under the operator's control.

"There's no direct mathematical correlation between the robot and the control computer in what we are trying to do," says Kode, who for his thesis is working on the mathematical computations necessary for the software needed so the robot can "decide" about discrepancies between its directions and what it actually finds. "We are doing a number of tests to determine what this relationship is. Whatever we program to this robot, we then take it outside so that we can analyze how it is doing relative to what we want. When we match the robot and the computer through the neural network, then it will be possible."

For the robot, the neural network will "bridge the gap between the input and output signals," Walker says. Using differential equations, the programming will allow the robot to "tell you what it is doing."
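Learning the relationship between commanded inputs and observed outputs can be caricatured with a single linear unit fit to logged (command, response) pairs by gradient descent. This toy stands in for the team's neural network; the data and the model are made up:

```python
# Fit response ≈ w * command + b to logged pairs with plain stochastic
# gradient descent. Here the true relationship is response = 2*cmd + 1.
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

w, b, lr = 0.0, 0.0, 0.05
for _ in range(2000):               # epochs over the logged data
    for x, y in data:
        pred = w * x + b            # model's guess at the response
        err = pred - y              # signed prediction error
        w -= lr * err * x           # gradient step on the weight
        b -= lr * err               # gradient step on the bias

print(round(w, 2), round(b, 2))     # ≈ 2.0 and 1.0 after training
```

A multi-layer network trained the same way can capture nonlinear input-output relationships, which is what lets such a controller "adjust for errors" between commanded and actual motion.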

"It tries to figure out what you are telling it," says Kode. "And since we are working on a learning mathematical model, it will work with any number of robotic configurations."

Untethering a remotely controlled robot from direct operational cause-and-effect could open new worlds when it comes to accessibility for operators and allow the robot to do its task even when faced with unforeseen obstacles, the students say. In a perfected system, the device could move out of the laboratory and the hands of specialized researchers and into the hands of rescuers like that firefighter, who could operate it with rudimentary training. Once there, it could save lives.

