Robonaut—perception in space

March 21, 2017, Texas A&M University

To keep people safe, robots are often sent where human hands cannot reach. A robot might pull victims from rubble or carry them safely to shore. These helpful hands can even reach a world far beyond our own: outer space.

Dr. Dezhen Song, a professor in the Department of Computer Science and Engineering at Texas A&M University, is working on a collaborative project with NASA's Johnson Space Center to develop localization and mapping algorithms for the humanoid robot Robonaut, so it can make better use of the crew's time and perform dangerous tasks in place of a human.

So that it can use the tools and facilities designed for human astronauts, the team built Robonaut with a human-like body, including arms and hands. Because GPS signals are unavailable aboard the International Space Station (ISS), however, the current Robonaut prototype cannot localize itself inside the station.

Most tasks Robonaut performs today are limited to its immediate vicinity. To enable further functionality, such as transporting items within the ISS or performing panel maintenance, the robot needs to move around the station. That means it must build a mental map of the regions it visits while localizing itself within that map. In robotics, this problem is known as simultaneous localization and mapping (SLAM).
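The SLAM idea described above can be sketched as a toy loop: the robot dead-reckons its pose from its own motion, then corrects both the pose and the map from landmark observations. This is an illustrative one-dimensional sketch, not the team's actual algorithm; the function name and the simple averaging update are assumptions for illustration.

```python
def slam_step(pose, landmark_map, motion, observations):
    """One toy SLAM step in 1-D: predict the pose, then refine pose and map."""
    pose = pose + motion  # predict: dead-reckon from odometry
    for landmark_id, relative_pos in observations:
        estimate = pose + relative_pos  # where the landmark appears to be
        if landmark_id in landmark_map:
            # update: average the new estimate with the stored one, and
            # nudge the pose toward consistency with the existing map
            old = landmark_map[landmark_id]
            landmark_map[landmark_id] = (old + estimate) / 2
            pose += (old - estimate) / 2
        else:
            landmark_map[landmark_id] = estimate  # map a new landmark
    return pose, landmark_map
```

Real systems replace the averaging with probabilistic filters or graph optimization, but the alternation between motion prediction and observation update is the same.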

"SLAM is part of the robot perception capability," Song said. "Our study is to try to bring better and more accurate information to the robot to facilitate its decision process so that more smart robots can be developed for different applications. If successful, we can significantly increase the robots' ability in handling different environments, which will have significant impact on manufacturing, daily life, defense and many other areas that can benefit from the increasing capability from mobile robots."

The lack of a reliable, low-cost SLAM capability has been an obstacle for many robotic applications in the past. A camera is a low-cost sensor compared to a laser range finder, but cameras are sensitive to lighting and, without a sufficient baseline, are limited in how well they can compute stereo depth information.

Because a camera measures only bearing, not absolute size or range, it has difficulty measuring distance directly.

One way to combat this is to use two or more cameras with a known baseline to provide a distance reference; this is known as stereo vision. However, the region jointly covered by the two cameras' fields of view is too limited to be directly useful. Robonaut's head is therefore swept from side to side, scanning the surroundings to enlarge the effective field of view. By reading the neck encoders, the team can track the head's scanning motion.
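As a rough sketch of the stereo principle, depth follows from the disparity between the two views and the known baseline via Z = fB/d for a rectified camera pair. The focal length and baseline values below are illustrative, not Robonaut's actual calibration.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth Z = f * B / d for a rectified stereo pair.

    focal_px:     focal length in pixels
    baseline_m:   distance between the two cameras in meters
    disparity_px: horizontal shift of a feature between the two images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A feature seen 40 px apart by cameras 7 cm apart, with f = 700 px:
z = depth_from_disparity(700.0, 0.07, 40.0)  # depth in meters
```

The formula also shows the baseline limit the article mentions: with a small baseline B, distant objects produce near-zero disparity, so small pixel errors translate into large depth errors.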

An inertial measurement unit (IMU) installed in Robonaut delivers body-movement information and helps establish correspondence between views as Robonaut moves. The primary challenge of the project lies in combining the multiple camera views with other sensors, each with different and uncertain error characteristics, to produce robust SLAM results.
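A minimal sketch of how a neck-encoder reading might be used to relate views: rotating a camera-frame point by the measured yaw expresses it in a fixed body frame, so observations taken at different head angles can be combined. The function name and frame conventions are assumptions for illustration.

```python
import math

def camera_to_body(point_cam, neck_yaw_rad):
    """Rotate an (x, y, z) camera-frame point about the vertical axis
    by the neck yaw angle, expressing it in the robot's body frame."""
    x, y, z = point_cam
    c, s = math.cos(neck_yaw_rad), math.sin(neck_yaw_rad)
    return (c * x - s * y, s * x + c * y, z)

# A landmark 1 m ahead of the camera while the head is turned 90 degrees
# maps to a point off to the side in the body frame:
p = camera_to_body((1.0, 0.0, 0.5), math.pi / 2)
```

In practice a full SLAM pipeline would fuse these transformed observations with IMU measurements, weighting each source by its uncertainty.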

Interest in these developments extends beyond the space and aeronautics industry and into one that is a bit more grounded.

"They are interested in using our motion-sensor based technology in detecting railway status for better and low-cost railway maintenance," Song said.

The project traces back to 2005, when the group developed SLAM algorithms for ground vehicles while building an autonomous motorcycle for the Defense Advanced Research Projects Agency (DARPA) Grand Challenge.

Along with their partnership with NASA, the group is also collaborating with industry contacts and Texas A&M faculty on their research for Robonaut. They are working with Dr. Tim Davis, a professor in the computer science and engineering department, to improve visual SLAM optimization algorithms using sparse matrices, and Dr. Jun Zou, an associate professor in the electrical and computer engineering department, to develop a new line of ranging and communication sensors for underwater robots.
