Robonaut—perception in space

March 21, 2017, Texas A&M University
Credit: Texas A&M University

Robots are commonly used to reach where human hands cannot, keeping people out of harm's way. A robot might uncover victims from rubble or bring them safely to shore. These helpful hands can even reach a world far beyond our own: outer space.

Dr. Dezhen Song, a professor in the Department of Computer Science and Engineering at Texas A&M University, is working on a collaborative project with NASA's Johnson Space Center to develop localization and mapping algorithms for a robotic astronaut, Robonaut, which can free up the crew's time and perform dangerous tasks in place of a human.

To utilize the tools and facilities developed for human astronauts, the team is building a human-like robot with a similar body configuration, including arms and hands. Because GPS signals are unavailable, the current Robonaut prototype cannot localize itself inside the International Space Station (ISS).

Most tasks performed by Robonaut are limited to the vicinity of the robot. To enable further functionalities, such as transporting items in the ISS or performing panel maintenance, the robot needs to move around the station. This also means it must establish a mental map of the visited region and localize itself in the process. In the field of robotics, this is known as simultaneous localization and mapping (SLAM).
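The two halves of SLAM feed each other: the pose estimate places new observations on the map, and the map in turn constrains the pose. A deliberately minimal one-dimensional sketch (hypothetical, not the team's algorithm) shows the coupling, with the robot dead-reckoning from odometry while averaging repeated range observations of a single landmark:

```python
# Minimal 1D SLAM sketch: localize by integrating odometry,
# map by estimating a fixed landmark from pose + range readings.
def slam_1d(odometry, ranges):
    """odometry: per-step displacements; ranges: measured distance
    from the robot to one fixed landmark after each step."""
    x = 0.0                          # current pose estimate
    landmark_sum, n = 0.0, 0
    trajectory, landmark = [], None
    for dx, r in zip(odometry, ranges):
        x += dx                      # localization: integrate motion
        landmark_sum += x + r        # mapping: landmark = pose + range
        n += 1
        landmark = landmark_sum / n  # average over all observations
        trajectory.append(x)
    return trajectory, landmark

# Robot advances 1 m per step; landmark sits at 5 m, so the
# measured range shrinks by 1 m each step.
traj, lm = slam_1d([1.0, 1.0, 1.0], [4.0, 3.0, 2.0])
# traj → [1.0, 2.0, 3.0], lm → 5.0
```

Real SLAM systems replace the running average with probabilistic estimation (e.g. factor-graph optimization), since both odometry and observations are noisy, but the structure of the loop is the same.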

"SLAM is part of the robot perception capability," Song said. "Our study is to try to bring better and more accurate information to the robot to facilitate its decision process so that more smart robots can be developed for different applications. If successful, we can significantly increase the robots' ability in handling different environments, which will have significant impact on manufacturing, daily life, defense and many other areas that can benefit from the increasing capability from mobile robots."

A reliable, low-cost SLAM capability has been an obstacle for many robotic applications in the past. A camera is a low-cost sensor compared to a laser range finder, but cameras are sensitive to lighting conditions and limited by the stereo baseline when computing depth.

Because a camera measures bearing rather than distance, a single view cannot tell how far away an object is.

One way to combat this is to use two or more cameras with a known baseline to provide a distance reference; this is known as stereo vision. However, the joint coverage region where the two cameras' fields of view overlap is too limited to be directly useful. Therefore, Robonaut's head will be panned from side to side, letting it scan the surroundings and enlarge the effective field of view. By reading the neck encoders, the team can track Robonaut's head-scanning motion.
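For a rectified stereo pair, the known baseline turns pixel disparity into depth via the standard pinhole relation Z = f·B/d. The sketch below uses illustrative numbers, not Robonaut's actual calibration:

```python
# Stereo triangulation for a rectified pair (pinhole camera model).
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth Z = f * B / d, with f in pixels, B in meters,
    and d the left-right pixel disparity of the same point."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: point at infinity")
    return focal_px * baseline_m / disparity_px

# Example: 700 px focal length, 7 cm baseline, 10 px disparity.
z = stereo_depth(700.0, 0.07, 10.0)   # about 4.9 m
```

The formula also shows the baseline limit the article mentions: with a short baseline, distant points produce tiny disparities, so a one-pixel measurement error swings the depth estimate dramatically.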

There is an inertial measurement unit (IMU) installed in Robonaut that delivers body movement information. An IMU also helps establish view correspondence when Robonaut is moving. The primary challenge with this research project lies in combining the multiple camera views and other sensors with different, uncertain characteristics to provide robust SLAM results.
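One common way to combine a high-rate but drift-prone IMU with slower, noisier camera fixes is a complementary filter. The sketch below is an illustrative assumption, not the team's actual fusion algorithm:

```python
# Complementary-filter fusion of IMU dead reckoning and camera fixes.
def fuse(imu_steps, cam_positions, alpha=0.9):
    """imu_steps: position increments integrated from the IMU;
    cam_positions: absolute position fixes from the camera.
    alpha weights the IMU prediction; (1 - alpha) pulls the
    estimate toward each camera fix, bounding IMU drift."""
    est = cam_positions[0]          # initialize from the first fix
    fused = []
    for step, fix in zip(imu_steps, cam_positions):
        est = alpha * (est + step) + (1 - alpha) * fix
        fused.append(est)
    return fused
```

Production systems typically use a Kalman or factor-graph formulation instead, which weighs each sensor by its modeled uncertainty; that uncertainty modeling is exactly the "different, uncertain characteristics" challenge the project describes.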

Interest in these developments extends beyond the space and aeronautics industry and into one that is a bit more grounded.

"They are interested in using our motion-sensor based technology in detecting railway status for better and low-cost railway maintenance," Song said.

This project first came about in 2005 when the group developed SLAM algorithms for vehicles while developing an autonomous motorcycle for the Defense Advanced Research Projects Agency Grand Challenge.

Along with their partnership with NASA, the group is also collaborating with industry contacts and Texas A&M faculty on their research for Robonaut. They are working with Dr. Tim Davis, a professor in the computer science and engineering department, to improve visual SLAM optimization algorithms using sparse matrices, and Dr. Jun Zou, an associate professor in the electrical and computer engineering department, to develop a new line of ranging and communication sensors for underwater robots.
