In the eye of the beholder

A TextureCam analysis of a Mars image is able to distinguish rocks from soil. Credit: NASA/JPL/Caltech/Cornell

Astrobiologists are developing 'intelligent' instruments that could help future robotic explorers make their own decisions about where and how to collect data. Although focused on Mars exploration for the time being, the technology could benefit missions throughout the Solar System.

Researchers supported by NASA's Astrobiology Science and Technology Instrument Development (ASTID) program are designing algorithms and instruments that could help future robotic missions make their own decisions about surface sites to explore on other planets. One such instrument is the TextureCam, which is currently being tested with Mars in mind. The technology will improve the efficiency of planetary missions, allowing rovers to collect more data and perform more experiments in less time.

Costing by the second

Robotic explorers are designed to be as tough as possible, and they can survive for a long time - even in some of the most extreme conditions the Solar System has to offer. For instance, the Mars Exploration Rover (MER), Opportunity, landed on Mars in 2004 for what was scheduled to be a 3-month mission. A decade later, the robust explorer is still driving across the surface of the Red Planet and collecting valuable data.

However, Opportunity isn't the norm. Its sister rover, Spirit, also kept going and going for an impressive six years, but has been silent on Mars since March of 2010. And consider the Huygens lander, which parachuted down to the surface of Saturn's moon Titan in 2005. After spending more than seven years in transit from Earth to the saturnian moon, Huygens' trip through Titan's dense atmosphere lasted only about two and a half hours. Then there's the Soviet Venus lander, Venera 7, which was only able to send back 23 minutes of data before it was destroyed by the harsh venusian environment.

Without mechanics and engineers around to rescue them, no robotic explorer can survive on a distant planet forever.

Robotic missions take an incredible amount of time and effort to build, launch and operate. Hundreds (or sometimes thousands) of people spend years of their lives piecing missions together. This, coupled with a robot's limited lifespan, means that every second of its mission is incredibly valuable – and scientists want to get the most for their money.

On-the-clock maneuvers

Driving a rover around on another planet is an extremely complicated process. Basically, the robot takes a picture of the landscape in front of it, and then transmits the image back to Earth. Teams of scientists pore over the image looking for interesting sites where the rover can collect data.

Then, mission planners have to plot a safe route for the rover to follow, mapping every little pebble, rock or towering cliff face that might get in its way.

Commands are sent to the rover that explain exactly how it will get from point A to point B.

The rover begins to drive… and everyone on Earth holds their breath. Any mistake on the part of the scientists and mission planners, and the rover might go tumbling down a crater wall.

When the rover finishes driving, it takes another picture and sends it home so that the mission team can see whether or not the drive was successful - and then they start planning where to go next.

This process is like taking baby steps across the surface, and it eats up a lot of time. It also means that the rover can't actually travel very far each day, because each step it takes needs to be meticulously planned and translated into commands. This is compounded by the fact that messages can take up to 20 minutes to travel between the Earth and Mars (as an example), and bandwidth constraints restrict the number of messages that can be sent.
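
For a rough sense of that delay, a quick back-of-the-envelope calculation shows why planners budget tens of minutes per message. The distances below are approximate orbital extremes, not mission-specific figures:

```python
# Back-of-envelope check of the one-way light-time delay to Mars.
# Distances are approximate orbital extremes, not mission-specific values.

SPEED_OF_LIGHT_KM_S = 299_792  # km/s

closest_km = 54.6e6    # Earth-Mars distance near opposition (approx.)
farthest_km = 401e6    # near solar conjunction (approx.)

for label, distance_km in [("closest", closest_km), ("farthest", farthest_km)]:
    minutes = distance_km / SPEED_OF_LIGHT_KM_S / 60
    print(f"One-way delay at {label} approach: {minutes:.1f} minutes")

# Prints roughly 3 minutes at closest approach and 22 at farthest,
# which brackets the ~20-minute figure quoted above.
```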

"It's important to note that communications with planetary spacecraft typically happen just once or twice per day, and in the meantime they normally execute scripted command sequences," says Dr. David Thompson, principle investigator on TextureCam. "Adding rudimentary onboard autonomy lets the rover adapt its actions to new data."

When the rover is able to make some decisions on its own, or identify specific targets of interest, it can greatly speed the exploration process along.

"Roughly speaking, instead of telling the rover to "drive over the hill, turn left 90 degrees and take a picture," you might tell it to "drive over the hill and take pictures of all the rocks you see," explains Thompson.

A photo of a stromatolite (left) from Western Australia analyzed by TextureCam (right). The program assigns a color to each patch in the image according to how it matches the criteria for stromatolite rocks (red means good match, or high probability). Credit: NASA/JPL

In September of 2013, NASA's Curiosity rover, a component of the Mars Science Laboratory (MSL) mission, made its longest single-day drive up until that point, covering a total of 464 feet (141.5 meters). But even though Curiosity is the most advanced rover to touch down on Mars, it's no sprinter. A child under eight, for example, can travel farther in 20 seconds than Curiosity can in an entire day: the U.S. 200-meter sprinting record for a child under age 8 is 28.2 seconds, a pace of roughly 7 meters per second. Though the analogy is imprecise, it's safe to say that current robots are still severely limited in mobility when compared to their human counterparts.
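
The arithmetic behind that comparison is simple enough to check directly, using the figures quoted above:

```python
# Rough arithmetic behind the child-vs-rover comparison (figures from the article).
curiosity_day_m = 141.5          # Curiosity's longest single-day drive, meters
child_speed = 200 / 28.2         # U.S. under-8 record pace: ~7.09 m/s

print(child_speed * 20)          # distance the child covers in 20 seconds: ~141.8 m
# ~141.8 m in 20 seconds narrowly beats the rover's best full day of driving.
```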

"This will be particularly valuable for rover astrobiology missions involving wide-area surveys (seeking rare evidence of habitability)," says Thompson. "Here, efficiency improvements can be really enabling since they let us survey faster and visit more locations over the lifetime of the spacecraft."

Robotic Field Assistants

Autonomous techniques on past and current missions have already helped capture opportunistic scientific data that would have otherwise been missed. One example is images of dust devils on the surface of Mars that were captured by the Mars Exploration Rovers.

"Over time, spacecraft have been getting smarter and more autonomous with respect to both mobility and science data analysis," says Thompson. "This process is making them more active exploration partners, and making it more efficient to collect good quality data."

Curiosity is more autonomous than any other rover yet built for space exploration, and is able to perform some of its own navigation without commands from Earth. When engineers tell the rover where to drive, the rover itself uses software to figure out how to navigate obstacles and travel from A to B.
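
A toy example gives the flavor of this division of labor, in which Earth picks the destination and the rover picks the path. Real rover navigation software is far more sophisticated; this sketch just finds a route around obstacles on a small grid:

```python
# Toy illustration of "you pick the destination, the rover picks the path":
# breadth-first search over an obstacle grid. Actual rover autonav is far
# more sophisticated; this only shows the concept.
from collections import deque

def find_path(grid, start, goal):
    """Shortest path on a grid where 1 = obstacle, 0 = drivable terrain."""
    queue, came_from = deque([start]), {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:                     # reconstruct path back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in [(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)]:
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                              # no safe route exists

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]                           # 1s are hazards to drive around
print(find_path(grid, (0, 0), (0, 2)))       # routes around the obstacle column
```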

However, Curiosity doesn't set its own agenda. Teams of scientists are still required to examine images and select scientific targets. This means the rover has to go slowly, allowing human eyes time to examine the surroundings and look for anything of interest.

Thompson and his team at NASA's Jet Propulsion Laboratory (JPL) are working on some clever ways to further automate planetary rovers by allowing the robots to select scientifically interesting sites on their own. This involves 'smart' instruments on the rover – instruments that can 'think' for themselves.

They are currently developing an instrument called TextureCam, which can pick out geologically interesting rocks all by itself. It works by classifying pixels in an image to identify variations in the texture of rocks. When TextureCam spots something that looks interesting, it knows right away that it's okay to get a bit snap-happy with its camera.
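
For the technically curious, the sketch below illustrates the general idea of texture-based patch classification using a random forest, one common machine-learning approach. It is an illustration only, built on made-up placeholder data, and does not reproduce TextureCam's actual feature set or flight implementation:

```python
# Minimal sketch of texture-based patch classification in the spirit of
# TextureCam. Illustration only: training data below is a random stand-in
# for patches that geologists would hand-label (rock vs. soil).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

PATCH = 16  # patch size in pixels

def texture_features(patch):
    """Simple hand-picked texture statistics for one grayscale patch."""
    gy, gx = np.gradient(patch.astype(float))
    return [patch.mean(), patch.std(),             # brightness and contrast
            np.abs(gx).mean(), np.abs(gy).mean()]  # edge energy, x and y

def patches(image):
    """Yield (row, col, patch) tiles covering the image."""
    for r in range(0, image.shape[0] - PATCH + 1, PATCH):
        for c in range(0, image.shape[1] - PATCH + 1, PATCH):
            yield r, c, image[r:r + PATCH, c:c + PATCH]

X_train = np.random.rand(100, 4)            # stand-in feature vectors
y_train = np.random.randint(0, 2, 100)      # stand-in labels: 1 = rock, 0 = soil
model = RandomForestClassifier(n_estimators=50).fit(X_train, y_train)

# Onboard, each patch of a new image gets a rock probability; high-scoring
# regions could trigger extra pictures, as described above.
image = np.random.rand(128, 128)
for r, c, p in patches(image):
    rock_prob = model.predict_proba([texture_features(p)])[0][1]
    if rock_prob > 0.9:
        print(f"patch at ({r},{c}) looks rocky (p={rock_prob:.2f})")
```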

With those extra pictures sent back to Earth, scientists can immediately begin to assess whether or not the rock is a good target for extended study, rather than taking extra days out of the rover's schedule to collect additional images.

The rover then becomes a more efficient assistant for humans on Earth, giving them more time to concentrate on the science rather than the logistics of exploration.

Future Paths

The technology behind TextureCam could play a major role in astrobiology research on the surface of planets like Mars. According to Thompson, the team is working on methods for uploading the algorithms they've developed to the Mars Science Laboratory, possibly as part of an extended mission. Opportunities to demonstrate the technology could also come with NASA's planned Mars 2020 rover. But Mars isn't the only place where their work could be applied.

"We think there's value for a wide range of missions. Science autonomy is a concept that applies to any instrument," explains Thompson. "It can benefit missions whenever there are restricted bandwidth communications, or transient events that require immediate action from the spacecraft."

A martian dust devil captured in an image by the MER Spirit rover around March 10, 2005. Credit: NASA/JPL

Thompson even has some specific ideas about where this technology might be useful in our Solar System.

"The general idea of onboard science data analysis could apply to other planetary exploration scenarios," says Thompson. "It could be useful for short mission segments, such as a future Venus landing or a deep space flyby, which happen too quickly for a communications cycle with ground control."

"Speculating a bit," Thompson continued, "autonomy could also be useful for scenarios like Titan boats and balloons if they travel long distances between communications."

The algorithms that are being developed by the team could also have a number of applications in areas closer to home.

"NASA Earth science missions might benefit from this research too," explains Thompson. "For example, typically over half of the planet is covered by clouds, which complicates remote sensing by orbital satellites. One can save downlink bandwidth by excising these clouded scenes onboard, or - when the spacecraft is capable - aiming the sensor at the cloud-free areas."

This automated cloud detection could be useful in detecting weather patterns of interest to climatologists and meteorologists.

"We're currently investigating the use of onboard image analysis to recognize clouds and terrain," says Thompson. "Specifically, we're running experiments onboard the IPEX cubesat. They are the same algorithms we use for autonomous astrobiology, but they turned out to be quite useful for Earth missions as well."

Earth Technology for Space

In space science, there are many examples of how technologies developed for space cross over into our everyday lives here on Earth – from Velcro to laptop computers. TextureCam is actually a good example of how this technology cross-over can also happen in the other direction.

"It bears mention that the computer vision strategies we're using are similar to object recognition methods used by commercial sensors and robots," says Thompson.

Many industries, and even household objects, are using similar technologies to automate various processes – like manufacturing… or vacuuming our living room carpets.

"The idea of using machine learning for image analysis is not a new one!" says Thompson. "It's great that we can leverage some of these ideas for NASA planetary missions."

Instruments for Astrobiology

ASTID (Astrobiology Science and Technology Instrument Development) was an element of NASA's Astrobiology Program. ASTID provided funding support for the development of instruments used for astrobiology studies both on other worlds and on Earth. NASA's Planetary Science Division recently restructured its instrument development programs, and ASTID has now been incorporated into two new division-wide programs – dubbed PICASSO and MatISSE.

The jargon can be a bit complicated, but development of an instrument at NASA is basically broken down into a number of phases called Technology Readiness Levels (TRLs). PICASSO (Planetary Instrument Concepts for the Advancement of Solar System Observations Program) supports the early development of instrument concepts (TRLs 1-3). MatISSE (Maturation of Instruments for Solar System Exploration) supports the development of instruments that are closer to being ready for a mission (TRLs 4-6).

Provided by Astrobio.net
