Powerful pixels: Mapping the 'Apollo Zone'

Dec 29, 2011 by Jessica Culler
Mosaic of the near side of the moon as taken by the Clementine star trackers. The images were taken on March 15, 1994. Credit: NASA

(PhysOrg.com) -- Grayscale pixels: up close, they look like black, white or gray squares. But when you zoom out to see the bigger picture, they can form a digital photograph, like this one of our moon.

For NASA researchers, pixels are much more – they are precious data that help us understand where we came from, where we've been, and where we're going.

At NASA's Ames Research Center, Moffett Field, Calif., computer scientists have taken a giant leap forward in pulling as much information as possible from imperfect static images. With their advances in image-processing algorithms, the legacy data from the Apollo Metric Camera onboard Apollo 15, 16 and 17 can be transformed into an informative and immersive 3D mosaic map of a large and scientifically interesting part of the moon.

The "Apollo Zone" Digital Image Mosaic (DIM) and Digital Terrain Model (DTM) maps cover about 18 percent of the lunar surface at a resolution of 98 feet (30 meters) per pixel. The maps are the result of three years of work by the Intelligent Robotics Group (IRG) at NASA Ames, and are available to view through the NASA Lunar Mapping and Modeling Portal (LMMP) and Google Moon feature in Google Earth.

"The main challenge of the Apollo Zone project was that we had very old data – scans, not captured in digital format," said Ara Nefian, a senior scientist with the IRG and Carnegie Mellon University-Silicon Valley. "They were taken with the technology we had over 40 years ago with imprecise camera positions, orientations and exposure time by today’s standards."

The researchers overcame the challenge by developing new computer vision algorithms to automatically generate the 2D and 3D maps. Algorithms are step-by-step procedures, written as computer code, for carrying out a given task. For example, part of the 2D imaging pipeline aligns many images taken from various positions with various exposure times into one seamless image mosaic. In the mosaic, areas in shadow, which show up as patches of dark or black pixels, are automatically replaced by lighter gray pixels drawn from better-lit images of the same area, producing a more detailed map.
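As a rough illustration of that shadow-replacement idea (a minimal sketch, not the IRG's actual Vision Workbench implementation; the threshold and averaging scheme are assumptions chosen for the example), co-registered frames might be combined like this in Python:

    import numpy as np

    def mosaic_fill_shadows(aligned_images, shadow_threshold=0.05):
        """Combine co-registered, exposure-normalized grayscale frames
        (2-D float arrays in [0, 1], all the same shape) into one mosaic,
        preferring well-lit pixels over shadowed ones."""
        stack = np.stack(aligned_images)          # shape: (n_frames, H, W)
        # Treat very dark pixels as shadow and ignore them where possible.
        lit = np.where(stack > shadow_threshold, stack, np.nan)
        mosaic = np.nanmean(lit, axis=0)          # average the lit views
        # Fall back to the plain average where every frame was in shadow.
        fallback = stack.mean(axis=0)
        return np.where(np.isnan(mosaic), fallback, mosaic)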

Left: A normal one-camera image of the lunar surface. Right: A composite Apollo Zone image showing the best details from multiple photographs. Credit: NASA/Google Earth

"The key innovation that we made was to create a fully automatic image mosaicking and terrain modeling software system for orbital imagery," said Terry Fong, director of IRG. "We have since released this software in several open-source libraries including Ames Stereo Pipeline, Neo-Geography Toolkit and NASA Vision Workbench."

Lunar imagery of varying coverage and resolution has been available for general use for some time. In 2009, the IRG helped Google develop "Moon in Google Earth", an interactive, 3D atlas of the moon. With "Moon in Google Earth", users can explore a virtual moonscape, including imagery captured by the Apollo, Clementine and Lunar Orbiter missions.

The Apollo Zone project uses imagery recently scanned at NASA's Johnson Space Center in Houston, Texas, by a team from Arizona State University. The source images themselves are large, at 20,000 by 20,000 pixels each, and the IRG aligned and processed more than 4,000 of them. To process the maps, the team used Ames' Pleiades supercomputer.
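Assuming one byte per grayscale pixel (a detail the article does not specify), a quick calculation shows why supercomputing help was needed:

    frames = 4000                              # images aligned and processed
    pixels_per_frame = 20_000 * 20_000         # 4e8 pixels per scanned frame
    total_pixels = frames * pixels_per_frame   # 1.6e12 pixels in all
    # At an assumed one byte per pixel, roughly 1.6 TB of raw scans alone,
    # before any intermediate products from alignment and stereo matching.
    print(f"{total_pixels:.3g} pixels, ~{total_pixels / 1e12:.1f} TB")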

The initial goal of the project was to build large-scale image mosaics and terrain maps to support future lunar exploration. However, the techniques developed along the way will have long-lasting technological impacts on many targets of future exploration.

The color on this map represents the terrain elevation in the Apollo Zone mapped area. Credit: NASA/Google Earth

"The algorithms are very complex, so they don't yet necessarily apply to things like real time robotics, but they are extremely precise and accurate," said Nefian. "It's a robust technological solution to deal with insufficient data, and qualities like this make it superb for future exploration, such as a reconnaissance or mapping mission to a Near Earth Object."

Near Earth Objects, or "NEOs," are comets and asteroids that have been drawn by the gravity of nearby planets into orbits in Earth's neighborhood. NEOs are often small and irregular, which makes their paths hard to predict. With these algorithms, even imperfect imagery of a NEO could be transformed into detailed 3D maps that help researchers better understand its shape and how it might travel while in our neighborhood.

In the future, the team plans to expand the use of their algorithms to include imagery taken at angles, rather than just straight down at the surface. A technique called photoclinometry – or "shape from shading" – allows 3D terrain to be reconstructed from a single 2D image by exploiting the fact that surfaces sloping toward the sun appear brighter than surfaces sloping away from it. The team also plans to study imagery not just as pictures, but as physical models that account for all the factors that affect how the final image is formed.
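As a toy illustration of photoclinometry (a minimal 1-D sketch under a Lambertian, uniform-albedo assumption, not the team's actual method), brightness along a scan line can be turned into a height profile:

    import numpy as np

    def profile_from_shading(brightness, sun_elevation_rad, dx=1.0):
        """Recover a 1-D height profile from normalized brightness values
        in (0, 1], assuming a Lambertian surface of uniform albedo lit by
        a distant sun shining along the +x direction."""
        # For a facet tilted by slope angle s toward the sun, the incidence
        # angle measured from its normal is (90 deg - elevation - s), so the
        # observed brightness is I = sin(elevation + s), which gives
        # s = arcsin(I) - elevation.
        slope = np.arcsin(np.clip(brightness, 0.0, 1.0)) - sun_elevation_rad
        # Integrate dz/dx = tan(slope) along the scan line to get heights.
        return np.cumsum(np.tan(slope)) * dx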

"As NASA continues to build technologies that will enable future robotic and human exploration, our researchers are looking for new and clever ways to get more out of the data we capture," said Victoria Friedensen, Joint Robotic Precursor Activities manager of the Human Exploration Operations Mission Directorate at NASA Headquarters. "This technology is going to have great benefit for us as we take the next steps."


More information: lmmp.nasa.gov/



User comments: 2


omatumr
1.5 / 5 (8) Dec 29, 2011
The "dark" and "light" regions that we see on the Moon were represented in samples returned by the Apollo program.

The "dark" is finely powdered lunar dirt. It has enormous surface area and captures large quantities of elements that are implanted by the solar wind.

Analysis "dark" material first revealed large-scale mass fractionation that is common to heavy elements like xenon (element #54)and to lightweight elements like neon (element #10).

www.nature.com/na...3a0.html

The "light" highlands material - a Moon rock - is about as interesting as a piece of concrete!

May the Moon Shine on a World at Peace in 2012!
Oliver K. Manuel

Shifty0x88
5 / 5 (1) Dec 30, 2011
So I took a look at some of the source code, and WOW!, that is one amazing piece of software.

Very complex, very generic, and it even has comments!

Now I just wish I had a use for it!