Berkeley Lab scientists read the cosmic writing on the wall

Mar 21, 2013
Julian Borrill's Planck analysis group with Planck CMB data. Left to right: Reijo Keskitalo, Aaron Collier, Julian Borrill and Ted Kisner. Credit: Roy Kaltschmidt, Berkeley Lab

Thanks to a supersensitive space telescope and some sophisticated supercomputing, scientists from the international Planck collaboration have made the closest reading yet of the most ancient story in our universe: the cosmic microwave background (CMB).

Today, the team released preliminary results based on the Planck observatory's first 15 months of data. Using supercomputers at the U.S. Department of Energy's (DOE) National Energy Research Scientific Computing Center (NERSC), Planck scientists have created the most detailed and accurate maps yet of the relic radiation from the big bang. The maps reveal that the universe is about 100 million years older than previously thought, with more matter and less dark energy.

"These maps are proving to be a goldmine containing stunning confirmations and new puzzles," says Martin White, a Planck scientist and physicist with University of California Berkeley and at Lawrence Berkeley National Laboratory (Berkeley Lab). "This data will form the cornerstone of our for decades to come and spur new directions in research."

Decoding the Cosmos

Written in light shortly after the big bang, the CMB is a faint glow that permeates the cosmos. Studying it can help us understand how our universe was born, as well as its nature, composition and eventual fate. "Encoded in its fluctuations are the parameters of all cosmology, numbers that describe the universe in its entirety," says Julian Borrill, a Planck collaborator and cosmologist in the Computational Research Division at Berkeley Lab.

However, CMB surveys are complex and subtle undertakings. Even with the most sophisticated detectors, scientists still need supercomputing to sift the CMB's faint signal out of a noisy universe and decode its meaning.

Hundreds of scientists from around the world study the CMB using supercomputers at NERSC, a DOE user facility based at Berkeley Lab. "NERSC supports the entire international Planck effort," says Borrill. A co-founder of the Computational Cosmology Center (C3) at the lab, Borrill has been developing supercomputing tools for CMB experiments for over a decade. The Planck observatory, a mission of the European Space Agency with significant participation from NASA, is the most challenging yet.

Parked in an artificial orbit about 800,000 miles away from Earth, Planck's 72 detectors complete a full scan of the sky once every six months or so. Observing at nine different frequencies, Planck gathers about 10,000 samples every second, or a trillion samples in total for the 15 months of data included in this first release. In fact, Planck generates so much data that, unlike earlier CMB experiments, it's impossible to analyze exactly, even with NERSC's powerful supercomputers.
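The quoted figures are easy to sanity-check with a back-of-the-envelope calculation. The sketch below uses only the numbers in the article plus one assumption made purely for illustration (the size of a single sample in bytes, which is not a Planck specification):

```python
# Back-of-the-envelope check of the figures quoted above.
SAMPLES_PER_SECOND = 10_000            # combined rate quoted in the article
MISSION_SECONDS = 15 * 30 * 24 * 3600  # roughly 15 months of scanning
BYTES_PER_SAMPLE = 4                   # assumption: one 32-bit value per sample

total_samples = SAMPLES_PER_SECOND * MISSION_SECONDS
raw_terabytes = total_samples * BYTES_PER_SAMPLE / 1e12

# ~4e11 samples with these rough inputs, the same order of magnitude
# as the "trillion samples" quoted above, and on the order of a terabyte
# of raw numbers before any processing.
print(f"samples collected: {total_samples:.1e}")
print(f"raw data volume:   {raw_terabytes:.1f} TB")
```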

Instead, CMB scientists employ clever workarounds. Approximate methods let them handle the Planck data volume, but they then need to understand the uncertainties and biases those approximations leave in the results.

One particularly challenging set of biases comes from the instrument itself. The position and orientation of the observatory in its orbit, the particular shapes and sizes of its detectors (which vary from detector to detector), and even the overlap in Planck's scanning pattern all leave their mark on the data.

To account for such biases and uncertainties, researchers generate a thousand synthetic (or simulated) copies of the Planck data and run them through the same analysis as the real data. Measuring how the approximations affect these simulated data sets allows the Planck team to account for their impact on the real data.
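A minimal sketch of that Monte Carlo debiasing idea, with a deliberately biased toy estimator standing in for the real approximate pipeline (every name and number here is illustrative, not taken from the Planck analysis), might look like this:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_sky(true_value, n_samples=1000):
    """Generate one synthetic data set whose true input value is known."""
    return true_value + rng.normal(0.0, 1.0, n_samples)

def analyze(data):
    """Toy approximate analysis: a slightly biased estimate of the mean."""
    return data.mean() * 0.98   # the 2% shrinkage mimics an approximation bias

true_value = 5.0
n_realizations = 1000           # the article mentions about a thousand synthetic copies

# Run the same analysis on every synthetic realization ...
estimates = np.array([analyze(simulate_sky(true_value))
                      for _ in range(n_realizations)])

# ... and measure how far, on average, the approximation pulls the answer.
bias = estimates.mean() - true_value
scatter = estimates.std()
print(f"measured bias: {bias:+.3f}, scatter: {scatter:.3f}")

# The bias and scatter measured on simulations can then be used to correct
# the estimate from the real data and to set its uncertainty.
real_estimate = analyze(simulate_sky(true_value))   # stand-in for the real sky
print(f"debiased estimate: {real_estimate - bias:.3f}")
```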

Growing Challenges

With each generation of NERSC supercomputers, the Planck team has adapted its software to run on more and more processors, pushing each successive system to its limits while cutting the time needed to run an ever larger set of complex calculations.

"By scaling up to tens of thousands of processors, we've reduced the time it takes to run these calculations from an impossible 1,000 years down to a few weeks," says Ted Kisner, a C3 member at Berkeley Lab and Planck scientist. In fact, the team's codes are so demanding that they're often called on to push the limits of new NERSC systems.

Access to the NERSC Global Filesystem and vast online and offline storage has also been key. "CMB data over the last 15 years have grown with Moore's Law, so we expect a two-order-of-magnitude increase in data over the coming 15 years, too," says Borrill.

In 2007 NASA and DOE negotiated a formal interagency agreement that guaranteed Planck access to NERSC for the duration of its mission. "Without the exemplary interagency cooperation between NASA and DOE, Planck would not be doing the science it's doing today," says Charles Lawrence of NASA's Jet Propulsion Laboratory (JPL). A Planck project scientist, Lawrence leads the U.S. team for NASA.

NASA's Planck Project Office is based at JPL. JPL contributed mission-enabling technology for both of Planck's science instruments. European, Canadian and U.S. Planck scientists work together to analyze the Planck data. More information is online at http://www.nasa.gov/planck and http://www.esa.int/planck.
