A toolbox to simulate the big bang and beyond

Oct 18, 2013
Credit: Ralf Kaehler and Tom Abel (visualization); John Wise and Tom Abel

The universe is a vast and mysterious place, but thanks to high-performance computing technology, scientists around the world are beginning to understand it better. They are using supercomputers to simulate how the Big Bang generated the seeds that led to the formation of galaxies such as the Milky Way.

A project involving DOE's Argonne Lab, Fermilab and Berkeley Lab will allow scientists to study this vastness in greater detail with a new cosmological simulation analysis toolbox.

Modeling the universe with a computer is very difficult, and the output of such simulations is typically very large. By anyone's standards, this is "big data," as each of these data sets can require hundreds of terabytes of storage space. Efficient storage and sharing of these huge data sets among scientists is paramount. Many different scientific analyses and processing sequences are carried out with each data set, making it impractical to rerun the simulations for each new study.

This past year Argonne Lab, Fermilab and Berkeley Lab began a unique partnership on an ambitious advanced-computing project. Together the three labs are developing a new, state-of-the-art cosmological simulation analysis toolbox that takes advantage of DOE's investments in supercomputers and specialized high-performance computing codes. Argonne's team is led by Salman Habib, principal investigator, and Ravi Madduri, system designer. Jim Kowalkowski and Richard Gerber are the team leaders at Fermilab and Berkeley Lab.

The three labs have embarked on an innovative project to develop an open platform with a web-based front end that will allow the scientific community to download, transfer, manipulate, search and record data. The system will allow scientists to upload and share applications as well as carry out complex computational analyses using the resources available to and assigned by the system.
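The article does not spell out the platform's interface, so the following is only an illustrative sketch of how a scientist might interact with such a web-based data portal from a script. The portal URL, endpoint paths, field names and authentication scheme below are hypothetical assumptions, not the project's actual API.

```python
import requests

# Hypothetical portal address -- the real service's API is not described in
# the article, so every endpoint and field below is an illustrative assumption.
PORTAL = "https://cosmo-portal.example.org/api"


def search_datasets(simulation, redshift, token):
    """Query the catalog for snapshots of a given simulation near a redshift."""
    resp = requests.get(
        f"{PORTAL}/datasets",
        params={"simulation": simulation, "redshift": redshift},
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["results"]


def download_dataset(dataset_id, dest_path, token):
    """Stream a (potentially very large) snapshot to local disk in chunks."""
    with requests.get(
        f"{PORTAL}/datasets/{dataset_id}/download",
        headers={"Authorization": f"Bearer {token}"},
        stream=True,
        timeout=30,
    ) as resp:
        resp.raise_for_status()
        with open(dest_path, "wb") as out:
            for chunk in resp.iter_content(chunk_size=1 << 20):  # 1 MiB chunks
                out.write(chunk)


if __name__ == "__main__":
    token = "YOUR_ACCESS_TOKEN"  # placeholder credential
    hits = search_datasets("example-run", redshift=0.5, token=token)
    if hits:
        download_dataset(hits[0]["id"], "snapshot_z0.5.h5", token)
```

Streaming the download in fixed-size chunks keeps memory use flat on the client side, which matters when individual snapshots run to terabytes.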

To achieve these objectives, the team uses and enhances existing high-performance computing, high-energy physics and cosmology-specific software systems. As they modify the existing software so that it can handle the large datasets of galaxy-formation simulations, team members take advantage of the expertise they have gained by working on the big data challenges posed by particle physics experiments at the Large Hadron Collider.
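The article does not name the simulation codes or file formats involved, but the basic pattern for analyzing outputs too large to fit in memory is broadly similar across tools. Below is a minimal Python sketch, assuming a hypothetical particle snapshot stored in HDF5 with an (N, 3) dataset of positions, that accumulates a coarse density grid one chunk at a time so memory use stays bounded regardless of snapshot size.

```python
import h5py
import numpy as np


def density_grid(snapshot_path, nbins=64, box_size=1000.0, chunk=5_000_000):
    """Accumulate a coarse particle-count grid from a large snapshot,
    reading the position dataset in chunks so memory use stays bounded."""
    edges = np.linspace(0.0, box_size, nbins + 1)
    grid = np.zeros((nbins, nbins, nbins), dtype=np.float64)

    with h5py.File(snapshot_path, "r") as f:
        pos = f["particles/position"]   # assumed layout: dataset of shape (N, 3)
        n = pos.shape[0]
        for start in range(0, n, chunk):
            xyz = pos[start:start + chunk]          # read one chunk from disk
            counts, _ = np.histogramdd(xyz, bins=(edges, edges, edges))
            grid += counts                          # accumulate into the grid
    return grid
```

The same chunked, out-of-core pattern extends naturally to parallel settings once the data are partitioned across nodes, which is the regime these multi-terabyte galaxy-formation outputs demand.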

This is an exciting project for the three labs. Large-scale simulations of cosmological structure formation are key discovery tools in the Cosmic Frontier program of DOE's Office of High Energy Physics. Not only will this new project provide an important toolbox for Cosmic Frontier scientists and the many institutions involved in this research, but it will also serve as a prototype for a successful big-data software project spanning many groups and communities.

The commercial world has taken notice, too. This month, Rob Roser, head of Fermilab's Scientific Computing Division, will present the project as part of his keynote speech at the Big Data Conference in Chicago.
