Astronomers use supercomputer to explore role of dark matter in galaxy formation

Jun 26, 2012
The George and Cynthia Mitchell Spectrograph is mounted on the Harlan J. Smith Telescope at McDonald Observatory. Credit: Martin Harris/McDonald Observatory

From Earth, observers use telescopes to study distant, luminous objects. But the telescope often isn't the only instrument involved. Karl Gebhardt, professor of astrophysics at The University of Texas at Austin and one of the principal investigators for the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX) project, makes discoveries about dark matter by combining deep-space observations with the powerful Lonestar supercomputer at the Texas Advanced Computing Center (TACC).

Dark matter exerts a gravitational pull on the matter in a galaxy, including the stars that orbit its center. Since dark matter neither emits nor absorbs light or other electromagnetic radiation, it cannot be seen directly with telescopes. However, through indirect evidence, scientists estimate that dark matter constitutes 83% of the matter in the universe and 23% of its total mass-energy.
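Those two percentages are mutually consistent, and the arithmetic is worth spelling out: if dark matter is 23% of the total mass-energy budget and also 83% of all matter, then matter as a whole makes up about 28% of the budget, leaving roughly 5% for ordinary matter, in line with the figures accepted at the time. A quick check:

```python
# Consistency check on the quoted fractions (WMAP-era figures).
dark_of_matter = 0.83   # dark matter as a share of all matter
dark_of_total = 0.23    # dark matter as a share of total mass-energy

all_matter = dark_of_total / dark_of_matter    # matter's share of the budget
ordinary_matter = all_matter - dark_of_total   # what's left is ordinary matter

print(f"all matter:      {all_matter:.1%} of the mass-energy budget")
print(f"ordinary matter: {ordinary_matter:.1%}")
```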

This represents a significant portion of the universe. For that reason, astronomers like Gebhardt feel compelled to learn more about dark matter, its influence on galaxies, and its effects on the structure of the cosmos.

"We believe dark matter is a new type of particle that has yet to be discovered," Gebhardt said. "In a lot of our experiments, we home in on it, even though we don't know its nature yet."

To detect dark matter, researchers collect data on the motions of stars. This data drives simulations and provides a means of distinguishing the effects of dark matter on a galaxy.

Gebhardt works with two teams, one at the McDonald Observatory, a research unit of The University of Texas at Austin, and the other at NASA. The data come from the Mitchell Spectrograph, mounted on the 2.7-meter Harlan J. Smith Telescope at McDonald Observatory, along with NASA observations. Based on the data he receives, Gebhardt builds computer models and maps representing the distribution of dark matter throughout different galaxies.

Telescopes are time-travelling devices, enabling scientists to see earlier eras of the cosmos. But astronomers can't look back far enough to view the development of the early universe directly, so theoretical models and computer simulations remain a significant element of current research.

For a long time, a discrepancy persisted between what observers measured of dark matter's distribution and what computational models predicted.

"We are trying to put that to rest by making a definitive study of how the dark matter is distributed," Gebhardt said.

Dark matter tends to lie at the edges of a galaxy, beyond its visible components. This means simulations to explore dark matter cannot be too localized and must account for an almost unfathomable number of elements. About a hundred billion galaxies can be seen from Earth, and each contains on the order of ten billion stars. So there are a lot of elements to study, Gebhardt said.

The VIRUS spectrographs are contained in the curved gray "saddlebags" on the side of the telescope. They receive light through the green cables, which contain bundles of fiber-optic lines. This illustration shows the telescope without its enclosing dome. Credit: McDonald Observatory/HETDEX Collaboration

"The large number of data sets require a huge computer program that can basically mimic a galaxy," Gebhardt said. "That's why we need a supercomputer."

In 2004, Gebhardt received his first allocation on the original Lonestar supercomputer at TACC. As TACC's computational resources have grown, Gebhardt's simulations have also continued to advance. Now, his research teams include about a dozen researchers around the world.

"Before using TACC resources, I would run the data on my computer, crunching continuously, but it would take me a month just to process the data sets of one galaxy," Gebhardt said. "Now it takes about two hours."

Using Lonestar, Gebhardt creates nearly 100,000 different models of one galaxy, representing the range of possible ways stars can move throughout a galaxy.

Observations show that stars orbit the center of a galaxy at roughly the same speed regardless of their distance from the center, a pattern known as a flat rotation curve. Those findings led to the idea that dark matter acts as an additional attracting force, pulling matter toward it.
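The flat-rotation-curve argument can be made concrete with Newtonian dynamics: a star on a circular orbit of radius r at speed v implies an enclosed mass M(r) = v²r/G. If v stays constant far beyond where the starlight fades, the implied mass keeps growing, which is the classic signature of a dark matter halo. A minimal sketch, using illustrative Milky Way-like numbers rather than anything from Gebhardt's data:

```python
# Enclosed mass implied by a flat rotation curve: M(r) = v^2 * r / G.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg
KPC = 3.086e19     # one kiloparsec, m

def enclosed_mass(v_kms, r_kpc):
    """Mass (solar masses) enclosed within radius r for circular speed v."""
    v = v_kms * 1e3    # km/s -> m/s
    r = r_kpc * KPC    # kpc -> m
    return v * v * r / G / M_SUN

# A flat 220 km/s rotation curve (roughly Milky Way-like) at growing radii:
for r_kpc in (5, 10, 20, 40):
    print(f"r = {r_kpc:2d} kpc -> M(<r) ~ {enclosed_mass(220, r_kpc):.2e} M_sun")
# The implied mass doubles each time the radius doubles, even though the
# visible starlight does not; the excess is attributed to dark matter.
```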

"We are learning a lot and are finding a different answer than what most theorists had predicted," Gebhardt said.

Through the simulations, Gebhardt has determined that dark matter is more spread out at the edges of a galaxy than earlier models suggested.

"The total amount of dark matter is the same as previously assumed, but it is fluffier [more distributed] than we thought," Gebhardt said.

Gebhardt's research process works by trying to mimic the galaxy on the computer. He then compares the simulation to reality using Mitchell Spectrograph observations of how the stars are moving. Next, he repeats the process 100,000 times with different simulations. From the whole set, he finally selects the model that best represents the data.

"The model that best mimics the data then determines the structure of the dark matter and how the stars orbit in the galaxy," Gebhardt explained.
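At heart, this "run 100,000 models and keep the best" workflow is a fit of simulated stellar kinematics against observed ones. The sketch below illustrates the idea only; the velocity profile, the parameter ranges, and the "observed" values are all invented for the example, and Gebhardt's actual orbit-based models are far more elaborate:

```python
import random

def predicted_velocity(r, halo_mass, scale):
    """Toy circular-speed profile for a galaxy with a dark halo
    (hypothetical functional form, chosen only for illustration)."""
    return (halo_mass * r / (r + scale) ** 2) ** 0.5

def chi_square(params, radii, observed, sigma):
    """How badly one model's predicted velocities miss the observations."""
    halo_mass, scale = params
    return sum(((predicted_velocity(r, halo_mass, scale) - v) / s) ** 2
               for r, v, s in zip(radii, observed, sigma))

# "Observed" stellar velocities (km/s) with errors -- fabricated demo data.
radii = [1.0, 2.0, 4.0, 8.0, 16.0]
observed = [180.0, 210.0, 220.0, 221.0, 219.0]
sigma = [10.0] * len(radii)

# Draw many random models and keep the one that best matches the data,
# mirroring the "build ~100,000 models, select the best" strategy.
random.seed(0)
best = min(
    ((random.uniform(1e5, 1e6), random.uniform(0.1, 10.0))
     for _ in range(100_000)),
    key=lambda p: chi_square(p, radii, observed, sigma),
)
print("best-fit (halo_mass, scale):", best)
```

Each candidate is scored with a chi-square statistic against the measured velocities, and the lowest score wins; the real analysis performs the same kind of comparison with full orbit libraries on Lonestar.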

Initial results of Gebhardt's research were published in the Astrophysical Journal in January 2012.

Gebhardt's studies of dark matter provide information about its fundamental properties, which may help scientists substantiate previous theories or generate new findings about the workings of the universe.

Next year, the National Science Foundation and academic partners will deploy the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), the first major experiment to search for the evolution of dark energy, the mysterious force causing the expansion of the universe to speed up over time.

Over three years, HETDEX will collect data on at least one million galaxies that are nine billion to 11 billion light-years away, yielding the largest map of the universe ever produced. The map will allow astronomers to measure how fast the universe was expanding at different times in history. The project will use the giant Hobby-Eberly Telescope at McDonald Observatory and a set of spectrographs to map the three-dimensional positions of one million galaxies. HETDEX will generate about one petabyte (one million gigabytes) of data and require a lot of computer processing cycles.

"It will be a huge amount of data," Gebhardt said. "So we will continue to be large users of TACC allocations."

Gebhardt will carry on investigating dark matter, searching for the next discovery about the matter that exists beyond the stars.

"I want to understand how the entire universe works," Gebhardt said. "And no other field but astronomy can say that its answers are out of this world."
