Data-Taking Dress Rehearsal Proves World’s Largest Computing Grid is Ready for LHC Restart

July 1, 2009
CERN Grid Computing Center

The world’s largest computing grid has passed its most comprehensive tests to date in anticipation of the restart of the world’s most powerful particle accelerator, the Large Hadron Collider (LHC). The successful dress rehearsal proves that the Worldwide LHC Computing Grid (WLCG) is ready to analyze and manage real data from the massive machine. The United States is a vital partner in the development and operation of the WLCG, with 15 universities and three U.S. Department of Energy (DOE) national laboratories from 11 states contributing to the project.

The full-scale test, known as the Scale Test of the Experimental Program 2009 (STEP09), demonstrated the WLCG’s ability to efficiently move data collected from the LHC’s intense collisions at CERN, in Geneva, Switzerland, all the way through a multi-layered management process that culminates at laboratories and universities around the world. When the LHC resumes operations this fall, the WLCG will handle more than 15 million gigabytes of data every year.
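To put the annual figure in perspective, a quick back-of-envelope calculation (using only the “15 million gigabytes per year” quoted above, and assuming a perfectly steady flow) gives the sustained data rate the grid must absorb:

```python
# Sustained rate implied by ~15 million GB of LHC data per year,
# assuming the flow were spread evenly across the whole year.
ANNUAL_DATA_GB = 15_000_000          # "more than 15 million gigabytes" per year
SECONDS_PER_YEAR = 365 * 24 * 3600   # 31,536,000 seconds

gb_per_second = ANNUAL_DATA_GB / SECONDS_PER_YEAR
print(f"{gb_per_second:.3f} GB/s sustained")      # roughly 0.476 GB/s
print(f"{gb_per_second * 8:.1f} Gbit/s average")  # roughly 3.8 Gbit/s
```

In practice the rate is far from steady, so the dedicated optical fiber links out of CERN must handle bursts well above this average.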

Although there have been several large-scale WLCG data-processing tests in the past, STEP09, which was completed on June 15, was the first to simultaneously test all of the key elements of the process.

“Unlike previous challenges, which were dedicated testing periods, STEP09 was a production activity that closely matches the types of workload that we can expect during LHC data taking. It was a demonstration not only of the readiness of experiments, sites and services but also the operations and support procedures and infrastructures,” said CERN’s Ian Bird, leader of the WLCG project.

Once LHC data have been collected at CERN, dedicated optical fiber networks distribute the data to 11 major “Tier-1” computer centers in Europe, North America and Asia, including those at DOE’s Brookhaven National Laboratory in New York and Fermi National Accelerator Laboratory in Illinois. From these, data are dispatched to more than 140 “Tier-2” centers around the world, including 12 in the United States. It will be at the Tier-2 and Tier-3 centers that physicists will analyze data from the LHC experiments (ATLAS, CMS, ALICE and LHCb), leading to new discoveries. Support for the Tier-2 and Tier-3 centers is provided by the DOE Office of Science and the National Science Foundation.
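The fan-out of this tiered architecture can be sketched from the site counts given above (the averages below are illustrative only; real Tier-2 sites are not assigned evenly across Tier-1 centers):

```python
# Fan-out of the WLCG tiered distribution, using the counts in the article:
# 1 Tier-0 (CERN) -> 11 Tier-1 centers -> more than 140 Tier-2 centers.
tier1_centers = 11    # major centers in Europe, North America and Asia
tier2_centers = 140   # "more than 140" worldwide, including 12 in the U.S.

avg_fanout = tier2_centers / tier1_centers
print(f"~{avg_fanout:.1f} Tier-2 sites per Tier-1 center, on average")  # ~12.7
```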

“In order to really prove our readiness at close-to-real-life circumstances, we have to carry out data replication, data reprocessing, data analysis, and event simulation all at the same time and all at the expected scale for data taking,” said Michael Ernst, director of Brookhaven National Laboratory’s Tier-1 Computing Center. “That’s what made STEP09 unique.”

The result was “wildly successful,” Ernst said, adding that the U.S. distributed computing facility for the ATLAS experiment completed 150,000 analysis jobs at an efficiency of 94 percent.

A key goal of the test was to gauge the analysis capabilities of the Tier-2 and Tier-3 computing centers. During STEP09’s 13-day run, seven U.S. Tier-2 centers for the CMS experiment, along with four U.S. CMS Tier-3 centers, completed around 225,000 successful analysis jobs.
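The job figures quoted above imply some rough throughput numbers worth spelling out (assuming, as a reading of the ATLAS quote, that the 94 percent efficiency applies to the 150,000 submitted jobs):

```python
# Rough arithmetic on the STEP09 job figures quoted in the article.
atlas_jobs = 150_000        # ATLAS analysis jobs in the U.S. facility
atlas_efficiency = 0.94     # quoted success rate

print(round(atlas_jobs * atlas_efficiency))  # ~141,000 successful ATLAS jobs

cms_successful_jobs = 225_000  # U.S. CMS Tier-2/Tier-3 successful jobs
run_days = 13                  # length of the STEP09 run

print(round(cms_successful_jobs / run_days))  # ~17,308 CMS jobs per day
```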

“We knew from past tests that we wanted to improve certain areas,” said Oliver Gutsche, the Fermilab physicist who led the effort for the CMS experiment. “This test was especially useful because we learned how the infrastructure behaves under heavy load from all four LHC experiments. We now know that we are ready for collisions.”

U.S. contributions to the WLCG are coordinated through the Open Science Grid (OSG), a national computing infrastructure for science. OSG not only contributes computing power for LHC data needs, but also for projects in many other scientific fields including biology, nanotechnology, medicine and climate science.

“This is another significant step toward demonstrating that shared infrastructures can be used by multiple high-throughput science communities simultaneously,” said Ruth Pordes, executive director of the Open Science Grid Consortium. “ATLAS and CMS are not only proving the usability of OSG, but contributing to maturing national distributed facilities in the U.S. for other sciences.”

Physicists in the U.S. and around the world will sift through the LHC data in search of tiny signals that will lead to discoveries about the nature of the physical universe. Through their distributed computing infrastructures, these physicists also help other scientific researchers increase their use of computing and storage for broader discovery.

Provided by Brookhaven National Laboratory
