Data-Taking Dress Rehearsal Proves World’s Largest Computing Grid is Ready for LHC Restart

July 1, 2009

CERN Grid Computing Center
The world’s largest computing grid has passed its most comprehensive tests to date in anticipation of the restart of the world’s most powerful particle accelerator, the Large Hadron Collider (LHC). The successful dress rehearsal proves that the Worldwide LHC Computing Grid (WLCG) is ready to analyze and manage real data from the massive machine. The United States is a vital partner in the development and operation of the WLCG, with 15 universities and three U.S. Department of Energy (DOE) national laboratories from 11 states contributing to the project.

The full-scale test, collectively called the Scale Test of the Experimental Program 2009 (STEP09), demonstrates the ability of the WLCG to efficiently move data collected from the LHC’s intense collisions at CERN, in Geneva, Switzerland, all the way through a multi-layered management process that culminates at laboratories and universities around the world. When the LHC resumes operations this fall, the WLCG will handle more than 15 million gigabytes of data every year.
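To put that volume in perspective, a back-of-the-envelope calculation (illustrative arithmetic only, not an official WLCG figure; real LHC data flow is bursty rather than uniform) shows the sustained throughput that 15 million gigabytes per year implies:

```python
# Rough scale of the WLCG data challenge: 15 million gigabytes (15 PB) per year.
# Illustrative only -- actual transfer rates vary with accelerator operations.

petabytes_per_year = 15
bytes_per_year = petabytes_per_year * 10**15
seconds_per_year = 365 * 24 * 3600

avg_throughput_mb_s = bytes_per_year / seconds_per_year / 10**6
print(f"Average sustained throughput: {avg_throughput_mb_s:.0f} MB/s")
# → Average sustained throughput: 476 MB/s
```

Averaged over a full year, that works out to nearly half a gigabyte of data moving through the grid every second.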

Although there have been several large-scale WLCG data-processing tests in the past, STEP09, which was completed on June 15, was the first to simultaneously test all of the key elements of the process.

“Unlike previous challenges, which were dedicated testing periods, STEP09 was a production activity that closely matches the types of workload that we can expect during LHC data taking. It was a demonstration not only of the readiness of experiments, sites and services but also the operations and support procedures and infrastructures,” said CERN’s Ian Bird, leader of the WLCG project.

Once LHC data have been collected at CERN, dedicated optical fiber networks distribute the data to 11 major “Tier-1” computer centers in Europe, North America and Asia, including those at DOE’s Brookhaven National Laboratory in New York and Fermi National Accelerator Laboratory in Illinois. From these, data are dispatched to more than 140 “Tier-2” centers around the world, including 12 in the United States. It will be at the Tier-2 and Tier-3 centers that physicists will analyze data from the LHC experiments (ATLAS, CMS, ALICE and LHCb), leading to new discoveries. Support for the Tier-2 and Tier-3 centers is provided by the DOE Office of Science and the National Science Foundation.
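The tiered fan-out described above can be sketched as a simple data structure. This is a minimal illustration of the Tier-0 → Tier-1 → Tier-2 chain; the center names and counts below are hypothetical placeholders, not the real WLCG topology:

```python
# Highly simplified sketch of the WLCG's tiered distribution model.
# Site lists are illustrative placeholders, not the actual grid topology.

tiers = {
    "Tier-0": ["CERN"],                    # raw data recorded and archived here
    "Tier-1": ["BNL", "FNAL"],             # two of the 11 major regional centers
    "Tier-2": ["university-A", "university-B"],  # analysis sites for physicists
}

def distribute(dataset: str) -> list[str]:
    """Trace a dataset from Tier-0 down through each tier in order."""
    hops = []
    for tier, centers in tiers.items():
        for center in centers:
            hops.append(f"{dataset} -> {tier}/{center}")
    return hops

for hop in distribute("run-0001"):
    print(hop)
```

The key design idea is hierarchy: the Tier-0 center at CERN never serves every analysis site directly; instead each tier fans data out to the one below it, spreading both network load and storage across the participating institutions.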

“In order to really prove our readiness at close-to-real-life circumstances, we have to carry out data replication, data reprocessing, data analysis, and event simulation all at the same time and all at the expected scale for data taking,” said Michael Ernst, director of Brookhaven National Laboratory’s Tier-1 Computing Center. “That’s what made STEP09 unique.”

The result was “wildly successful,” Ernst said, adding that the U.S. distributed computing facility for the ATLAS experiment completed 150,000 analysis jobs at an efficiency of 94 percent.

A key goal of the test was gauging the analysis capabilities of the Tier-2 and Tier-3 computing centers. During STEP09’s 13-day run, seven U.S. Tier-2 centers for the CMS experiment and four U.S. CMS Tier-3 centers performed around 225,000 successful analysis jobs.

“We knew from past tests that we wanted to improve certain areas," said Oliver Gutsche, the Fermilab physicist who led the effort for the CMS experiment. "This test was especially useful because we learned how the infrastructure behaves under heavy load from all four LHC experiments. We now know that we are ready for collisions."

U.S. contributions to the WLCG are coordinated through the Open Science Grid (OSG), a national computing infrastructure for science. OSG not only contributes computing power for LHC data needs, but also for projects in many other scientific fields including biology, nanotechnology, medicine and climate science.

"This is another significant step to demonstrating that shared infrastructures can be used by multiple high-throughput science communities simultaneously," said Ruth Pordes, executive director of the Open Science Grid Consortium. "ATLAS and CMS are not only proving the usability of OSG, but contributing to maturing national distributed facilities in the U.S. for other sciences."

Physicists in the U.S. and around the world will sift through the LHC data in search of tiny signals that will lead to discoveries about the nature of the physical universe. Through their distributed computing infrastructures, these physicists also help other scientific researchers increase their use of computing and storage for broader discovery.

Provided by Brookhaven National Laboratory

