Supercomputer Sets New Performance Record

June 23, 2006
IBM Blue Gene supercomputer

The world's fastest supercomputer, BlueGene/L, set a new performance standard on June 22, 2006. Housed at the Department of Energy's National Nuclear Security Administration (NNSA) Lawrence Livermore National Laboratory, the machine achieved a sustained performance of 207.3 trillion floating-point operations per second (teraFLOPS).

BG/L is an IBM supercomputer housed at NNSA's Lawrence Livermore National Laboratory and is ranked as the world's fastest supercomputer on the Top500 list. It is used to conduct materials science simulations for NNSA's Advanced Simulation and Computing (ASC) program, which unites the scientific computing know-how of NNSA's Los Alamos, Sandia and Lawrence Livermore national laboratories. The computer simulation capabilities developed by the ASC program provide the nuclear weapons analysis that NNSA needs to keep the nuclear weapons stockpile safe, secure and reliable without underground nuclear testing.

"This is an important step on the path to performing predictive simulations of nuclear weapons, and these simulations are vital to ensuring the safety and reliability of our nuclear weapons stockpile. These results further confirm that BlueGene/L's architecture can scale with real-world applications. The performance of the Qbox code was made possible by the partnership with our IBM collaborators, who helped to optimize the code's performance on BG/L's 131,072 processors," said Dimitri Kusnezov, head of NNSA's ASC Program.

The performance improvement over previous efforts was due in large measure to new mathematical libraries developed by software researchers at IBM that take best advantage of BG/L's dual-core architecture.
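
The general technique behind such libraries can be illustrated with a toy example. The sketch below (in Python, purely illustrative and not IBM's actual BG/L math libraries) splits a simple linear-algebra kernel, a dot product, across two worker processes, analogous to keeping both cores of a dual-core node busy instead of one.

    # Illustrative sketch only -- not IBM's BG/L math libraries.
    # Splits a dot product across two workers, analogous to using
    # both cores of a dual-core node.
    from multiprocessing import Pool

    def partial_dot(args):
        """Dot product of one half of the two input vectors."""
        x_half, y_half = args
        return sum(a * b for a, b in zip(x_half, y_half))

    def dual_core_dot(x, y):
        """Split the vectors in two and compute the halves in parallel."""
        mid = len(x) // 2
        chunks = [(x[:mid], y[:mid]), (x[mid:], y[mid:])]
        with Pool(processes=2) as pool:       # two workers ~ two cores
            return sum(pool.map(partial_dot, chunks))

    if __name__ == "__main__":
        x = [1.0] * 1_000_000
        y = [2.0] * 1_000_000
        print(dual_core_dot(x, y))            # expect 2000000.0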

"Today's results represent the first time in history that a scientific code has sustained a level of performance in excess of 200 teraFLOPS, breaking the former record also set on Blue Gene at Lawrence Livermore National Laboratory," said David Turek, vice president of Deep Computing at IBM. "Only through collaborative innovation such as through our partnership with the National Nuclear Security Administration and Lawrence Livermore National Laboratory can the boundaries of computing be pushed as far as they've been today. We will continue to work together, pushing the boundaries of insight and invention to advance our shared mission in ways never before possible."

Qbox is a first-principles molecular dynamics (FPMD) code, designed to predict the properties of metals under extreme conditions of temperature and pressure -- a longstanding goal for researchers in materials science and high energy-density physics. FPMD codes are used for complex simulations at the atomic level in a number of scientific areas, including metallurgy, solid-state physics, chemistry, biology and nanotechnology.

The "Q" in Qbox is for "quantum," a reference to the quantum mechanical descriptions of electrons that are the principal focus of this type of simulation code. The ability to accurately model changes to the electronic structure of atoms distinguishes FPMD codes from classical molecular dynamics codes.

The three-dimensional code run, studying how molybdenum (a transition metal) atoms behave under pressure, represents one of only a handful of "predictive science" simulations of this size: 1,000 molybdenum atoms. Classical molecular dynamics calculations are routinely run with billions of atoms because the interactions between atoms are relatively easy to compute, whereas quantum runs, which are far more complex but also more accurate, have until now been restricted to around 50 atoms. Moving from 50 to 1,000 atoms makes it possible to explore new classes of chemical systems using first-principles methods, including heterogeneous environments (interactions between unlike molecules) and extreme chemistry (including shocks). Such a step is important to NNSA's stockpile stewardship program, and it also has important implications for biological systems, including the study of proteins.
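
A rough sense of why that jump is so demanding: if the cost of a first-principles step is taken to grow roughly as the cube of the number of atoms, a common rule of thumb for plane-wave methods and an assumption made here only for illustration, not a figure from this announcement, then a 1,000-atom step is thousands of times more expensive than a routine 50-atom step.

    # Back-of-the-envelope estimate, assuming roughly cubic scaling of
    # first-principles cost with atom count (an illustrative assumption).
    small, large = 50, 1000
    print((large / small) ** 3)               # ~8000x more work per step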

Predictive simulations allow researchers to understand how complex physical, chemical and biological systems behave over time, where it was previously only possible to get brief snapshots at a smaller scale. This capability to do predictive science is important to NNSA's national security mission, as its researchers try to understand how the materials in nuclear weapons age, particularly for those warheads that have aged beyond their intended life. Furthermore, the performance of the Qbox code, specially designed to run on large-scale platforms such as BG/L, has implications for the broader research community and will likely enable the development of new materials of interest to many industries.

"The combination of this code and this computer, both products of a partnership between ASC and IBM, has implications for the broad research community well beyond NNSA's mission of stockpile science. Such spin-off benefits often accompany focused programmatic efforts to foster technology. This was certainly true for NASA during the years of the moon landing and is true today," said Kusnezov. "Disruptive advanced architecture work for ASC leads to low-cost, but highly useful computers that benefit the nation well beyond national security."

Source: IBM
