Record simulations conducted on Lawrence Livermore supercomputer

Mar 19, 2013 by Breanna Bishop
OSIRIS simulation on Sequoia of the interaction of a fast-ignition-scale laser with a dense DT plasma. The laser field is shown in green, the blue arrows illustrate the magnetic field lines at the plasma interface and the red/yellow spheres are the laser-accelerated electrons that will heat and ignite the fuel.

(Phys.org) — Researchers at Lawrence Livermore National Laboratory have performed record simulations using all 1,572,864 cores of Sequoia, the largest supercomputer in the world. Sequoia, based on IBM BlueGene/Q architecture, is the first machine to exceed one million computational cores. It is also No. 2 on the list of the world's fastest supercomputers, operating at 16.3 petaflops (16.3 quadrillion floating point operations per second).

The simulations are the largest particle-in-cell (PIC) code simulations by number of cores ever performed. PIC simulations are used extensively in plasma physics to model the motion of the charged particles, and the electromagnetic interactions between them, that make up ionized matter. Supercomputers such as Sequoia enable these codes to follow the simultaneous evolution of tens of billions to trillions of individual particles in highly complex systems.
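The PIC method described above can be illustrated with a minimal sketch. This is a toy one-dimensional electrostatic PIC loop, not OSIRIS (which is a relativistic, electromagnetic, massively parallel code); all parameters and the nearest-grid-point deposition scheme are simplifying assumptions chosen for brevity:

```python
import numpy as np

# Toy 1D electrostatic PIC sketch: particles on a periodic grid,
# charge deposition, FFT Poisson solve, field gather, leapfrog push.
ng = 64                 # grid cells
n_part = 10_000         # particles
L = 2 * np.pi           # periodic domain length
dx = L / ng
dt = 0.1
qm = -1.0               # electron charge/mass (normalized units)

rng = np.random.default_rng(0)
x = rng.uniform(0, L, n_part)        # positions
v = rng.normal(0, 0.1, n_part)      # velocities

def field_at_particles(x):
    # Deposit charge on the grid (nearest-grid-point for simplicity)
    idx = (x / dx).astype(int) % ng
    rho = np.bincount(idx, minlength=ng).astype(float)
    rho = rho / rho.mean() - 1.0     # neutralizing ion background
    # Solve Poisson's equation -phi'' = rho with an FFT
    k = np.fft.fftfreq(ng, d=dx) * 2 * np.pi
    k[0] = 1.0                       # placeholder; mean mode zeroed below
    phi_k = np.fft.fft(rho) / k**2
    phi_k[0] = 0.0
    E = np.real(np.fft.ifft(-1j * k * phi_k))  # E = -d(phi)/dx
    return E[idx]                    # gather field back to particles

for step in range(100):              # evolve the plasma
    v += qm * field_at_particles(x) * dt
    x = (x + v * dt) % L

print(f"mean kinetic energy: {0.5 * np.mean(v**2):.4f}")
```

Production codes like OSIRIS add relativistic particle pushes, full electromagnetic field solves, higher-order interpolation, and domain decomposition across cores, but the deposit/solve/gather/push cycle is the same.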

Frederico Fiuza, a physicist and Lawrence Fellow at LLNL, performed the simulations to study the interaction of ultra-powerful lasers with dense plasmas in a proposed method to produce fusion energy, the energy source that powers the sun, in a laboratory setting. The method, known as fast ignition, uses lasers capable of delivering more than a petawatt of power (a million billion watts) in a fraction of a billionth of a second to heat compressed deuterium and tritium (DT) fuel to temperatures exceeding the 50 million degrees Celsius needed to initiate fusion reactions and release net energy. The project is part of the U.S. Department of Energy's Office of Fusion Energy Science Program.

This method differs from the approach being taken by LLNL's National Ignition Facility (NIF) to achieve thermonuclear ignition and burn. NIF's approach is called the "central hot spot" scenario, which relies on simultaneous compression and ignition of a spherical fuel capsule in an implosion, much like in a diesel engine. Fast ignition uses the same hardware as the hot spot approach but adds a high-intensity, ultrashort-pulse laser as the "spark" that achieves ignition.

The code used in these simulations was OSIRIS, a PIC code that has been developed over more than 10 years in collaboration between the University of California, Los Angeles and Portugal's Instituto Superior Técnico. Using this code, Fiuza demonstrated excellent scaling in parallel performance of OSIRIS to the full 1.6 million cores of Sequoia. By increasing the number of cores for a relatively small problem of fixed size, what computer scientists call "strong scaling," OSIRIS obtained 75 percent efficiency on the full machine. But when the total problem size was increased, what is called "weak scaling," a 97 percent efficiency was achieved.
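The two efficiency figures quoted above follow from the standard definitions of strong- and weak-scaling efficiency. A minimal sketch (the timings below are hypothetical, chosen only so the formulas reproduce the article's 75 percent and 97 percent figures):

```python
def strong_scaling_efficiency(t_base, n_base, t_n, n):
    # Fixed total problem size: ideal runtime falls as n_base/n,
    # so efficiency compares the achieved speedup to the core ratio.
    return (t_base * n_base) / (t_n * n)

def weak_scaling_efficiency(t_base, t_n):
    # Problem size grows with core count: ideal runtime is constant.
    return t_base / t_n

# Hypothetical timings: a 4,000-core baseline run takes 100 s; on all
# 1,572,864 cores the same problem takes 0.339 s (~75% efficient),
# while a proportionally larger problem takes 103.1 s (~97% efficient).
print(strong_scaling_efficiency(100.0, 4_000, 0.339, 1_572_864))
print(weak_scaling_efficiency(100.0, 103.1))
```

Strong scaling is the harder test, since communication and synchronization costs grow relative to the shrinking per-core workload, which is why its efficiency figure is lower.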

"This means that a simulation that would take an entire year to perform on a medium-size cluster of 4,000 cores can be performed in a single day. Alternatively, problems 400 times greater in size can be simulated in the same amount of time," Fiuza said. "The combination of this unique supercomputer and this highly efficient and scalable code is allowing for transformative research."
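The quoted figures are a back-of-the-envelope consequence of the core counts; a quick check, assuming near-ideal strong scaling as the quote implies:

```python
# Arithmetic behind the "year to a day" and "400 times greater" claims.
cores_small = 4_000
cores_full = 1_572_864

core_ratio = cores_full / cores_small   # ~393, i.e. roughly 400x
days_full = 365 / core_ratio            # one year shrinks to ~0.93 days

print(f"core ratio: {core_ratio:.0f}x, runtime: {days_full:.2f} days")
```

At the measured 75 percent strong-scaling efficiency the effective speedup is about 295x, stretching the runtime to roughly 1.2 days, still in line with the "single day" figure.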

OSIRIS is being routinely used for fundamental science during Sequoia's test phase, in simulations with up to 256,000 cores. These simulations are allowing researchers, for the first time, to model the interaction of realistic fast-ignition-scale lasers with dense plasmas in three dimensions with sufficient speed to explore a large parameter space and optimize the design for ignition. Each simulation evolves the dynamics of more than 100 billion particles for more than 100,000 computational time steps. This is approximately an order of magnitude larger than the previous largest simulations of fast ignition.
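An order-of-magnitude estimate puts those numbers in perspective. Particle and step counts come from the figures above; the per-core figure assumes an even split across Sequoia's cores, which is an idealization:

```python
# Rough work estimate for one fast-ignition simulation described above.
particles = 100e9         # > 100 billion particles
steps = 100e3             # > 100,000 computational time steps
cores = 1_572_864         # all of Sequoia

pushes = particles * steps         # total particle updates: 1e16
pushes_per_core = pushes / cores   # ~6.4 billion updates per core

print(f"{pushes:.1e} total pushes, {pushes_per_core:.2e} per core")
```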

Sequoia is a National Nuclear Security Administration (NNSA) machine, developed and fielded as part of NNSA's Advanced Simulation and Computing (ASC) program. Sequoia is being prepared to move to classified computing in support of stockpile stewardship.

"This historic calculation is an impressive demonstration of the power of high-performance computing to advance our scientific understanding of complex systems," said Bill Goldstein, LLNL's deputy director for Science and Technology. "With simulations like this, we can help transform the outlook for laboratory fusion as a tool for science, energy and stewardship of the nuclear stockpile."


User comments: 15


Lurker2358
1 / 5 (6) Mar 19, 2013
Garbage in/garbage out.
Jeweller
not rated yet Mar 19, 2013
I'm sorry, but I'm going to show off my ignorance (or stupidity) again here. What does 'stewardship of the nuclear stockpile' mean ?
IronhorseA
4.2 / 5 (5) Mar 19, 2013
I'm sorry, but I'm going to show off my ignorance (or stupidity) again here. What does 'stewardship of the nuclear stockpile' mean ?

Using simulations to test if the nuclear weapons in our stockpile still work without having to set one off.
deisik
not rated yet Mar 19, 2013
What would happen to all those cores if reaction ran away?
Sean_W
1 / 5 (3) Mar 19, 2013
Deleted by author
Jeweller
not rated yet Mar 19, 2013
Oh. Thank you IronhorseA.
ValeriaT
1 / 5 (7) Mar 19, 2013
The NIF was already proven incapable to ignite the hot fusion. Such a giant waste of money... IMO the USA government is desperately trying to get the inertial fusion working for not to face competition of distributed sources of energy in form of cold fusion and magnetic motors. No centralized energy sources = no reason for centralized government. IMO they've no chance anyway, as the hot fusion is a matter of as distant future, as before fifty years. Yildiz motor trailer.
gwrede
3 / 5 (4) Mar 19, 2013
What would happen to all those cores if reaction ran away?
Nothing. Specifically, the Lawrence Livermore National Laboratory would not go up in a mushroom cloud of smoke. Those cores are not like the cores in a nuclear power plant. They are computer CPUs, just like the one in your computer.

Instead, the Sequoia computer would give out a file full of numbers, and the scientists would interpret those particular numbers as "the reaction ran away".
ValeriaT
1 / 5 (5) Mar 19, 2013
IMO the additional heating will not help the hot fusion anyway. What the hot fusion actually needs for overcoming of the Lawson criterion is the pressure. The increasing of temperature without additional pressure just makes the collisions of neutrons faster and more temporal. The collision of fast neutron with atom nuclei will result into "Newton cradle" effect instead - due the conservation of momentum another neutron will be ejected from atom nuclei at the opposite side - but the total time during which the excessive neutron will actually spend inside the atom nuclei will rather decrease during it - not increase. IMO even the hot fusion needs smarter approach, not brute force.
PhyOrgSux
1.7 / 5 (6) Mar 19, 2013
ValeriaT
Yildiz motor trailer.


Reminds me of the machine that the Australian(?) company Cycclone was building around 2005. For some reason it did not look like much came out of it although their company (one page)website is still up. Anyways I think there are a lot of false hope and scams in these.
k_m
1 / 5 (4) Mar 20, 2013
IMO the additional heating will not help the hot fusion anyway. What the hot fusion actually needs for overcoming of the Lawson criterion is the pressure. The increasing of temperature without additional pressure just makes the collisions of neutrons faster and more temporal....

P=VT does not apply at subatomic scale?
astro_optics
1 / 5 (3) Mar 20, 2013
IF the US gov is so desperate to get an alternative source of power such a "Hot Fusion", why don't they then use "Cold Fusion" and "Magneto...blah...blah" power themselves if these are such a viable options?!..
deisik
1 / 5 (1) Mar 20, 2013
Nothing. Specifically, the Lawrence Livermore National Laboratory would not go up in a mushroom cloud of smoke. Those cores are not like the cores in a nuclear power plant. They are computer CPUs, just like the one in your computer.

I just thought of something more like informational "mushroom cloud of smoke" without actual boom in the "real world"
ValeriaT
1.8 / 5 (5) Mar 20, 2013
I think there are a lot of false hope and scams in these.
Yes they undoubtedly are. In the same way, like the attempts for replications are missing. With two or three peer-reviewed publications in mainstream physics about Yildiz motor replicas it would be much easier to judge, what these technologies are actually about. But I'm pretty sure, such a replications aren't planned with mainstream physics community in foreseeable future. The opponents of Galileo knew very well, why not to look through his telescopes at the evidence of heliocentric model - and nothing actually changed with human ignorance from this time. It's principles and mechanisms didn't change from the medieval times.
gwrede
1 / 5 (2) Mar 22, 2013
I just thought of something more like informational "mushroom cloud of smoke" without actual boom in the "real world"
The simulations done for 'stewardship of the nuclear stockpile' are actually supposed to 'run away'. If they don't, then it means the nuclear weapon they were simulating, has become a dud, and should be scrapped from the arsenal.
