Scientists develop a more efficient way to crunch climate numbers

September 25, 2014, Pacific Northwest National Laboratory
Scientists developed a technique that shaves time off computationally expensive global climate simulations while taking full advantage of the most powerful supercomputers.

Mirroring the climate in ones and zeros takes a lot of computing power. Scientists at Pacific Northwest National Laboratory found a way to reduce that power-hungry need dramatically with a novel computational approach. By replacing a single long computer run with multiple short runs, they found a way to get more mileage out of the largest and fastest supercomputer systems and obtain climate answers hundreds of times faster. The new strategy provides equally reliable results at a fraction of the computational cost.

Like a sleek, modern sports car, a climate model has a complex computational engine running under the hood. Making that engine run as efficiently as possible is the goal in the race to simulate the climate. The computational cost of climate modeling continues to increase rapidly because of the demand for ever-higher levels of detail. Current high-resolution simulations usually take days to months to finish, even on the fastest supercomputers. The longer the simulated time span, the more robust the statistics that can be derived to separate a reliable signal from the noise inherent in the highly complex climate system. In this paper, PNNL scientists showed that such a dramatic improvement in efficiency can extend the scope and depth of detail of research investigations within a typical project lifetime.

Climate, by definition, is a statistical description of the state of the Earth's atmosphere, land and ocean over a period longer than a few months. The PNNL researchers calculated these statistics from a number of short simulations rather than from a single, multi-year simulation. Using the Community Atmosphere Model (CAM), they initialized the short simulations with different weather conditions so that the runs were independent and could be carried out simultaneously. By replacing a single long task with multiple short tasks, the researchers better exploited the most powerful supercomputer systems and answered their scientific questions much more quickly using state-of-the-art high-resolution climate models.
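The core idea lends itself to a small illustration. The sketch below is a toy Python example, not the authors' CAM code: a simple AR(1) noise process stands in for an expensive climate model, and the names (toy_model, n_members, spinup) are hypothetical. It estimates the same long-term mean two ways: from one long serial run, and from a pool of many short runs started from perturbed initial states.

```python
import numpy as np

def toy_model(x0, n_steps, rng, phi=0.9, sigma=1.0):
    """Toy stand-in for an expensive climate model: an AR(1) process.
    The real method runs full CAM simulations; this is illustration only."""
    x = np.empty(n_steps)
    x[0] = x0
    for t in range(1, n_steps):
        x[t] = phi * x[t - 1] + sigma * rng.standard_normal()
    return x

rng = np.random.default_rng(0)

# Strategy 1: one long serial run (long wall-clock time).
long_run = toy_model(x0=0.0, n_steps=100_000, rng=rng)
long_mean = long_run.mean()

# Strategy 2: many short runs from perturbed initial conditions.
# The members are independent, so on a real machine each could run on
# its own node at the same time (with its own random seed).
n_members, n_short = 100, 1_000
spinup = 200  # discard steps still influenced by the initial state
members = [toy_model(x0=rng.normal(), n_steps=n_short, rng=rng)[spinup:]
           for _ in range(n_members)]
ensemble_mean = np.concatenate(members).mean()

print(f"long run estimate:  {long_mean:+.4f}")
print(f"ensemble estimate:  {ensemble_mean:+.4f}")
```

Both estimates converge on the same statistic, but the second strategy can be spread across hundreds of compute nodes at once. That is the trade at the heart of the method: the same total amount of simulated time, delivered in a fraction of the wall-clock time.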

Currently, the research team is using the new method in a project to improve how interactions among climate processes are represented in the CAM5 model. They anticipate that the strategy will prove useful in a wide range of additional model development activities.


More information: Wan H, PJ Rasch, K Zhang, Y Qian, H Yan, and C Zhao. 2014. "Short ensembles: An efficient method for discerning climate-relevant sensitivities in atmospheric general circulation models." Geoscientific Model Development 7: 1961-1977. DOI: 10.5194/gmd-7-1961-2014.



