'Motion picture' of past warming paves way for snapshots of future climate change

Jul 16, 2009
Supercomputer simulations of the Earth’s most recent natural global warming (more than 14,000 years ago) show melting ice sheets and a 15-degree Celsius temperature spike over the course of a few centuries. Extended 200 years into our future, the simulations - led by University of Wisconsin-Madison climatologists - will provide insight into climate changes in our own time. Image credit: Jamison Daniel/National Center for Computational Sciences.

(PhysOrg.com) -- By accurately modeling Earth's last major global warming -- and answering pressing questions about its causes -- scientists led by a University of Wisconsin-Madison climatologist are unraveling the intricacies of the kind of abrupt climate shifts that may occur in the future.

"We want to know what will happen in the future, especially if the climate will change abruptly," says Zhengyu Liu, a UW-Madison professor of atmospheric and oceanic sciences and director of the Center for Climatic Research in the Nelson Institute for Environmental Studies. "The problem is, you don't know if your model is right for this kind of change. The important thing is validating your model."

To do so, Liu and his colleagues run their model back in time and match the results of the climate simulation with the physical evidence of past climate.

Starting with the last glacial maximum about 21,000 years ago, Liu's team simulated atmospheric and oceanic conditions through what scientists call the Bølling-Allerød warming, the Earth's last major temperature hike, which occurred about 14,500 years ago. The simulation fell in close agreement with conditions — temperatures, sea levels and glacial coverage — collected from fossil and geologic records.
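
In outline, the validation Liu describes amounts to laying the model's output alongside proxy reconstructions on a common timeline and measuring the misfit. Purely as an illustration of that idea (not the team's actual workflow), here is a minimal Python sketch; the function, the 100-year grid and the choice of metrics are all assumptions.

```python
# A minimal sketch (not the authors' code) of scoring a simulated
# temperature series against a proxy reconstruction: interpolate both
# onto a common time axis, then compute simple agreement metrics.
import numpy as np

def validation_score(sim_years, sim_temp, proxy_years, proxy_temp):
    """RMSE and correlation between simulation and proxy records.

    The 100-year grid and the metrics are illustrative assumptions,
    not the study's actual validation procedure.
    """
    # Common grid spanning the overlap of the two records.
    t0 = max(sim_years.min(), proxy_years.min())
    t1 = min(sim_years.max(), proxy_years.max())
    grid = np.arange(t0, t1, 100.0)

    sim_i = np.interp(grid, sim_years, sim_temp)
    proxy_i = np.interp(grid, proxy_years, proxy_temp)

    rmse = np.sqrt(np.mean((sim_i - proxy_i) ** 2))
    corr = np.corrcoef(sim_i, proxy_i)[0, 1]
    return rmse, corr

# Synthetic series mimicking an abrupt ~15 C warming around 14,500
# years before present (negative years = before present).
years = np.arange(-21000.0, -13000.0, 50.0)
sim = -30.0 + 15.0 / (1.0 + np.exp(-(years + 14500.0) / 100.0))
proxy = sim + np.random.default_rng(0).normal(0.0, 1.0, years.size)
print(validation_score(years, sim, years, proxy))
```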

"It's our most serious attempt to simulate this last major global warming event, and it's a validation of the model itself, as well," Liu says.

The results of the new climate modeling experiments are presented today (July 17) in the journal Science.

The group's simulations were executed on "Phoenix" and "Jaguar," a pair of Cray supercomputers at Oak Ridge National Laboratory in Oak Ridge, Tenn., and helped pin down the contributions of three environmental factors as drivers of the Bølling-Allerød warming: an increase in atmospheric carbon dioxide, the jump-start of stalled heat-moving ocean currents and a large buildup of subsurface heat in the ocean while those currents were dormant.

The climate dominoes began to fall during that period after glaciers reached their maximum coverage, blanketing most of North America, Liu explains. As glaciers melted, massive quantities of water poured into the North Atlantic, lowering the ocean salinity that helps power a major convection current that acts like a conveyor belt to carry warm tropical surface water north and cooler, heavier subsurface water south.

As a result, according to the model, ocean circulation stopped. Without warm tropical water streaming north, the North Atlantic cooled and heat backed up in southern waters. Subsequently, glacial melt slowed or stopped as well, and eventually restarted the overturning current — which had a much larger reserve of heat to haul north.
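
The chain of events described above (meltwater freshens the North Atlantic, the overturning stalls, heat pools in the south, and the conveyor later restarts) can be caricatured with a Stommel-style two-box model. The toy below is only an illustrative sketch, not the CCSM physics; every coefficient is a nondimensional guess chosen so the stall-and-restart behavior appears.

```python
# Toy Stommel-style box model (illustrative assumption, not CCSM):
# a meltwater pulse widens the salinity contrast s, the overturning q
# stalls, heat h pools while q is off, and q restarts once the pulse ends.
import numpy as np

T_DRIVE = 1.0   # fixed thermal driving of the circulation (nondimensional)
MIX = 0.05      # weak background mixing that slowly erodes the contrast

def overturning(s):
    """Flow strength from the thermal/haline contrast; can stall at zero."""
    return max(T_DRIVE - s, 0.0)

def run(forcing, s0=0.5, dt=0.1):
    s, h = s0, 0.0
    q_hist, h_hist = [], []
    for f in forcing:
        q = overturning(s)
        s += dt * (f - s * (q + MIX))    # freshwater input vs. flushing
        h += dt * (0.01 - 0.1 * q * h)   # southern heat pools while q is off
        q_hist.append(q)
        h_hist.append(h)
    return np.array(q_hist), np.array(h_hist)

# Strong meltwater pulse for the first half of the run, then shut off.
forcing = np.where(np.arange(4000) < 2000, 0.4, 0.0)
q, h = run(forcing)
print("stalled:", q.min() == 0.0, "| restarted:", q[-1] > 0.5)
print("heat pooled during the stall:", round(h[:2000].max(), 2),
      "vs. initial:", round(h[0], 3))
```

Even in this caricature, the restart exports a far larger heat reserve than was present before the stall, which is the qualitative behavior the simulations attribute to the Bølling-Allerød.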

"All that stored heat is released like a volcano, and poured out over decades," Liu explains. "That warmed up Greenland and melted (arctic) sea ice."


The model showed a 15-degree Celsius increase in average temperatures in Greenland and a 5-meter increase in sea level over just a few centuries, findings that squared neatly with the climate of the period as represented in the physical record.
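
For scale, those figures imply steep rates of change. Assuming, purely for illustration, that the transition spans roughly three centuries:

```python
# Back-of-envelope rates implied by the model output; the 300-year
# span is an assumed reading of "a few centuries", not a reported value.
span_years = 300
warming_c = 15.0      # Greenland average temperature rise
sea_level_m = 5.0     # sea-level rise

print(f"{100 * warming_c / span_years:.0f} C per century")    # ~5 C
print(f"{100 * sea_level_m / span_years:.1f} m per century")  # ~1.7 m
```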

"Being able to successfully simulate thousands of years of past climate for the first time with a comprehensive climate model is a major scientific achievement," notes Bette Otto-Bliesner, an atmospheric scientist and climate modeler at National Center for Atmospheric Research (NCAR) and co-author of the Science report. "This is an important step toward better understanding how the world's climate could change abruptly over the coming centuries with increasing melting of the ice caps."

The rate of ice melt during the Bølling-Allerød warming is still at issue, but its consequences are not, Liu says. The modelers simulated both a slow decrease in melt and a sudden end to melt run-off. In both cases, the result was a 15-degree warming.

"That happened in the past," Liu says. "The question is, in the future, if you have a and Greenland melts, will it happen again?"

Time — both actual and computing — will tell. In 2008, the group simulated about one-third of the last 21,000 years. With another 4 million processor hours to go, the simulations being conducted by the Wisconsin group will eventually run up to the present and 200 years into the future.
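
Taking the article's figures at face value gives a rough sense of the job's scale; the per-simulated-year cost below is a derived illustration, not a number the group reported.

```python
# Rough budget from the figures in the article: ~21,200 simulated years
# in total, about one-third done by 2008, 4 million processor hours left.
total_span = 21000 + 200          # years: last glacial maximum to 200 yr ahead
done = 21000 / 3                  # roughly one-third finished in 2008
remaining_cpu_hours = 4_000_000

rate = remaining_cpu_hours / (total_span - done)
print(f"~{rate:.0f} processor-hours per simulated year remaining")  # ~282
```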

Traditional climate modeling approaches were limited by computer time and capabilities, Liu explains.

"They did slides, like snapshots," Liu says. "You simulate 100 years, and then you run another 100 years, but those centuries may be 2,000 years apart (in the model). To look at abrupt change, there is no shortcut."

Using the interactions between land, water, atmosphere and ice in the Community Climate System Model developed at NCAR, the researchers have been able to create a much more detailed and closely spaced book of snapshots, "giving us more of a motion picture of the climate" over millennia, Liu said.

He stressed the importance of drawing together specialists in computing, oceanography, atmospheric science and glaciology — including John Kutzbach, a UW-Madison climate modeler, and UW-Madison doctoral student Feng He, who was responsible for modeling the glacial melt. All were key to attaining the detail necessary to recreate historical climate conditions, Liu says.

"All this data, it's from chemical proxies and bugs in the sediment," Liu said. "You really need a very interdisciplinary team: people on deep ocean, people on geology, people who know those bugs. It is a huge — and very successful — collaboration."

Source: University of Wisconsin-Madison

User comments: 8

defunctdiety
4 / 5 (9) Jul 16, 2009
"The important thing is validating your model."

As a scientist I take exception to how he worded this... The important thing is not finding some evidence to validate your model, it's finding that your model is validated by all the evidence. Two different things.

And if you're using the data your model is based on to validate it, of course you will find it validated.
LuckyBrandon
3.4 / 5 (5) Jul 16, 2009
so ok, let me get this straight now, global warming is again natural?? :)

Well I definitely agree... :)
thermodynamics
3 / 5 (6) Jul 16, 2009
defunctdiety: I have to take exception to your exception. The scientific process consists of developing a hypothesis and then attempting to falsify it. To "validate" a computer model, you set the boundary conditions, start it running and find out if the result of the simulation is near that of experimentation. This is an attempt to falsify the code (by showing it is not near reality). They are validating their model by running it against the experimentally determined fossil record. If the result was not near the experimental results, the code would have been falsified. Instead, it was close and was validated for that particular time frame and set of initial conditions.

They are not stopping there. Instead, they are going to let the code continue running to see if it is falsified along the line from then to now. If, at any point along the path, the model deviates from the measured conditions, it is back to the drawing board. If, however, it does follow the climate trajectory, it is validated for that time frame. However, that does not mean that it is valid for the next 200 years. That is where the software becomes predictive over an extent against which it cannot be validated (since there is no experimental evidence to test it against). However, if it has not been falsified over the earlier period, it has a much better chance of being a good estimate than a code that has not been validated.

Also, I don't have any idea where you get the idea that the code was "using the data your model is based on to validate it." There was nothing in the description of the code that indicated it had hard-wired parameters based on the record. In fact, the changes in salinity, density, and currents are indicative of good computational fluid dynamics, not parametric code. This seems to be the right direction for climate codes, and their validation is scientifically sound.
jonnyboy
2 / 5 (5) Jul 16, 2009
thermo, what you are neglecting to consider is how many different models can be fine-tuned to match the existing data set over the last 30 years (which is all that we really have a good data set for) or even the last 100 years (which we have an approximate data set for). Just because the "model" matches the existing data doesn't mean that it has ANY chance of matching future conditions until such time as it actually does match the future for an extended amount of time.
defunctdiety
5 / 5 (1) Jul 17, 2009
The scientific process consists of developing a hypothesis and then attempting to falsify it.


How is this any different than what I said: finding that evidence supports your model?

Also, I don't have any idea where you get the idea that the code was "using the data your model is based on to validate it." There was nothing in the description of the code that indicated it had hard-wired parameters based on the record.


Maybe I misinterpreted him, then, when he describes what they are modeling,

"Bølling-Allerød warming: an increase in atmospheric carbon dioxide, the jump-start of stalled heat-moving ocean currents and a large buildup of subsurface heat in the ocean while those currents were dormant"

Which are events derived from chemical proxies as well as anthropological, plant and animal records. Then he goes on to describe the data they are comparing it against...

"All this data, it's from chemical proxies and bugs in the sediment,"

Hmmm...

My problem was mainly with how he spoke. The article doesn't really say anything at all about how they scripted the coding, so that leaves it open to speculation, but the way he phrases their goal, it just sounded like they set out to find data that supports their model.
THEY
not rated yet Jul 17, 2009
Why only go back 21,000 years? To me, that is like the weatherman going back only two days to predict the weather tomorrow. It might work fine, but why throw out all the other valuable data? 21,000 years is a long time to humanity, but not to Mother Nature.
PinkElephant
not rated yet Jul 17, 2009
@defunctdiety,

the inputs for their simulation were the initial conditions for the Bølling-Allerød warming, as derived from empirical data. However, to validate their outputs:

The simulation fell in close agreement with conditions — temperatures, sea levels and glacial coverage — collected from fossil and geologic records.


Which, while still empirical data, is empirical data distinct from that which established the initial conditions. It's the difference between inputting the initial numbers, vs. computationally reproducing the observed time-curves of the relevant parameters. The latter is what constitutes validation of the model.
GrayMouser
not rated yet Jul 19, 2009
The scientific process consists of developing a hypothesis and then attempting to falsify it.


How is this any different than what I said: finding that evidence supports your model?

Falsifying it says that you can determine when somebody is pulling your chain.
If you can't distinguish between a well-designed experiment and faked data, you can't prove the correctness (or incorrectness) of your hypothesis.

I would add another step to thermodynamics' posting. You start with an observation and then create a hypothesis to explain your observations. After that you test your hypothesis, write it up, and let everybody else take pot-shots at it.
