Hindcasting helps scientists improve forecasts for life on Earth

June 14, 2012, University of California - Berkeley
Specimens like this 47-year-old honey bee from UC Berkeley's Essig Museum collection can help researchers understand how plant and animal populations have changed over the past 100 years. The pollen in the basket on the bee's hind legs is very robust over decades and can provide information about what plants were growing where the bees foraged. The bee's DNA can tell how insect populations have changed over the last 50 years. Credit: Essig Museum, UC Berkeley

Earth's changing environment and rapidly growing population are pushing plants and animals out of their native habitats, but current models that predict how this will affect the ecosystem are little more than educated guesses. And when the models have been tested, they've been wildly inaccurate.

A large and diverse group of scientists at the University of California, Berkeley, has launched a unique program, the Berkeley Initiative in Global Change Biology (BiGCB), to improve the accuracy of these models. The experts are employing hindcasting – "predicting" what happened during past episodes of climate change – to help them develop and test new models that will improve forecasting.

"The only way to test a model and improve forecasting is through hindcasting," said Charles Marshall, director of the University of California Museum of Paleontology and a UC Berkeley professor of integrative biology. "Once we have a tested model that accurately tells us what is likely to happen to biological systems, we can construct policies to minimize unwanted impacts."

One of the leaders of BiGCB, Marshall said that the university's large museum collections – priceless records of how animals and plants adapted to past ecological change – will allow scientists to travel back in time to study how previous periods of global change, similar to what is now occurring, affected the biosphere. Those data can then be used to test and improve current predictive models and eventually come up with forecasting tools for policy makers and scientists alike.
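Hindcasting, as Marshall describes it, amounts to calibrating a model on one slice of the historical record and scoring its "predictions" against an earlier, held-out slice. A minimal sketch of that workflow, using entirely hypothetical numbers (not BiGCB data or methods):

```python
# Sketch of hindcast validation: fit a model on recent decades, then
# "predict" earlier decades and score against the museum record.
# All data and the linear-model choice here are hypothetical.

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

# Hypothetical record: mean summer temperature (deg C) vs. observed
# upper range limit of a species (metres elevation), one per decade.
temps  = [10.0, 10.5, 11.0, 11.5, 12.0, 12.5]
limits = [1500, 1560, 1610, 1675, 1730, 1790]

# Calibrate on the recent half, hindcast the older half.
a, b = fit_linear(temps[3:], limits[3:])
hindcasts = [a + b * t for t in temps[:3]]
errors = [abs(h - obs) for h, obs in zip(hindcasts, limits[:3])]
mean_error = sum(errors) / len(errors)
print(f"mean hindcast error: {mean_error:.1f} m")
```

A small hindcast error over the held-out period is what would justify trusting the same model's forward projections.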

A recent $1.5 million grant from the W. M. Keck Foundation will fund the development of a web-based informatics portal that will provide the framework for building the next generation of predictive models, while a new $2.5 million grant from the Gordon and Betty Moore Foundation will support seven specific projects focused on global change forecasting in California.

"These datasets are pure gold," Marshall said, referring to UC Berkeley's plant, vertebrate, insect and fossil collections.

Information from the museum collections – including the geographic distribution of specimens, their DNA and even hitchhiking pollen and parasites – provides a density of data going back hundreds and, in the case of paleontological collections, millions of years.

According to BiGCB co-leader Rosemary Gillespie, current species distribution models incorporate climate model predictions of how temperature and rainfall will change and assume that an organism will move to areas that match its preferred habitat in terms of temperature, moisture, food and more.

"You might expect that, with warmer temperatures, animals will move up mountains to keep cooler, but we know that it's more complex than that: Some animals are killed off, some adapt to the new conditions, and others move upslope," said Gillespie, a UC Berkeley professor of environmental science, policy and management and director of the campus's Essig Museum of Entomology. "We are hoping to narrow down the parameters that are important in an organism's adaptation to change, and distill those into a model that will be more reliable in predicting how biota and the associated landscape are going to change."
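The habitat-matching assumption Gillespie describes can be illustrated with a toy "climate envelope": a species is assumed to persist wherever climate stays inside the range seen in its historical collection records. The species tolerances, lapse rate, and elevations below are invented for illustration; as the article notes, real species distribution models are considerably richer.

```python
# Toy climate-envelope sketch (hypothetical values throughout).
LAPSE_RATE = 6.5 / 1000.0  # deg C lost per metre of elevation gain

def envelope(temps):
    """Thermal tolerance inferred from historical occurrence sites."""
    return min(temps), max(temps)

def suitable(t, env):
    lo, hi = env
    return lo <= t <= hi

def temp_at(elevation_m, sea_level_temp):
    return sea_level_temp - LAPSE_RATE * elevation_m

# Temperatures at historical collection sites of a hypothetical species.
env = envelope([8.0, 9.5, 11.0, 12.0])

def lowest_suitable_elevation(sea_level_temp, step=50):
    """Lowest elevation (m) whose climate stays inside the envelope."""
    for elev in range(0, 4000, step):
        if suitable(temp_at(elev, sea_level_temp), env):
            return elev
    return None

today  = lowest_suitable_elevation(sea_level_temp=15.0)
warmer = lowest_suitable_elevation(sea_level_temp=17.0)
print(today, warmer)  # the suitable band shifts upslope under warming
```

Under 2 degrees of warming the toy model simply moves the suitable band upslope; the point of BiGCB's hindcasting is to test where that naive assumption breaks down, since some populations die off or adapt in place instead.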

One big difference between the BiGCB and efforts elsewhere is the initiative's focus on a specific ecosystem and every organism in that ecosystem over time in order to develop a complete history of change at that site.

The Essig Museum, for example, contains specimens of bees, together with the pollen they were carrying and their parasites, from nearly every year since 1910. New technologies make it possible to use DNA from the historical samples to see how the honey bees, plants, pollination activities and disease have changed in the past, and from that infer how they might change in the future as a result of urbanization or agricultural land conversion.

Another BiGCB project involves drilling into Northern California's Clear Lake – one of the oldest lakes in the United States – in search of pollen that will tell how vegetation changed with altered climate as far as 130,000 years into the past. This period covers the last major climate shift in North America, the retreat of glaciers 12,000 years ago. Vegetation changes will be correlated with changes in animal populations as evidenced by fossils collected from numerous caves around Clear Lake and currently held in the Museum of Paleontology.

An anthropologist who studies California's Indians will work with pollen experts to correlate human fire use over the past 13,000 years with changes in vegetation at a coastal site near Año Nuevo and the inland area near Pinnacles National Monument.

"UC Berkeley's museums have been involved in collecting species for a long time, and many people asked, 'What use was it?'" Gillespie said. "But with rapid changes now taking place in the environment, the value of that history is roaring to the forefront. With the genomics revolution, we can exploit the collections and see genetic change in action over hundreds of years."

The informatics portal, which BiGCB scientists refer to as the "Keck engine," will combine easily searched, cloud-based databases of museum collections, such as the vertebrate-focused VertNet, with an online visualization tool created at UC Berkeley called Cal-Adapt, which displays a variety of climate change scenarios in map format. The end result could project, for example, how species' ranges will shift in relation to one another, as well as to changes in snow pack, wildfire danger and temperature through the end of the century.

"We plan to overlay the collections data on Cal-Adapt data and develop a visual interface that will allow scientists to go backward in time to hindcast as well as forecast," Gillespie said.
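At its core, the overlay Gillespie describes is a join between occurrence records and climate layers keyed by place and time. A hypothetical sketch of that join (the record and climate structures below are invented, not the actual VertNet or Cal-Adapt schemas):

```python
# Hypothetical overlay of collection records on climate layers,
# in the spirit of the "Keck engine". All data below is invented.

specimens = [
    {"species": "Sorex ornatus", "site": "Clear Lake", "year": 1935},
    {"species": "Sorex ornatus", "site": "Clear Lake", "year": 2010},
]

# Per-site climate layers: hindcast values for past years, projections
# for future ones, keyed by (site, year).
climate = {
    ("Clear Lake", 1935): {"summer_temp_c": 21.1},
    ("Clear Lake", 2010): {"summer_temp_c": 22.4},
}

# Join each occurrence record with the climate at its place and time.
joined = [
    {**s, **climate[(s["site"], s["year"])]}
    for s in specimens
]
for row in joined:
    print(row["species"], row["year"], row["summer_temp_c"])
```

Running the same join against projected layers instead of hindcast ones is what lets the same interface look forward as well as backward.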

"The Keck engine is a really exciting tool, a transformation in how we approach issues of global change," Marshall said. "It should really alter the extent to which we can say if there is a link between global change and biotic change."

Jun 14, 2012
"This period covers the last major climate shift in North America, the retreat of glaciers 12,000 years ago."
I anticipate that the usual creationist trolls will pounce on this, since they "know" the world was created 4004 B.C.

But for those of us living in a reality-oriented society, this program will be very important, even if datasets are not as "sexy" as rocket science.
Jun 14, 2012
Captain hindsight is useful after all.
Jun 15, 2012
The problem is that if you have a set of data, it is conceivable to have an infinite number of different models that can be made to fit said data, so hindsight is a pretty dangerous thing to develop theories on.

Especially so if the data is full of random or chaotic fluctuations that don't follow a pattern. Looking at a limited sample easily leads you to believe that there is a pattern, so you formulate a model that repeats a pattern which doesn't actually exist.

Even if the model follows the historical data exactly, there's no telling if it will diverge outside of the dataset until you get more data and compare.
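The commenter's concern – that a model can match a historical record exactly and still diverge outside it – is the classic overfitting problem, and can be sketched with polynomial interpolation over invented data:

```python
# Overfitting sketch: a high-order polynomial reproduces every
# historical point exactly, yet extrapolates wildly. Data is invented.

def lagrange(xs, ys, x):
    """Evaluate the interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# A noisy but roughly flat "historical" series.
xs = [0, 1, 2, 3, 4, 5]
ys = [1.0, 1.2, 0.9, 1.1, 1.0, 1.15]

# The polynomial matches every past point...
assert abs(lagrange(xs, ys, 3) - 1.1) < 1e-9

# ...but just beyond the data it diverges far from the ~1.0 baseline.
future = lagrange(xs, ys, 8)
print(future)
```

This is exactly why held-out hindcasting matters: a model is only credible if it predicts data it was never fitted to.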
