Massive computing effort to evaluate national hydrological models

(PhysOrg.com) -- A team of Penn State civil engineers has received one of the largest single-year allocations of supercomputing hours made for 2010.

The engineers, led by Patrick Reed, associate professor of civil engineering, recently received 6 million hours on a large-scale supercomputing system for their project titled "Massively Parallel Simulation and Evaluation of Hydrologic Monitoring, Prediction and Management Systems Under Uncertainty."

According to Reed, researchers can routinely use thousands of hours of computational time on supercomputers to solve complex problems. "Once you go up and towards a million hours, you need special access," he said.

Reed's team will utilize the Texas Advanced Computing Center's Ranger system at the University of Texas at Austin. "The six million service units represent one of the largest allocation blocks they will give," he explained. "For us to get this, we had to compete nationally."

Reed has been modeling local watersheds for a number of years, researching related aspects such as drought management and the effects of large river basins. The 6 million computing hours will allow the research team, which includes Thorsten Wagener, associate professor of civil engineering, and Reed Maxwell, an associate professor at the Colorado School of Mines, to scale their models up to the regional and national level.

"We'll be looking at the Susquehanna River basin, but in addition to that, we have collaborations with Princeton University and the where we will have models of watersheds throughout the United States," he said.

Reed stated that the effort to evaluate and advance his simulation of a national water resource model will demand a massive amount of computing power.
"Right now, there's a lot of different kinds of hydrologic models and prediction frameworks out there. The goal of our project is to go from simple models that are very local, modeling stream flow at a single point in the river, all the way up to a national water resources model of the United States."

Much of the work will involve evaluating existing models. "How good are our data sets right now, and where can we fall in that predictive continuum to get good flood forecasts or to make long-term predictions?" he asked. "You have to account for uncertainty, which means running thousands or even more simulations to statistically really understand what a model is doing or is capable of doing. That's where the supercomputing comes in."
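
As a rough illustration of what "running thousands of simulations" to account for uncertainty can look like, the sketch below runs a toy single-bucket streamflow model over a large Monte Carlo sample of one uncertain parameter and scores every run against observations with the Nash-Sutcliffe efficiency. The model form, parameter range, and acceptance threshold are assumptions made for this example, not the team's actual simulation framework.

```python
import numpy as np

rng = np.random.default_rng(42)

def linear_reservoir(rain, k, s0=0.0):
    """Toy lumped model: a single storage that drains at rate k per step."""
    storage, flow = s0, []
    for r in rain:
        storage += r
        q = k * storage
        storage -= q
        flow.append(q)
    return np.array(flow)

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <= 0 is no better than the mean."""
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Synthetic "observations" generated with a known recession rate plus noise.
rain = rng.gamma(shape=0.5, scale=5.0, size=365)               # daily rainfall, mm
obs = linear_reservoir(rain, k=0.3) + rng.normal(0, 0.5, 365)

# Monte Carlo ensemble: sample the uncertain parameter many times and score
# every realization, so the spread of skill (not a single run) describes
# what the model is capable of doing.
k_samples = rng.uniform(0.05, 0.95, size=10_000)
scores = np.array([nash_sutcliffe(linear_reservoir(rain, k), obs) for k in k_samples])

behavioural = scores > 0.7   # crude acceptance threshold for "good enough" runs
print(f"{behavioural.sum()} of {len(scores)} parameter sets are behavioural")
print(f"best NSE = {scores.max():.3f} at k = {k_samples[scores.argmax()]:.3f}")
```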

Reed said the National Weather Service became involved because the agency was seeking more accurate tools for flood forecasting.

"The National Weather Service wants more complex models in generating a flood forecast," he explained. Current flood forecasting models base their predictions on what's termed a "lumped model" ? essentially gathering all of the information involving a river, such as rainfall and evaporation, and using the average to create a flood prediction.

What the National Weather Service hopes to do is more accurately forecast flooding in different parts of a given river basin. "The move is away from a single-point time series to grids of time series where you start to distribute across space. Then that way, you're not just forecasting at a single point in a river basin, but at all the major outlets, all the major points of interest within a river basin simultaneously."
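
To contrast with the lumped case, here is a hedged sketch of the gridded idea: the same toy bucket is applied independently in every cell of a small grid, so each cell, and any group of cells forming a sub-basin outlet, gets its own forecast time series. The grid size, forcing, and parameters are invented for the example and do not represent the actual distributed models under evaluation.

```python
import numpy as np

rng = np.random.default_rng(0)
NY, NX, T = 4, 5, 5                            # a toy 4 x 5 grid of cells, 5 time steps

# Gridded forcing: every cell gets its own rainfall/evaporation time series
# instead of one basin-wide average.
rain = rng.gamma(2.0, 4.0, size=(T, NY, NX))   # mm/day per cell
evap = np.full((T, NY, NX), 2.0)               # mm/day per cell
k = np.full((NY, NX), 0.2)                     # per-cell recession coefficient
storage = np.full((NY, NX), 50.0)              # per-cell storage, mm

flows = np.empty((T, NY, NX))
for t in range(T):
    storage = np.maximum(storage + rain[t] - evap[t], 0.0)
    flows[t] = k * storage                     # runoff generated in every cell
    storage -= flows[t]

# Instead of one hydrograph at a single gauge, every cell (and every group of
# cells that drains to a sub-basin outlet) now has its own time series.
print("flow at cell (1, 2):", np.round(flows[:, 1, 2], 2))
print("aggregate at the basin outlet:", np.round(flows.sum(axis=(1, 2)), 2))
```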

He said, "They want to do a broader sweep of analysis. Every six hours they do flood forecasts and it's a tremendously challenging job. But not only do they want to do more flood forecasting, but drought as well. They want to use the model's prediction not only in stream flow, but also soil moisture."

Being able to accurately model regional and national water basins will also allow scientists to better understand the impact of climate change, Reed added.

"As climate comes into focus and as land use changes, problems are moving from the large, urbanized streams up into the watershed. Now we're becoming concerned with the ecological ramifications of change in these small headwaters," he said. "Trout in Pennsylvania would be a huge example. These are streams that are extremely temperature sensitive and species that are extremely temperature sensitive, so making some predictions or having an understanding of our predictive skills up into these smaller streams that are ungauged is difficult."

Reed said, "As land use, population growth, climate — all these variables — are changing, the classic engineering approach to water resources management of 'the past will reflect the future' is not so true anymore. We have to fundamentally understand how we make these predictions, what observations we need to improve them, then improve our science and then use that science to advance our engineered solutions. These tools will create a whole new spectrum of possibilities for design engineers and scientists."

Citation: Massive computing effort to evaluate national hydrological models (2010, November 2) retrieved 18 April 2024 from https://phys.org/news/2010-11-massive-effort-national-hydrological.html