Coping with floods—of water and data

December 19, 2014 by Dan Fay

Halloween 2013 brought real terror to an Austin, Texas, neighborhood, when a flash flood killed four residents and damaged roughly 1,200 homes. Following torrential rains, Onion Creek swept over its banks and inundated the surrounding community. At its peak, the rampaging water flowed with twice the force of Niagara Falls (source: USA Today).

While studying the site shortly afterwards, David Maidment, a professor of civil engineering at the University of Texas, ran into an old acquaintance, Harry Evans, chief of staff for the Austin Fire Department. Recognizing their shared interest in predicting and responding to floods, the two began collaborating on a system to bring flood forecasts and warnings down to the local level. The need was obvious: flooding claims more lives and costs the federal government more money than any other category of natural disaster. A system that can predict local floods could help flood-prone communities prepare for and perhaps even prevent catastrophic events like the Onion Creek deluge.

Soon, Maidment had pulled together other participants from academia, government, and industry to start the National Flood Interoperability Experiment (NFIE), with a goal of developing the next generation of flood forecasting for the United States. NFIE was designed to connect the National Flood Forecasting System with local emergency response and thereby create real-time flood information services.

The process of crunching data from the four federal agencies that deal with flooding (the US Geological Survey, the National Weather Service, the US Army Corps of Engineers, and the Federal Emergency Management Agency) would strain even the best-equipped physical datacenter, but not the almost limitless scalability of the cloud. Maidment submitted a successful proposal for a Microsoft Azure for Research Award, which provided the necessary storage and compute resources via Microsoft Azure, the company's cloud-computing platform.

Today, NFIE is using Microsoft Azure to perform the statistical analysis necessary to compare present and past data from flood-prone areas and thus build prediction models. By deploying an Azure-based solution, the NFIE researchers can see what's happening in real time and can collaborate from anywhere, sharing data from across the country. The system has also proved easy to learn: programmers had their computer model, RAPID (Routing Application for Parallel computation of Discharge), up and running after just two days of training on Azure. Moreover, the Azure cloud platform provides almost infinite scalability, which could be crucial as the National Weather Service increases its forecasts from 4,500 to 2.6 million locations. Of course, the greatest benefits of this Azure-based solution accrue to the public—to folks like those living along Onion Creek—whose property and lives might be spared by the timely prediction of floods.
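The core idea behind this kind of analysis—comparing present readings against historical data to flag anomalous conditions—can be sketched in a few lines. The sketch below is purely illustrative and is not NFIE's or RAPID's actual code; the gauge name, flow figures, and percentile threshold are hypothetical assumptions chosen for the example:

```python
# Illustrative sketch: flag potential flood conditions by comparing a
# gauge's current streamflow against a percentile of its historical record.
# All names and numbers here are hypothetical, not NFIE data.

def flood_threshold(history, percentile=0.95):
    """Return the streamflow value at the given percentile of the history."""
    ordered = sorted(history)
    index = min(int(percentile * len(ordered)), len(ordered) - 1)
    return ordered[index]

def check_gauges(current, historical, percentile=0.95):
    """Return gauges whose current flow exceeds their historical threshold."""
    alerts = {}
    for gauge, flow in current.items():
        threshold = flood_threshold(historical[gauge], percentile)
        if flow > threshold:
            alerts[gauge] = (flow, threshold)
    return alerts

# Hypothetical readings, in cubic feet per second.
historical = {"onion_creek": [120, 150, 180, 200, 240, 300, 350, 420, 500, 800]}
current = {"onion_creek": 950}

print(check_gauges(current, historical))  # → {'onion_creek': (950, 800)}
```

A production system would replace the simple percentile cutoff with calibrated hydrologic models and live gauge feeds, but the same compare-against-history pattern is what makes the workload scale with the number of forecast locations—and why elastic cloud compute fits it well.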

