
As Cities Face Climate Change, Competing Models For Flood Warnings Emerge

RESEARCH: Sep. 3, 2015
ANDREW KEATTS


[Image: flooding in Houston, Texas]

Experts say existing systems often don’t provide enough detailed information to be useful during emergencies.

For cities, today’s flood warnings often aren’t all that helpful.

Although flooding can be extremely localized, flood warnings are often too broad. A flood warning in Harris County, for instance – with an area of more than 1,700 square miles – doesn’t say much about the specific dangers faced at the neighborhood or even block level, even if that’s how the impacts are most acutely felt.

National experts are working to fix that. In fact, there are two distinct approaches to solving that problem, and both are making progress.

Both projects aim to develop much more specific forecasts. For forecasting purposes, they essentially divide the country into many small pieces, each about 1 square kilometer in area – vastly smaller than the size of a county.
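As a rough illustration of what that resolution means (the agencies' exact grids aren't described here, so the arithmetic below is only a back-of-the-envelope sketch), a county the size of Harris County would be carved into thousands of forecast cells:

```python
# Back-of-the-envelope arithmetic only: roughly how many ~1 km^2 forecast
# cells would cover a county the size of Harris County (~1,700 square miles)?
SQ_KM_PER_SQ_MILE = 2.59  # standard conversion factor

county_area_sq_miles = 1_700
county_area_sq_km = county_area_sq_miles * SQ_KM_PER_SQ_MILE

# One county-wide alert would be replaced by several thousand cell-level forecasts.
print(f"~{round(county_area_sq_km):,} one-square-kilometer cells")  # -> ~4,403 cells
```

In other words, instead of a single warning covering all of Harris County, residents could in principle see a separate forecast for each of roughly 4,400 neighborhood-sized cells.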

Flooding experts have long clamored for more precise alerts. In a May interview, Phil Bedient, director of the Center for Severe Storm Prediction, Education and Evacuation from Disasters, told Urban Edge:

“The problem is the public gets flood alerts for the whole county. That doesn’t tell them anything. It’s a bunch of numbers. Nobody understands what 6 inches of rainfall in 3 hours means. You need a simplified system that’s communicating to the public whether their area is in trouble.

People get all these county-wide alerts, and they quit paying attention. Harris County is 1,700 square miles. We’re giving people the same alerts in Tomball that we’re giving people in Clear Lake 50 miles away. I don’t want that kind of prediction.”

More data, more sites

One approach comes from the federal government: in May, the National Oceanic and Atmospheric Administration (NOAA) opened the National Water Center in Tuscaloosa, Alabama.

One of the center’s primary goals is to provide far more detailed forecasts of the location and severity of flooding threats.

Right now, NOAA attempts to predict flooding based on data collected from 4,000 distinct locations around the country, according to Ed Clark, a hydrologist from NOAA’s National Weather Service. In an area like Houston, for instance, there might be a dozen sites collecting information on things like rainfall and wind speeds that influence its models.

Under the National Weather Service’s new model, that number will increase dramatically to 2.7 million network locations. In some cases, that means folding locations into the models that weren’t included before. In others, it means adding new meters and entirely new data to the models.

“That’s a 600-fold factor in the number of locations from which we have predictive water information,” Clark said in an interview. “In an urban environment, it comes down to providing water-resource intelligence for better predictive analytics.”

But the program’s success, Clark said, will require leveraging partnerships with other agencies. The National Water Center itself will have on-site staff members from federal agencies like the U.S. Geological Survey and FEMA. The U.S. Army Corps of Engineers will also be involved.

Protecting communities

It’ll also require significant collaboration from the municipalities that will ultimately rely on the information. Cities will be able to use data from the network in one to three years, he said. But it could take about 18 years to fully realize the dramatic adjustments to forecasting the agency envisions.

“We don’t expect other entities to go at it alone,” Clark said. “We expect this to be a partnership.”

He said the new approach could soon provide a high-resolution look – meaning a more location-specific analysis – at drought conditions, just as it would with flood risks. It could similarly give the center a more specific sense of snowpack levels, and therefore of the water resources available, in a given area. That’s a particularly critical issue today as the western U.S. struggles with drought.

Eventually, all that extra data will be able to influence decision-making. “If it floods, where are the infrastructure impacts, where should roads be closed, where does FEMA need to respond?” Clark said.

A rival approach

Meanwhile, the University of Oklahoma’s Hydrometeorology and Remote Sensing Laboratory is also working with NOAA on a slightly different approach to solving the same problem.

“We are building what I guess you could call a competing idea in terms of which implementation will win,” said Zac Flamig, a graduate research assistant and doctoral candidate with the school.

The University of Oklahoma’s model involves fewer variables than what’s envisioned by the National Water Center’s approach.

Under the historical standard, flash-flood warnings work backwards: the models focus on a given area, determine how much rain needs to fall in a period of time to produce flooding, and then issue warnings based on the likelihood that will happen.

“We go about it the other way,” Flamig said. “We say, ‘How much rain is falling, now what will happen?’”

The new system is an improvement, Flamig said, because it’s based on actual data on how much precipitation is falling at a given point. Additionally, it gives forecasters a better chance of identifying when flash flooding will occur downstream from an area with heavy rainfall, rather than just in the areas where the rain is falling.
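To make the contrast concrete, the sketch below is a minimal, purely hypothetical illustration of the two directions of reasoning – it is not the Oklahoma lab’s actual model, and the cells, drainage chain, thresholds and rainfall figures are all invented. Real systems work from radar-derived rainfall grids with full hydrologic routing.

```python
# Conceptual sketch only (hypothetical numbers and logic) contrasting the two
# warning strategies described above. Each "cell" is a small patch of terrain;
# downstream_of maps a cell to the cell its runoff drains into.

# --- Old direction: start from a precomputed rain threshold per area, then
# --- warn if forecast rainfall is likely to exceed it.
flood_threshold_mm = {"A": 50, "B": 60, "C": 40}   # rain needed to flood (made up)
forecast_rain_mm   = {"A": 70, "B": 10, "C": 5}    # forecast rainfall (made up)

def threshold_warnings():
    return [cell for cell, rain in forecast_rain_mm.items()
            if rain >= flood_threshold_mm[cell]]

# --- New direction: start from rain that is actually observed falling, route
# --- the runoff downstream, and see where water piles up -- which can flag a
# --- downstream cell even if little rain fell on it directly.
observed_rain_mm    = {"A": 70, "B": 10, "C": 5}      # radar-observed rainfall (made up)
downstream_of       = {"A": "B", "B": "C", "C": None} # simple drainage chain A -> B -> C
channel_capacity_mm = {"A": 60, "B": 45, "C": 30}     # water each cell can pass (made up)

def forward_warnings():
    accumulated = dict(observed_rain_mm)
    # Push each cell's water downstream along the drainage chain.
    for cell in ["A", "B", "C"]:                      # upstream-to-downstream order
        downstream = downstream_of[cell]
        if downstream is not None:
            accumulated[downstream] += accumulated[cell]
    return [cell for cell, water in accumulated.items()
            if water > channel_capacity_mm[cell]]

print("Threshold-based warnings:", threshold_warnings())   # ['A']
print("Forward-simulation warnings:", forward_warnings())  # ['A', 'B', 'C']
```

With these made-up numbers, the threshold approach flags only the cell where heavy rain is falling, while the forward simulation also flags the two downstream cells that the runoff drains into.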

The Oklahoma center already publishes flash-flood forecasts for the entire country every 15 minutes, and officials there would like to increase that frequency to every 2 minutes. The federal system relied on today for flash-flood forecasting updates every 6 to 12 hours, Flamig said.

“The problem with flash-flood warnings right now is, you get over-warnings,” Flamig said. “There was a case in Boston a year ago where the weather service issued a flash flood warning at 3 a.m. It woke everyone up, and maybe there was going to be a flood, but it’s not going to affect anyone in high-rises, for instance. There’s a sense we can do better, to get better responses and narrow down the area you want a warning for in space and time.”

Future plans

In the near term, Flamig said it’s likely that both approaches will be implemented and serve a role, since they solve slightly different problems, though with some overlap.

Flamig speculated that his system could be up and running in just a few years, while the National Water Center may take longer to hone its models.

“Their technique requires a lot of computational problems,” Flamig said of the National Water Center.

Clark, likewise, acknowledged the enormity of the data load for the center’s new model. It would generate 4 terabytes of information per day.

“In the long term, their solution will probably win out, but that might be two decades down the road,” Flamig said.

