

Title: Urban Flood Modeling: Uncertainty Quantification and Physics‐Informed Gaussian Processes Regression Forecasting
Abstract: Estimating uncertainty in flood model predictions is important for many applications, including risk assessment and flood forecasting. We focus on uncertainty in physics‐based urban flooding models. We consider the effects of the model's complexity and uncertainty in key input parameters. The effect of rainfall intensity on the uncertainty in water depth predictions is also studied. As a test study, we choose the Interconnected Channel and Pond Routing (ICPR) model of a part of the city of Minneapolis. The uncertainty in the ICPR model's predictions of the floodwater depth is quantified in terms of the ensemble variance using the multilevel Monte Carlo (MC) simulation method. Our results show that uncertainties in the studied domain are highly localized. Model simplifications, such as disregarding the groundwater flow, lead to overly confident predictions, that is, predictions that are both less accurate and less uncertain than those of the more complex model. We find that for the same number of uncertain parameters, increasing the model resolution reduces uncertainty in the model predictions (and increases the MC method's computational cost). We employ the multilevel MC method to reduce the cost of estimating uncertainty in a high‐resolution ICPR model. Finally, we use the ensemble estimates of the mean and covariance of the flood depth for real‐time flood depth forecasting using the physics‐informed Gaussian process regression method. We show that even with few measurements, the proposed framework results in a more accurate forecast than that provided by the mean prediction of the ICPR model.
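The forecasting step described in the abstract, conditioning an ensemble-derived mean and covariance on a few depth measurements, can be sketched as a standard Gaussian-conditioning update. The sketch below is illustrative only: the node count, gauge locations, noise level, and the synthetic "ensemble" are all invented stand-ins, not the paper's actual ICPR setup.

```python
import numpy as np

# Minimal sketch of conditioning an ensemble-based Gaussian prior on a few
# point measurements (the Gaussian-process-regression-style update).
# All quantities here are synthetic placeholders for illustration.
rng = np.random.default_rng(0)

n_nodes = 50          # model nodes where flood depth is predicted
n_ens = 200           # Monte Carlo ensemble size

# Synthetic ensemble of flood-depth predictions (stand-in for ICPR runs).
ensemble = rng.normal(1.0, 0.3, (n_ens, n_nodes)) + \
           0.5 * np.sin(np.linspace(0, 3, n_nodes))

mu = ensemble.mean(axis=0)              # ensemble mean (prior mean)
C = np.cov(ensemble, rowvar=False)      # ensemble covariance (prior kernel)

obs_idx = [5, 25, 40]                   # hypothetical gauge locations
y = mu[obs_idx] + 0.2                   # "measured" depths (illustrative)
sigma_n = 0.01                          # measurement-noise variance

# GPR posterior mean:
#   mu + C[:, obs] (C[obs, obs] + sigma_n I)^(-1) (y - mu[obs])
K = C[np.ix_(obs_idx, obs_idx)] + sigma_n * np.eye(len(obs_idx))
w = np.linalg.solve(K, y - mu[obs_idx])
posterior_mean = mu + C[:, obs_idx] @ w

# The updated forecast matches the data at the gauges up to the noise level.
print(np.abs(posterior_mean[obs_idx] - y).max())
```

The key point mirrored from the abstract is that a handful of measurements pulls the whole field toward the data through the ensemble covariance, which encodes spatial correlations learned from the physics-based runs.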
Award ID(s): 2033607
PAR ID: 10418969
Publisher / Repository: DOI PREFIX: 10.1029
Journal Name: Water Resources Research
Volume: 59
Issue: 3
ISSN: 0043-1397
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract: Timely, accurate, and reliable information is essential for decision-makers, emergency managers, and infrastructure operators during flood events. This study demonstrates that a proposed machine learning model, MaxFloodCast, trained on physics-based hydrodynamic simulations in Harris County, offers efficient and interpretable flood inundation depth predictions. Achieving an average R² of 0.949 and a root mean square error of 0.61 ft (0.19 m) on unseen data, it proves reliable in forecasting peak flood inundation depths. Validated against Hurricane Harvey and Tropical Storm Imelda, MaxFloodCast shows potential for supporting near-real-time floodplain management and emergency operations. The model's interpretability aids decision-makers by offering critical information to inform flood mitigation strategies, to prioritize areas with critical facilities, and to examine how rainfall in other watersheds influences flood exposure in one area. The MaxFloodCast model enables accurate and interpretable inundation depth predictions while significantly reducing computational time, thereby supporting emergency response efforts and flood risk management more effectively.
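The two skill metrics reported in this abstract, R² and RMSE, are straightforward to compute. The snippet below shows both on synthetic "true vs. predicted" peak depths; the data and noise level are invented for illustration and are unrelated to the MaxFloodCast results.

```python
import numpy as np

# Illustrative computation of R^2 and RMSE for predicted vs. simulated
# peak flood depths. The depth values are synthetic, not Harris County data.
rng = np.random.default_rng(1)
y_true = rng.uniform(0.0, 10.0, 500)           # "simulated" peak depths (ft)
y_pred = y_true + rng.normal(0.0, 0.5, 500)    # "predictions" with error

rmse = np.sqrt(np.mean((y_pred - y_true) ** 2))

ss_res = np.sum((y_true - y_pred) ** 2)        # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2) # total sum of squares
r2 = 1.0 - ss_res / ss_tot

print(f"R^2 = {r2:.3f}, RMSE = {rmse:.2f} ft")
```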
  2. Abstract Atmospheric aerosols influence the Earth’s climate, primarily by affecting cloud formation and scattering visible radiation. However, aerosol-related physical processes in climate simulations are highly uncertain. Constraining these processes could help improve model-based climate predictions. We propose a scalable statistical framework for constraining the parameters of expensive climate models by comparing model outputs with observations. Using the C3.AI Suite, a cloud computing platform, we use a perturbed parameter ensemble of the UKESM1 climate model to efficiently train a surrogate model. A method for estimating a data-driven model discrepancy term is described. The strict bounds method is applied to quantify parametric uncertainty in a principled way. We demonstrate the scalability of this framework with 2 weeks’ worth of simulated aerosol optical depth data over the South Atlantic and Central African region, written from the model every 3 hr and matched in time to twice-daily MODIS satellite observations. When constraining the model using real satellite observations, we establish constraints on combinations of two model parameters using much higher time-resolution outputs from the climate model than previous studies. This result suggests that within the limits imposed by an imperfect climate model, potentially very powerful constraints may be achieved when our framework is scaled to the analysis of more observations and for longer time periods. 
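The surrogate-model idea at the core of this abstract, replacing an expensive simulator with a cheap fit trained on a perturbed-parameter ensemble, can be illustrated in a few lines. Everything below (the stand-in "model," the parameter range, the quadratic fit) is a toy assumption, not the UKESM1 or C3.AI setup.

```python
import numpy as np

# Toy sketch of surrogate modeling: fit a cheap polynomial to outputs of an
# "expensive" model evaluated over a perturbed-parameter ensemble.
rng = np.random.default_rng(3)

def expensive_model(theta):
    """Stand-in for a costly climate-model run returning, e.g., an AOD value."""
    return 0.2 + 0.5 * theta - 0.3 * theta**2

# Perturbed-parameter ensemble: sample the parameter, run the model once each.
theta_train = rng.uniform(0.0, 1.0, 30)
aod_train = expensive_model(theta_train) + rng.normal(0.0, 0.005, 30)

# Cheap surrogate: quadratic least-squares fit to the ensemble outputs.
coeffs = np.polyfit(theta_train, aod_train, deg=2)
surrogate = np.poly1d(coeffs)

# The surrogate evaluates almost instantly and tracks the model closely,
# so it can be compared against observations many times over.
theta_test = np.linspace(0.0, 1.0, 11)
err = np.abs(surrogate(theta_test) - expensive_model(theta_test)).max()
print(err)
```

In the actual study the surrogate is trained on high-time-resolution model output matched to MODIS observations; the sketch only conveys the train-once, evaluate-cheaply pattern.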
  3. Developing suitable approximate models for analyzing and simulating complex nonlinear systems is practically important. This paper aims at exploring the skill of a rich class of nonlinear stochastic models, known as the conditional Gaussian nonlinear system (CGNS), as both a cheap surrogate model and a fast preconditioner for facilitating many computationally challenging tasks. The CGNS preserves the underlying physics to a large extent and can reproduce intermittency, extreme events, and other non-Gaussian features in many complex systems arising from practical applications. Three interrelated topics are studied. First, the closed analytic formulas for the conditional statistics provide an efficient and accurate data assimilation scheme. It is shown that the data assimilation skill of a suitable CGNS approximate forecast model outweighs that of an ensemble method applied even to the perfect model with strong nonlinearity, where the latter suffers from filter divergence. Second, the CGNS allows the development of a fast algorithm for simultaneously estimating the parameters and the unobserved variables with uncertainty quantification in the presence of only partial observations. Utilizing an appropriate CGNS as a preconditioner significantly reduces the computational cost of accurately estimating the parameters in the original complex system. Finally, the CGNS advances rapid and statistically accurate algorithms for computing the probability density function and sampling the trajectories of the unobserved state variables. These fast algorithms facilitate the development of an efficient and accurate data-driven method for predicting the linear response of the original system with respect to parameter perturbations based on a suitable CGNS preconditioner.
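The closed-form conditional statistics the abstract refers to generalize the familiar linear-Gaussian case. As a point of reference only, here is that linear special case: a scalar Kalman filter whose conditional mean and variance are available analytically. The dynamics and noise levels are invented for illustration; the CGNS framework itself handles a much richer nonlinear class.

```python
import numpy as np

# Scalar Kalman filter: the linear special case of closed-form conditional
# Gaussian statistics. All parameters below are illustrative.
rng = np.random.default_rng(4)

a, sigma_x = 0.9, 0.3     # state dynamics: x_{k+1} = a x_k + sigma_x * noise
h, sigma_y = 1.0, 0.4     # observation:    y_k = h x_k + sigma_y * noise

# Simulate a true trajectory and noisy observations of it.
T = 1000
x = np.zeros(T)
for k in range(1, T):
    x[k] = a * x[k - 1] + sigma_x * rng.normal()
y = h * x + sigma_y * rng.normal(size=T)

# Closed-form filter: conditional mean m and variance P given y_1..y_k.
m, P = 0.0, 1.0
m_hist = np.zeros(T)
for k in range(T):
    # Forecast step (push the Gaussian through the linear dynamics).
    m, P = a * m, a * a * P + sigma_x**2
    # Analysis step (conditional Gaussian update on the new observation).
    K = P * h / (h * h * P + sigma_y**2)
    m, P = m + K * (y[k] - h * m), (1.0 - K * h) * P
    m_hist[k] = m

# The filtered mean tracks the truth better than the raw observations do.
mse_filter = np.mean((m_hist - x) ** 2)
mse_obs = np.mean((y / h - x) ** 2)
print(mse_filter, mse_obs)
```

The CGNS contribution is that analogous closed-form updates survive in a strongly nonlinear setting, which is what makes the data assimilation and sampling algorithms in the paper cheap.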
  4. In this paper, we use a procedural generation system to design urban layouts that passively reduce water depth during urban floods. The tool enables designing cities that passively lower flood depth everywhere or in chosen key areas. Our approach integrates a porosity-based hydraulic model and a parameterized urban generation system with an optimization engine to find the least-cost modification to an initial urban layout. To investigate the relationship between urban layout design parameters and flood inundation depth, the correlation coefficient method is used. The paper concludes that the most influential urban layout parameters are the average road length and the mean parcel area.
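The correlation-screening step mentioned above amounts to ranking layout parameters by their Pearson correlation with simulated flood depth. The sketch below uses invented parameter names and a synthetic depth response; it only demonstrates the ranking mechanic, not the paper's actual parameters or coefficients.

```python
import numpy as np

# Sketch of correlation screening: rank hypothetical urban-layout parameters
# by the magnitude of their Pearson correlation with simulated flood depth.
rng = np.random.default_rng(2)
n = 200  # number of sampled layouts

params = {
    "avg_road_length": rng.normal(100.0, 15.0, n),
    "mean_parcel_area": rng.normal(500.0, 80.0, n),
    "building_height": rng.normal(12.0, 3.0, n),  # assumed irrelevant here
}

# Synthetic depth response: driven by the first two parameters plus noise.
depth = (0.01 * params["avg_road_length"]
         + 0.002 * params["mean_parcel_area"]
         + rng.normal(0.0, 0.2, n))

corr = {name: np.corrcoef(vals, depth)[0, 1] for name, vals in params.items()}
ranked = sorted(corr, key=lambda k: abs(corr[k]), reverse=True)
print(ranked)
```

Parameters with near-zero correlation drop to the bottom of the ranking, which is how a screening pass narrows the design space before the optimization engine runs.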
  5. Abstract: Estimating a patient‐specific computational model's parameters relies on data that is often unreliable and ill‐suited for a deterministic approach. We develop an optimization‐based uncertainty quantification framework for probabilistic model tuning that discovers the model input distributions that generate target output distributions. Probabilistic sampling is performed using a surrogate model for computational efficiency, and a general distribution parameterization is used to describe each input. The approach is tested on seven patient‐specific modeling examples using CircAdapt, a cardiovascular circulatory model. Six examples are synthetic, aiming to match the output distributions generated using known reference input distributions, while the seventh example uses real‐world patient data for the output distributions. Our results demonstrate accurate reproduction of the target output distributions, with correct recreation of the reference inputs for the six synthetic examples. The proposed approach is suitable for determining the parameter distributions of patient‐specific models with uncertain data and can be used to gain insight into the sensitivity of the model parameters to the measured data.
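The tuning loop this abstract describes, searching for input-distribution parameters whose pushforward through a cheap surrogate matches a target output distribution, can be reduced to a tiny worked example. The affine surrogate, Gaussian input parameterization, and target moments below are all invented assumptions chosen so the answer is known; the real framework uses sampling and a general distribution parameterization with the CircAdapt model.

```python
import numpy as np
from scipy.optimize import minimize

# Toy probabilistic model tuning: find the Gaussian input distribution
# N(mu, sigma) whose pushforward through a cheap surrogate matches a
# target output distribution. All values here are illustrative.

def surrogate(x):
    """Hypothetical cheap surrogate of the expensive patient-specific model."""
    return 2.0 * x + 1.0

target_mean, target_std = 5.0, 2.0   # target output distribution: N(5, 2)

def loss(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)        # log-parameterize to keep sigma positive
    # For an affine surrogate the output moments are available in closed form;
    # the general framework estimates them by sampling instead.
    out_mean = surrogate(mu)
    out_std = 2.0 * sigma
    return (out_mean - target_mean) ** 2 + (out_std - target_std) ** 2

res = minimize(loss, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)   # should recover mu = 2, sigma = 1
```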