-
Extreme water levels (EWLs) resulting from tropical and extratropical cyclones pose significant risks to coastal communities and their interconnected ecosystems. To date, physically-based models have enabled accurate characterization of EWLs despite their inherent high computational cost. However, the applicability of these models is limited to data-rich sites with diverse morphologic and hydrodynamic characteristics, and their dependence on high-quality spatiotemporal data, which is costly to acquire and process, hinders their use in data-scarce regions. To address this challenge, we present a computationally efficient deep learning framework, employing Long Short-Term Memory (LSTM) networks, to predict the evolution of EWLs beyond site-specific training stations. The framework, named LSTM-Station Approximated Models (LSTM-SAM), consists of a collection of bidirectional LSTM models enhanced with a custom attention mechanism embedded in the model architecture. Moreover, the LSTM-SAM framework incorporates a transfer learning approach that is applicable to target (tide-gage) stations along the U.S. Atlantic Coast. The LSTM-SAM framework demonstrates satisfactory performance, with “transferable” models achieving average Kling-Gupta Efficiency (KGE), Nash-Sutcliffe Efficiency (NSE), and Root-Mean-Square Error (RMSE) values ranging from 0.78 to 0.92, 0.90 to 0.97, and 0.09 to 0.18 at the target stations, respectively. These results indicate that the LSTM-SAM framework can accurately predict not only EWLs but also their evolution over time, i.e., onset, peak, and dissipation, which could assist in large-scale operational flood forecasting, especially in regions with limited resources to set up high-fidelity physically-based models.
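The core architecture described above, a bidirectional LSTM with an attention layer over the time dimension plus a transfer-learning step for target stations, can be sketched as follows. This is a minimal illustrative sketch in PyTorch; the layer sizes, attention formulation, feature count, and file names are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a bidirectional LSTM with a simple additive attention
# layer, loosely mirroring the LSTM-SAM description; all sizes and names are
# illustrative assumptions.
import torch
import torch.nn as nn


class AttentionBiLSTM(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, 1)    # attention score per time step
        self.head = nn.Linear(2 * hidden, 1)     # predicted water level

    def forward(self, x):                        # x: (batch, time, n_features)
        h, _ = self.lstm(x)                      # (batch, time, 2 * hidden)
        w = torch.softmax(self.score(h), dim=1)  # attention weights over time
        context = (w * h).sum(dim=1)             # weighted temporal summary
        return self.head(context).squeeze(-1)    # (batch,)


# Transfer-learning sketch: start from weights trained at a source gage, then
# fine-tune only the output head on the limited target-station data.
model = AttentionBiLSTM(n_features=4)
# model.load_state_dict(torch.load("source_station.pt"))  # hypothetical pretrained weights
for p in model.lstm.parameters():
    p.requires_grad = False                      # freeze the recurrent encoder
```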
-
Abstract Several studies have focused on the importance of river bathymetry (channel geometry) in hydrodynamic routing along individual reaches. However, its effect on other watershed processes such as infiltration and surface water (SW)‐groundwater (GW) interactions has not been explored across large river networks. Surface and subsurface processes are interdependent; therefore, errors due to inaccurate representation of one watershed process can cascade across other hydraulic or hydrologic processes. This study hypothesizes that accurate bathymetric representation is not only essential for simulating channel hydrodynamics but also affects subsurface processes by impacting SW‐GW interactions. Moreover, quantifying the effect of bathymetry on surface and subsurface hydrological processes across a river network can facilitate an improved understanding of how bathymetric characteristics affect these processes across large spatial domains. The study tests this hypothesis by developing physically based distributed models capable of bidirectional (SW‐GW) coupling, run with four configurations with progressively reduced levels of bathymetric representation. A comparison of hydrologic and hydrodynamic outputs shows that changes in channel geometry across the four configurations have a considerable effect on infiltration, lateral seepage, and the location of the water table across the entire river network. For example, when using bathymetry with inaccurate channel conveyance capacity but accurate channel depth, the peak lateral seepage rate exhibited a 58% error. The results from this study provide insights into the level of bathymetric detail required for accurately simulating flooding‐related physical processes, while also highlighting potential issues with ignoring bathymetry across lower-order streams such as spurious backwater flow, inaccurate water table elevations, and incorrect inundation extents.
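As a small illustration of the kind of comparison reported above, the sketch below computes the percent error in peak lateral seepage between a full-bathymetry baseline and a simplified-geometry run. The hydrographs are synthetic placeholders chosen only to illustrate the calculation, not the study's model outputs.

```python
# Illustrative comparison of peak lateral seepage across bathymetric
# configurations; array names and values are hypothetical placeholders.
import numpy as np

def peak_percent_error(baseline: np.ndarray, simplified: np.ndarray) -> float:
    """Percent error in the peak of a simplified run relative to the baseline."""
    return 100.0 * abs(simplified.max() - baseline.max()) / baseline.max()

# Synthetic lateral-seepage hydrographs (m^3/s); real inputs would come from
# the coupled SW-GW model runs for each configuration.
t = np.linspace(0.0, 72.0, 289)                        # 72-hour event, 15-min steps
full_bathymetry = 5.0 * np.exp(-((t - 36.0) / 10.0) ** 2)
reduced_conveyance = 7.9 * np.exp(-((t - 34.0) / 9.0) ** 2)
print(f"peak seepage error: {peak_percent_error(full_bathymetry, reduced_conveyance):.0f}%")
```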
-
Abstract A digital map of the built environment is useful for a range of economic, emergency response, and urban planning exercises, such as helping people find places in app-driven interfaces, helping emergency managers know which locations might be impacted by a flood or fire, and helping city planners proactively identify vulnerabilities and plan for how a city is growing. Since its inception in 2004, OpenStreetMap (OSM) has set the benchmark for open geospatial data and has become a key player in the public, research, and corporate realms. Following the foundations laid by OSM, several open geospatial products describing the built environment have blossomed, including the Microsoft USA building footprint layer and the OpenAddresses project. Each of these products uses different data collection methods, ranging from public contributions to artificial intelligence, and taken together, they could provide a comprehensive description of the built environment. Yet these projects are still siloed, and their variety makes integration and interoperability a major challenge. Here, we document an approach for merging data from these three major open building datasets and outline a workflow that is scalable to the continental United States (CONUS). We show how the results can be structured as a knowledge graph over which machine learning models are built. These models can help propagate and complete unknown quantities that can then be leveraged in disaster management.
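To make the merging idea concrete, here is a minimal sketch (under assumed inputs, not the paper's workflow) that links address points to building footprints and stores the result as a graph. Dataset loading, footprint deduplication across sources, and CONUS-scale tiling with a spatial index are omitted.

```python
# Minimal sketch of fusing building footprints and address points into a
# simple knowledge graph: nodes for footprints and addresses, edges when an
# address point falls inside a footprint. All inputs are hypothetical.
import networkx as nx
from shapely.geometry import Point, Polygon

footprints = {                                    # id -> footprint polygon
    "msft_001": Polygon([(0, 0), (0, 10), (10, 10), (10, 0)]),
}
addresses = {                                     # id -> address point
    "oa_001": Point(5, 5),
}

g = nx.Graph()
for fid, poly in footprints.items():
    g.add_node(fid, kind="footprint", source="microsoft", geometry=poly)
for aid, pt in addresses.items():
    g.add_node(aid, kind="address", source="openaddresses", geometry=pt)
    for fid, poly in footprints.items():          # naive O(n*m); use a spatial index at scale
        if poly.contains(pt):
            g.add_edge(aid, fid, relation="located_in")

print(list(g.edges(data=True)))
```

Downstream models would then operate on this graph, for example propagating known attributes (address, use type) from labeled nodes to connected unlabeled footprints.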
-
Abstract Estimating uncertainty in flood model predictions is important for many applications, including risk assessment and flood forecasting. We focus on uncertainty in physics‐based urban flooding models, considering the effects of the model's complexity and of uncertainty in key input parameters. The effect of rainfall intensity on the uncertainty in water depth predictions is also studied. As a test case, we choose the Interconnected Channel and Pond Routing (ICPR) model of a part of the city of Minneapolis. The uncertainty in the ICPR model's predictions of floodwater depth is quantified in terms of the ensemble variance using the multilevel Monte Carlo (MC) simulation method. Our results show that uncertainties in the studied domain are highly localized. Model simplifications, such as disregarding groundwater flow, lead to overly confident predictions, that is, predictions that are both less accurate and less uncertain than those of the more complex model. We find that, for the same number of uncertain parameters, increasing the model resolution reduces uncertainty in the model predictions (and increases the MC method's computational cost). We employ the multilevel MC method to reduce the cost of estimating uncertainty in a high‐resolution ICPR model. Finally, we use the ensemble estimates of the mean and covariance of the flood depth for real‐time flood depth forecasting with the physics‐informed Gaussian process regression method. We show that even with few measurements, the proposed framework yields a more accurate forecast than the mean prediction of the ICPR model.
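The multilevel Monte Carlo idea referenced above can be illustrated with a short sketch: the expected flood depth is written as the coarse-level sample mean plus sample means of level-to-level differences, so most runs happen on cheap coarse grids. The `simulate` function below is a hypothetical stand-in for a flood-model run, not the ICPR interface.

```python
# Minimal multilevel Monte Carlo sketch (illustrative only, not the ICPR workflow).
import numpy as np

rng = np.random.default_rng(0)

def simulate(level: int, seed: int) -> float:
    """Hypothetical stand-in for a flood-depth run at mesh refinement `level`."""
    r = np.random.default_rng(seed)
    # Synthetic output that converges toward 1.0 m as the mesh is refined.
    return 1.0 + 0.5 * 2.0 ** (-level) * r.standard_normal()

def mlmc_mean(samples_per_level):
    """Multilevel MC estimate of the expected flood depth."""
    estimate = 0.0
    for level, n in enumerate(samples_per_level):
        seeds = rng.integers(0, 2**31, size=n)
        if level == 0:
            diffs = [simulate(0, int(s)) for s in seeds]
        else:
            # The shared seed couples each fine/coarse pair, which keeps the
            # variance of the differences (and hence the required samples) small.
            diffs = [simulate(level, int(s)) - simulate(level - 1, int(s)) for s in seeds]
        estimate += float(np.mean(diffs))
    return estimate

# Many cheap coarse-level runs, few expensive fine-level runs.
print(mlmc_mean([200, 50, 10]))
```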