- Abstract: Floods cause hundreds of fatalities and billions of dollars of economic loss each year in the United States. To mitigate these damages, accurate flood prediction is needed for issuing early warnings to the public. The problem is exacerbated over large model domains, particularly in ungauged basins. To improve flood prediction for both gauged and ungauged basins, we propose a spatio-temporal hierarchical model (STHM) that estimates above-normal flow using a 10-day window of modeled National Water Model (NWM) streamflow and a variety of catchment characteristics as input. The STHM is calibrated (1993-2008) and validated (2009-2018) over three broad basin groups (controlled, natural, and coastal) and shows significant improvement for the first two. A seasonal analysis shows that the most influential predictors beyond the NWM streamflow reanalysis are the previous 3-day average streamflow for controlled basins and the aridity index for natural basins. To evaluate the STHM's ability to improve above-normal streamflow prediction in ungauged basins, 20-fold cross-validation is performed, leaving out 5% of sites in each fold. Results show that the STHM increases predictive skill at over 50% of sites by 0.1 Nash-Sutcliffe efficiency (NSE) and improves streamflow prediction at over 65% of sites to an NSE > 0.67, demonstrating that the STHM is one of the first of its kind and could be employed for flood prediction in both gauged and ungauged basins.
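As a rough illustration of the evaluation described above (function names and the fold construction are our assumptions, not code from the paper), the NSE skill metric and a leave-sites-out cross-validation split might look like this:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def leave_sites_out_folds(site_ids, n_folds=20, seed=0):
    """20-fold split over sites, so each fold holds out ~5% of the sites."""
    rng = np.random.default_rng(seed)
    sites = rng.permutation(np.unique(site_ids))
    return np.array_split(sites, n_folds)  # list of held-out site arrays
```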
- Abstract: With an increasing number of continental-scale hydrologic models, the ability to evaluate performance is key to understanding uncertainty and making improvements to the model(s). We hypothesize that no model running a single set of physics can be "properly" calibrated for the range of hydroclimatic diversity seen in the continental United States. Here, we evaluate the NOAA National Water Model (NWM) version 2.0 historical streamflow record in over 4,200 natural and controlled basins using the Nash-Sutcliffe efficiency metric decomposed into relative performance, conditional bias, and unconditional bias. Each of these is evaluated in the context of meteorologic, landscape, and anthropogenic characteristics to better understand where the model does poorly, what potentially causes the poor performance, and what similarities systematically poor-performing areas share. The primary objective is to pinpoint traits of places with good/bad performance and low/high bias. NWM relative performance is higher where there is high precipitation, snow coverage (depth and fraction), and barren area. Low relative skill is associated with high potential evapotranspiration, aridity, moisture-and-energy phase correlation, and forest, shrubland, grassland, and impervious area. We see less bias in locations with high precipitation, moisture-and-energy phase correlation, and barren and grassland areas, and more bias in areas with high aridity, snow coverage/fraction, and urbanization. The insights gained can help identify key hydrological factors underpinning NWM predictive skill; reinforce the need for regionalized parameterization and modeling; and help inform heterogeneous modeling systems, like the NOAA Next Generation Water Resource Modeling Framework, to enhance ongoing development and evaluation.
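One standard form of the decomposition named above (often attributed to Murphy, 1988) is NSE = r^2 - (r - s_sim/s_obs)^2 - ((m_sim - m_obs)/s_obs)^2, where r^2 is the relative performance and the second and third terms are the conditional and unconditional bias. A minimal sketch, assuming this is the variant used:

```python
import numpy as np

def nse_decomposition(obs, sim):
    """Return (relative performance, conditional bias, unconditional bias, NSE)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()                 # variability ratio
    beta = (sim.mean() - obs.mean()) / obs.std()  # normalized mean error
    rel_perf = r ** 2
    cond_bias = (r - alpha) ** 2
    uncond_bias = beta ** 2
    return rel_perf, cond_bias, uncond_bias, rel_perf - cond_bias - uncond_bias
```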
- Abstract: The growing number of flood events in urban areas is causing far-reaching economic losses and social disruptions. To implement suitable flood risk management strategies and measures for flood prevention and mitigation, it is paramount to adequately assess the economy-wide losses associated with potential flood events. The assessment of potential flood damage in the USA relies significantly on the Federal Emergency Management Agency (FEMA) Hazus Flood Model. A primary requirement for reducing uncertainty is to gather user input data empirically; consequently, there is a need to integrate publicly accessible business data for accurate evaluation of losses in business activities. We present a novel approach for estimating both the direct losses in business activities during a flood and the total economy-wide impact that arises from these disruptions. First, we integrate the FEMA Hazus Flood Model with Reference Solutions business-level data to estimate the reduction in business activities (i.e., direct losses) following a 100-year flood event in New Hanover County, North Carolina. Second, we estimate the broader economy-wide impact these business activity losses have on total industry output, employment, value added, and household income using a regional computational general equilibrium model. The advantage of our approach is the easy replicability of the business loss estimation process for US cities and regions of interest, which, when used in a follow-up economic impact study, yields the total economy-wide losses in economic activity. Policy makers relying solely on direct physical and economic damages would, by ignoring indirect losses, underestimate the benefits of investing in flood mitigation and protection.
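A minimal sketch of the direct-loss step (the depth-damage curve, column names, and data layout are illustrative assumptions, not the actual Hazus or Reference Solutions schema): businesses are joined spatially to modeled flood depths, and a depth-damage function converts depth to a fraction of lost activity.

```python
import numpy as np
import geopandas as gpd

# Hypothetical depth-damage curve: fraction of activity lost vs. depth (m).
DEPTH_M = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
LOSS_FRACTION = np.array([0.0, 0.15, 0.40, 0.70, 0.90])

def direct_business_losses(businesses, flood):
    """businesses: points with 'annual_sales' and 'industry' columns;
    flood: polygons with a 'depth_m' column for the 100-year event."""
    flooded = gpd.sjoin(businesses, flood, how="inner", predicate="within")
    flooded["loss"] = flooded["annual_sales"] * np.interp(
        flooded["depth_m"], DEPTH_M, LOSS_FRACTION)
    return flooded.groupby("industry")["loss"].sum()  # input to the CGE model
```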
- Abstract: Estimating uncertainty in flood model predictions is important for many applications, including risk assessment and flood forecasting. We focus on uncertainty in physics-based urban flooding models, considering the effects of the model's complexity and of uncertainty in key input parameters. The effect of rainfall intensity on the uncertainty in water depth predictions is also studied. As a test case, we choose the Interconnected Channel and Pond Routing (ICPR) model of a part of the city of Minneapolis. The uncertainty in the ICPR model's predictions of floodwater depth is quantified in terms of the ensemble variance using the multilevel Monte Carlo (MC) simulation method. Our results show that uncertainties in the studied domain are highly localized. Model simplifications, such as disregarding groundwater flow, lead to overly confident predictions, that is, predictions that are both less accurate and less uncertain than those of the more complex model. We find that for the same number of uncertain parameters, increasing the model resolution reduces uncertainty in the model predictions (and increases the MC method's computational cost). We employ the multilevel MC method to reduce the cost of estimating uncertainty in a high-resolution ICPR model. Finally, we use the ensemble estimates of the mean and covariance of the flood depth for real-time flood depth forecasting using the physics-informed Gaussian process regression method. We show that even with few measurements, the proposed framework yields a more accurate forecast than the mean prediction of the ICPR model.
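A minimal sketch of a multilevel MC estimator like the one described above (the sampler interface is our assumption; in the study each level would correspond to an ICPR model resolution):

```python
import numpy as np

def mlmc_estimate(sample_pair, n_samples):
    """Multilevel Monte Carlo estimate of E[quantity at the finest level].
    sample_pair(level, rng) -> (fine, coarse): one draw of the quantity of
    interest on `level` and on `level - 1` for the SAME random inputs
    (the coarse value is ignored on level 0).
    n_samples[l]: draws on level l; coarse levels are cheap, so use more.
    """
    rng = np.random.default_rng(42)
    estimate = 0.0
    for level, n in enumerate(n_samples):
        draws = [sample_pair(level, rng) for _ in range(n)]
        corrections = [f - c if level > 0 else f for f, c in draws]
        estimate += np.mean(corrections)  # telescoping sum over levels
    return estimate
```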
- Abstract: A digital map of the built environment is useful for a range of economic, emergency response, and urban planning exercises, such as helping users find places in app-driven interfaces, helping emergency managers know which locations might be impacted by a flood or fire, and helping city planners proactively identify vulnerabilities and plan for how a city is growing. Since its inception in 2004, OpenStreetMap (OSM) has set the benchmark for open geospatial data and has become a key player in the public, research, and corporate realms. Following the foundations laid by OSM, several open geospatial products describing the built environment have blossomed, including the Microsoft USA building footprint layer and the OpenAddress project. Each of these products uses different data collection methods, ranging from public contributions to artificial intelligence, and taken together they could provide a comprehensive description of the built environment. Yet these projects are still siloed, and their variety makes integration and interoperability a major challenge. Here, we document an approach for merging data from these three major open building datasets and outline a workflow that is scalable to the continental United States (CONUS). We show how the results can be structured as a knowledge graph over which machine learning models are built. These models can help propagate and complete unknown quantities that can then be leveraged in disaster management.
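A minimal sketch of one footprint-merging strategy (the schemas and the conflict rule are simplified assumptions, not the paper's workflow): keep OSM footprints, add Microsoft footprints that do not overlap them, then attach address points to the footprints they fall inside.

```python
import pandas as pd
import geopandas as gpd

def merge_building_layers(osm, msft, addresses):
    """osm, msft: polygon GeoDataFrames; addresses: point GeoDataFrame.
    All three are assumed to share one CRS."""
    # Keep Microsoft footprints only where no OSM footprint already exists.
    hits = gpd.sjoin(msft, osm[["geometry"]], how="left", predicate="intersects")
    msft_only = msft.loc[hits[hits["index_right"].isna()].index.unique()]
    footprints = pd.concat([osm, msft_only], ignore_index=True)
    # Attach OpenAddress points to the footprint each falls inside.
    return gpd.sjoin(addresses, footprints, how="left", predicate="within")
```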
- Adams, Benjamin; Griffin, Amy L.; Scheider, Simon; McKenzie, Grant (Eds.) During a natural disaster such as flooding, the failure of a single asset in the complex and interconnected web of critical urban infrastructure can trigger a cascade of failures within and across multiple systems, with potentially life-threatening consequences. To help emergency managers effectively and efficiently assess such failures, we design the Utility Connection Ontology Design Pattern to represent utility services and model connections within and across those services. The pattern is encoded as an OWL ontology and instantiated with utility data in a geospatial knowledge graph. We demonstrate how it facilitates reasoning to identify cascading service failures due to flooding, producing maps and other summaries for situational awareness.
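A toy sketch of the cascading-failure reasoning this kind of pattern supports (the graph library and asset names are our illustration; the paper encodes this in OWL over a geospatial knowledge graph):

```python
import networkx as nx

# Utility assets as nodes; a directed edge means "provides service to".
G = nx.DiGraph([
    ("substation_A", "pump_station_1"),
    ("pump_station_1", "water_tower_2"),
    ("water_tower_2", "hospital_3"),
])

def cascading_failures(graph, flooded):
    """Everything downstream of a flooded asset loses service too."""
    failed = set(flooded)
    for asset in flooded:
        failed |= nx.descendants(graph, asset)
    return failed

print(cascading_failures(G, {"pump_station_1"}))
# -> {'pump_station_1', 'water_tower_2', 'hospital_3'}
```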
- Spatial interpolation techniques play an important role in hydrology, as many point observations need to be interpolated to create continuous surfaces. Despite the availability of several tools and methods for interpolating data, not all of them work consistently for hydrologic applications. One technique, the Laplace equation, which is used in hydrology for creating flow nets, has rarely been used for data interpolation. The objective of this study is to examine the efficiency of the Laplace formulation (LF) in interpolating hydrologic data and to compare it with widely used methods such as inverse distance weighting (IDW), natural neighbor, and ordinary kriging. The performance of LF interpolation is evaluated against these methods using quantitative measures, including root mean squared error (RMSE) and the coefficient of determination (R²) for accuracy, visual assessment for surface quality, and computational cost for operational efficiency and speed. Data related to surface elevation, river bathymetry, precipitation, temperature, and soil moisture are used for different areas in the United States. The RMSE and R² results show that LF is comparable to the other methods in accuracy. LF is easy to use, as it requires fewer input parameters than IDW and kriging, and it is faster than the other methods when the datasets are not large. Overall, LF offers a robust alternative to existing methods for interpolating various hydrologic data, though further work is required to improve its computational efficiency.
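A minimal sketch of Laplace interpolation on a regular grid (the Jacobi iteration and boundary handling are one simple way to do it, assumed rather than taken from the paper): known observations are held fixed while every unknown cell relaxes toward the average of its neighbors, i.e., toward a discrete solution of the Laplace equation.

```python
import numpy as np

def laplace_interpolate(values, known, iters=10000, tol=1e-6):
    """values: 2D array holding observations where `known` is True;
    unknown cells are filled by solving the Laplace equation."""
    u = np.where(known, values, values[known].mean())
    for _ in range(iters):
        p = np.pad(u, 1, mode="edge")  # zero-flux domain boundary
        new = 0.25 * (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:])
        new[known] = values[known]     # hold observed values fixed
        if np.max(np.abs(new - u)) < tol:
            return new
        u = new
    return u
```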
- While OWL and RDF are by far the most popular logic-based languages for Semantic Web ontologies, some well-designed ontologies are available only in languages with much richer expressivity, such as first-order logic (FOL) or the ISO-standard Common Logic. This inhibits reuse of these ontologies by the wider Semantic Web community. While converting OWL ontologies to FOL is straightforward, the reverse problem of finding the closest OWL approximation of an FOL ontology is undecidable. However, for most practical purposes, a "good enough" OWL approximation need not be perfect to enable wider reuse. This paper outlines such a conversion approach that first normalizes FOL sentences into a function-free prenex conjunctive normal form (FF-PCNF) that strips away minor syntactic differences and then applies a pattern-based approach to identify common OWL axioms. It is tested on over 2,000 FOL ontologies from the Common Logic Ontology Repository.
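A toy sketch of the pattern-matching step (the clause representation and the two patterns shown are our simplification): after normalization, a function-free clause such as {not A(x), B(x)} maps to SubClassOf(A, B), and {not A(x), not B(x)} to DisjointClasses(A, B).

```python
def clause_to_owl(clause):
    """clause: set of (predicate, is_negated) literals over one variable x."""
    pos = sorted(p for p, is_neg in clause if not is_neg)
    neg = sorted(p for p, is_neg in clause if is_neg)
    if len(neg) == 1 and len(pos) == 1:
        return f"SubClassOf({neg[0]} {pos[0]})"
    if len(neg) == 2 and not pos:
        return f"DisjointClasses({neg[0]} {neg[1]})"
    return None  # no OWL pattern matched; the approximation drops this clause

print(clause_to_owl({("Dog", True), ("Animal", False)}))
# -> SubClassOf(Dog Animal)
```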