Search for: All records

Award ID contains: 2049687

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Abstract Modern integrated hydrologic models (IHMs) are powerful tools for investigating coupled hydrologic system dynamics. The tradeoff for this realism is a high computational burden and a large number of parameters in each cell, few of which can be specified with a high degree of confidence. These factors combined make uncertainty quantification (UQ) a problem for IHM-based simulations, yet without rigorous UQ it is not clear how much confidence can be placed in conclusions made with IHMs. Previous work evaluated steady-state cases where the permeability field was a random variable, and a logical continuation is to consider transient conditions. This work assesses the confidence of an IHM representation of a first-order basin in central Idaho, USA, using an ensemble of 250 permeability realizations under three different recharge forcing signals. The results show that surface water is simulated with high confidence across all the permeability realizations, but the groundwater system and changes to it have lower confidence. However, uncertainty in changes to the groundwater system decreases with time since an increase in recharge, meaning that the farther one gets from a "peak" in the flow (of any size), the more confident one can be in the response (i.e., a smaller inter-quartile range). The ensemble was also used to assess how many realizations were needed to capture the expected behaviors of the ensemble and their range of variability. Unsurprisingly, groundwater requires larger ensembles than surface flows, but the sizes of the ensembles necessary for convergence were smaller than initially expected.
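The kind of ensemble-convergence check described above can be illustrated with a short sketch. The following is a minimal, hypothetical example (the array `flows` is a synthetic stand-in, not the actual simulation output) of how the inter-quartile range of an ensemble statistic stabilizes as sub-ensemble size grows:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the real ensemble: 250 realizations of a
# log-normally distributed flow metric (e.g., a low-flow discharge).
n_real = 250
flows = np.exp(rng.normal(0.0, 0.5, n_real))

def iqr(x):
    """Inter-quartile range across the ensemble."""
    q75, q25 = np.percentile(x, [75, 25])
    return q75 - q25

# Bootstrap sub-ensembles of increasing size and track how their IQR
# converges toward the full 250-member value.
full_iqr = iqr(flows)
for m in (10, 25, 50, 100, 200):
    sub_iqrs = [iqr(rng.choice(flows, size=m, replace=False)) for _ in range(200)]
    err = np.mean(np.abs(np.array(sub_iqrs) - full_iqr))
    print(f"ensemble size {m:3d}: mean |IQR error| = {err:.3f}")
```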
  2. Abstract Coupled simulations of surface and variably saturated subsurface flow, termed integrated hydrologic models (IHMs), can provide powerful insights into the complex dynamics of watersheds. The system of governing equations solved by an IHM is non-linear, making these models a significant computational burden and challenging to parameterize accurately. Consequently, a large fraction of the IHM studies to date have been "numerical hypothesis testing" studies, but, as parallel computing continues to improve, IHMs are approaching the point where they might also be useful as predictive tools. For this to become reality, the predictive uncertainty of such highly parameterized simulations must be considered. However, uncertainty is seldom considered in the IHM literature, likely due to the long runtimes of the complex simulations. The questions considered herein are how much uncertainty there is in an IHM for a common watershed simulation scenario, and how likely it is that any one realization of a system will give the same relative change as any other due to a perturbation in recharge. A stochastic ensemble of 250 permeability field realizations was used to show that uncertainty in a high-mountain headwaters system is dominated by the subsurface. Recharge perturbation scenarios echo these results, but the uncertainty of changes in streamflow or groundwater pressure heads was significantly smaller than the uncertainty in their base-case values. The main finding is that IHMs do provide confident, predictive estimates of relative changes in watersheds, even when uncertainty in specific simulation outputs may be high.
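The contrast between absolute and relative uncertainty described above can be sketched in a few lines. This is a hedged illustration with synthetic placeholder numbers (`base` and `perturbed` are not the paper's data); it only demonstrates the statistic being compared:

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder base-case and perturbed (+10% recharge) streamflows for
# 250 realizations; real values would come from the IHM ensemble.
base = np.exp(rng.normal(0.0, 0.6, 250))                 # wide spread across realizations
perturbed = base * (1.10 + rng.normal(0.0, 0.01, 250))   # response tracks the base case

rel_change = (perturbed - base) / base

def cv(x):
    """Coefficient of variation: spread relative to the mean."""
    return np.std(x) / np.mean(x)

# The spread of the relative change is far smaller than the spread of
# the base-case values themselves, mirroring the paper's main finding.
print(f"CV of base-case flow:  {cv(base):.2f}")
print(f"CV of relative change: {cv(rel_change):.3f}")
```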
  3. This compressed tarball archive contains the datasets and scripts necessary to visualize the residence time distributions, travel time distributions, and storage selection functions for the Fourth of July Creek transient simulations. The scripts are Matlab m-files and the datasets are stored as MAT archives.
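Although the archive ships Matlab scripts, the MAT archives can also be read outside Matlab. A minimal sketch with SciPy, using hypothetical file and variable names (`rtd.mat`, `t`, `pdf`); the actual structure is documented by the archive's own m-file scripts:

```python
import matplotlib.pyplot as plt
from scipy.io import loadmat

# Hypothetical file/variable names; check the archive's m-file scripts
# for the actual layout. loadmat handles MAT files up to v7.2; v7.3
# files would need an HDF5 reader instead.
data = loadmat("rtd.mat")
t, pdf = data["t"].ravel(), data["pdf"].ravel()

# Residence time distributions typically span orders of magnitude,
# so a log-log plot is the natural view.
plt.loglog(t, pdf)
plt.xlabel("Residence time [days]")
plt.ylabel("Probability density")
plt.show()
```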
  4. {"Abstract":["The compressed tarball archive contains the simulation files for the Fourth of July Creek basin in the White Clouds mountains of central Idah0, USA under wet, normal, and dry flow conditions using reconstructed, transient inputs. This archive builds on the "Fourth of July Creek Ensemble Simulation Outputs" (doi: 10.7273/000004796) by adding consideration of fully transient conditions and the associated uncertainty. Combined with that archive, all the files necessary to reproduce the runs are provided here. The runs are computationally demanding so this archive also contains the minimally processed datasets of the large outputs including pressure heads and/or streamflow at the monitoring locations and streamflow along the main stem of the creek. Contained in the "Output" folder is the dataset (saved as a Matlab data object) and scripts to plot the responses at monitoring locations or along the main stem."]} 
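A generic way to inspect such a tarball before running anything, using only the Python standard library (the filename below is a placeholder for the actual archive in this record):

```python
import tarfile

# Placeholder filename; substitute the archive downloaded from this record.
with tarfile.open("fourth_of_july_transient.tar.gz", "r:gz") as tar:
    # List contents (size and path) before extracting anything.
    for member in tar.getmembers():
        print(f"{member.size:>12d}  {member.name}")
    # The "data" filter (Python 3.12+, backported to recent 3.8+ releases)
    # rejects unsafe paths during extraction.
    tar.extractall(path="fourth_of_july_transient", filter="data")
```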
  5. {"Abstract":["This compressed tarball archive contains the Matlab datasets of the Base-case (denoted SR for "Standard run"), wet case (WR) and dry case (DR) for the transient residence time and travel time distributions and a Matlab script that plots them for the Zonal conceptual model or any of the 250 realizations in the stochastic ensemble. The details of the flow simulations that led to these distributions can be found by searching for "Fourth of July Creek" in the research archive and also through Dr. Engdahl's Google Scholar page."]} 
  6. These are the simulation outputs for an ensemble simulation of the expected response of the Fourth of July Basin in the White Clouds Mountains of central Idaho. The three files represent a base-case recharge scenario and two perturbations of +/-10%. All files inside the compressed tarball archives are in ParFlow binary format, which is described at https://parflow.readthedocs.io/en/latest/index.html and https://github.com/parflow/parflow. Each realization in the ensemble simulation contains the steady-state pressure field, velocity field components, and the associated permeability field.
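The ParFlow Python tools (the `parflow` package on PyPI, documented at the links above) can read these binary (.pfb) files directly into NumPy arrays. A minimal sketch with a placeholder filename:

```python
import numpy as np
from parflow.tools.io import read_pfb

# Placeholder filename; each realization in the archive includes pressure,
# velocity components, and the permeability field as .pfb files.
pressure = read_pfb("run.out.press.pfb")  # NumPy array, typically (nz, ny, nx)

print(pressure.shape)
print("max pressure head:", np.nanmax(pressure))
```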
  7. Modern hydrologic models have extraordinary capabilities for representing complex processes in surface-subsurface systems. These capabilities have revolutionized the way we conceptualize flow systems, but how to represent uncertainty in simulated flow systems is not as well developed. Currently, characterizing model uncertainty can be computationally expensive, in part because the techniques are appended to the numerical methods rather than seamlessly integrated. The next generation of computers, however, presents opportunities to reformulate the modeling problem so that the uncertainty components are handled more directly within the flow system simulation. Misconceptions about quantum computing abound, and quantum computers will not be a "silver bullet" for solving all complex problems, but they might be leveraged for certain kinds of highly uncertain problems, such as groundwater (GW). The point of this issue paper is that the GW community could try to revise the foundations of our models so that the governing equations being solved are tailored specifically for quantum computers. The goal moving forward should not just be to accelerate the models we have, but also to address their deficiencies. Embedding uncertainty into the models by evolving distribution functions will make predictive GW modeling more complicated, but doing so places the problem into a complexity class that is highly efficient on quantum computing hardware. Next-generation GW models could put uncertainty into the problem at the very beginning of a simulation and leave it there throughout, providing a completely new way of simulating subsurface flows.
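As a purely illustrative sketch of what "evolving distribution functions" could look like (this specific form is an assumption for illustration, not the paper's formulation): instead of advancing one pressure-head field per realization, a model could advance a probability density over heads, for example via a Fokker-Planck-type equation:

```latex
% Hypothetical illustration: evolve the density \pi(h; x, t) of the
% state h (e.g., pressure head) rather than individual realizations.
\frac{\partial \pi}{\partial t}
  = -\frac{\partial}{\partial h}\!\left[\mu(h, x, t)\,\pi\right]
  + \frac{1}{2}\,\frac{\partial^{2}}{\partial h^{2}}\!\left[\sigma^{2}(h, x, t)\,\pi\right]
```

Here \mu would carry the deterministic flow physics and \sigma^2 the parameter uncertainty; the paper's argument is that propagating such densities, rather than ensembles of individual realizations, is the kind of problem that maps efficiently onto quantum hardware.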
  8. Abstract. Lagrangian particle tracking schemes allow a wide range of flow and transport processes to be simulated accurately, but a major challenge is implementing the inter-particle interactions numerically in an efficient manner. This article develops a multi-dimensional, parallelized domain decomposition (DDC) strategy for mass-transfer particle tracking (MTPT) methods, in which particles exchange mass dynamically. We show that this can be efficiently parallelized by employing large numbers of CPU cores to accelerate run times. In order to validate the approach and our theoretical predictions, we focus our efforts on a well-known benchmark problem with pure diffusion, where analytical solutions in any number of dimensions are well established. In this work, we investigate different procedures for "tiling" the domain in two and three dimensions (2-D and 3-D), as this type of formal DDC construction is currently limited to 1-D. An optimal tiling is prescribed based on physical problem parameters and the number of available CPU cores, as each tiling provides distinct results in both accuracy and run time. We further extend the most efficient technique to 3-D for comparison, leading to an analytical discussion of the effect of dimensionality on strategies for implementing DDC schemes. Increasing computational resources (cores) within the DDC method produces a trade-off between inter-node communication and on-node work. For an optimally subdivided diffusion problem, the 2-D parallelized algorithm achieves nearly perfect linear speedup in comparison with the serial run, up to around 2700 cores, reducing a 5 h simulation to 8 s, while the 3-D algorithm maintains appreciable speedup up to 1700 cores.
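A serial sketch of the mass-transfer step that the DDC strategy parallelizes, under common MTPT assumptions (Gaussian co-location kernel, radius-limited neighbor search); the variable names, kernel variance choice, and simplified normalization are illustrative, not the paper's implementation:

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

# Illustrative 2-D particle cloud with a point-like initial mass pulse.
n, D, dt = 5000, 1e-4, 1.0
x = rng.uniform(0, 1, (n, 2))
mass = np.zeros(n)
mass[np.argmin(np.linalg.norm(x - 0.5, axis=1))] = 1.0

# Gaussian co-location kernel: pairs farther apart than a few standard
# deviations exchange negligible mass, so a radius-limited search suffices.
var = 4 * D * dt              # one common variance choice for two diffusing particles
cutoff = 4 * np.sqrt(var)
tree = cKDTree(x)
pairs = tree.query_pairs(cutoff, output_type="ndarray")

d2 = np.sum((x[pairs[:, 0]] - x[pairs[:, 1]]) ** 2, axis=1)
w = np.exp(-d2 / (2 * var)) / (2 * np.pi * var)  # 2-D Gaussian weights

# Symmetric pairwise exchange: mass moves toward the pair mean, scaled by
# the kernel weight (normalization kept deliberately crude for the sketch).
dm = (w / n) * (mass[pairs[:, 1]] - mass[pairs[:, 0]])
np.add.at(mass, pairs[:, 0], dm)
np.add.at(mass, pairs[:, 1], -dm)

print("total mass conserved:", np.isclose(mass.sum(), 1.0))
```

Because each exchange is antisymmetric between the two particles in a pair, total mass is conserved exactly; the parallel challenge the paper addresses is that pairs spanning tile boundaries require inter-core communication, which drives the communication/work trade-off it quantifies.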