

Search for: All records

Creators/Authors contains: "Bolster, Diogo"


  1. Free, publicly-accessible full text available February 1, 2026
  2. Free, publicly-accessible full text available March 1, 2026
  3. Free, publicly-accessible full text available February 3, 2026
  4. Free, publicly-accessible full text available September 1, 2025
  5. Compound flooding, the concurrence of multiple flooding mechanisms such as storm surge, heavy rainfall, and riverine flooding, poses a significant threat to coastal communities. To mitigate the impacts of compound flooding, forecasts must represent the variability of flooding drivers over a wide range of spatial scales while remaining timely. One approach to develop these forecasts is through subgrid corrections, which utilize information at smaller scales to “correct” water levels and current velocities averaged over the model scale. Recent studies have shown that subgrid models can improve both accuracy and efficiency; however, existing models are not able to account for the dynamic interactions of hydrologic and hydrodynamic drivers and their contributions to flooding along the smallest flow pathways when using a coarse resolution. Here, we have developed a solver called CoaSToRM (Coastal Subgrid Topography Research Model) with subgrid corrections to compute compound flooding in coastal systems resulting from fluvial, pluvial, tidal, and wind-driven processes. A key contribution is the model’s ability to enforce all flood drivers and use the subgrid corrections to improve the accuracy of the coarse-resolution simulation. The model is validated for Hurricane Eta (2020) in Tampa Bay, showing improved prediction accuracy with subgrid corrections at 42 locations. Subgrid models with coarse resolutions (R² = 0.70, 0.73, 0.77 for 3-, 1.5-, 0.75-km grids) outperform standard counterparts (R² = 0.03, 0.14, 0.26). A 3-km subgrid simulation runs roughly 50 times faster than a 0.75-km subgrid simulation, with similar accuracy.
    Free, publicly-accessible full text available July 10, 2025
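The subgrid-correction idea summarized in the abstract above can be illustrated with a minimal sketch (not CoaSToRM's actual code; the grid, bed elevations, and function names below are hypothetical): fine-scale bed elevations inside one coarse cell determine what fraction of the cell is wet and how much water it holds at a given coarse water level, whereas a standard coarse model treats the whole cell as flooding at once.

```python
import numpy as np

# Hypothetical fine-scale bed elevations (m) inside a single coarse cell.
rng = np.random.default_rng(0)
fine_bed = rng.normal(loc=0.0, scale=0.5, size=(32, 32))

def subgrid_wet_fraction(eta):
    """Fraction of the coarse cell that is wet at coarse water level eta."""
    return np.mean(fine_bed < eta)

def subgrid_volume(eta):
    """Water volume per unit area, averaging fine-scale depths (clipped at 0)."""
    return np.mean(np.clip(eta - fine_bed, 0.0, None))

# A standard coarse model collapses the cell to one mean bed elevation,
# so it is either fully dry or fully wet:
mean_bed = fine_bed.mean()
eta = 0.2
print(subgrid_wet_fraction(eta))   # partial wetting, strictly between 0 and 1
print(subgrid_volume(eta))         # subgrid-corrected volume
print(max(eta - mean_bed, 0.0))    # uncorrected coarse-cell volume
```

In practice such relations are precomputed as lookup tables per coarse cell, which is how a coarse grid can retain fine-scale flooding behavior at low cost.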
  6. Abstract This study investigates the impact of initial injection conditions on colloid transport and retention in porous media. Employing both uniform and flux‐weighted distributions for the initial colloid locations, the research explores diverse flow scenarios, ranging from simple Poiseuille flow to more complex geometries. The results underscore the pivotal role the injection mode plays on the shape of colloid retention profiles (RPs), particularly those that display anomalous non‐exponential decay with distance. Broadly, uniform injection yields multi‐exponential profiles, while flux‐weighted injection can lead to nonmonotonic profiles in certain conditions. The study identifies preferential flow paths as a key factor in producing nonmonotonic RPs. Notably, variations in fluid velocity, colloid size, and ionic strength affect attachment rates near the inlet but do not significantly alter the qualitative transition between multi‐exponential and nonmonotonic profiles. The study emphasizes that the chosen injection mode dictates retention profile shapes, highlighting its crucial role in porous media colloid transport. These insights provide a possible partial explanation of previously observed anomalous transport behaviors, urging consideration of injection conditions in interpretations of experiments, where they can be difficult to accurately control and measure with high precision. 
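The distinction between uniform and flux-weighted injection discussed above can be sketched for the simplest case the abstract mentions, Poiseuille flow (this is an illustrative toy, not the study's code): uniform injection samples initial positions evenly across the channel, while flux-weighted injection samples them in proportion to the local velocity, so flux-weighted particles start in faster streamlines on average.

```python
import numpy as np

# 2-D Poiseuille channel, y in [-1, 1], parabolic velocity u(y) = 1 - y^2.
rng = np.random.default_rng(1)
n = 100_000

def u(y):
    return 1.0 - y**2

# Uniform injection: positions sampled uniformly across the channel cross-section.
y_uniform = rng.uniform(-1.0, 1.0, size=n)

# Flux-weighted injection: positions sampled proportionally to local velocity,
# here via rejection sampling against u(y) (u_max = 1).
def flux_weighted(n):
    out = []
    while len(out) < n:
        y = rng.uniform(-1.0, 1.0, size=n)
        keep = rng.uniform(0.0, 1.0, size=n) < u(y)
        out.extend(y[keep].tolist())
    return np.array(out[:n])

y_flux = flux_weighted(n)

print(u(y_uniform).mean())  # ≈ 2/3 (mean velocity over the cross-section)
print(u(y_flux).mean())     # ≈ 4/5 (flux-weighted mean, E[u^2]/E[u])
```

The gap between the two initial mean velocities is what seeds the different retention-profile shapes: uniform injection places more mass on slow near-wall streamlines where attachment near the inlet is favored.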
  7. Abstract Obtaining high-resolution maps of precipitation data can provide key insights to stakeholders to assess sustainable access to water resources at the urban scale. Mapping a non-stationary, sparse process such as precipitation at very high spatial resolution requires the interpolation of global datasets at the location where ground stations are available with statistical models able to capture complex non-Gaussian global space–time dependence structures. In this work, we propose a new approach based on capturing the spatially varying anisotropy of a latent Gaussian process via a locally deformed stochastic partial differential equation (SPDE) with a buffer allowing for a different spatial structure across land and sea. The finite volume approximation of the SPDE, coupled with integrated nested Laplace approximation, ensures feasible Bayesian inference for tens of millions of observations. The simulation studies showcase the improved predictability of the proposed approach against stationary and no-buffer alternatives. The proposed approach is then used to yield high-resolution simulations of daily precipitation across the United States.
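The SPDE mechanism the abstract above relies on can be illustrated in one dimension (a toy sketch under simplifying assumptions, far from the paper's 2-D finite-volume/INLA machinery; the grid, parameter values, and helper names are hypothetical): discretizing (κ² − d²/dx²) on a grid yields a sparse precision matrix for a Gaussian field whose correlation range scales like 1/κ, so letting κ vary in space produces non-stationary correlation, the 1-D analogue of the locally deformed SPDE.

```python
import numpy as np

n, h = 200, 1.0
x = np.arange(n) * h
# Spatially varying SPDE parameter: short correlation range on the left,
# long range on the right (crudely analogous to a land/sea buffer).
kappa = np.where(x < 100, 0.5, 0.1)

# Finite-difference operator L = kappa^2 * I - D2 (simple Dirichlet-style ends).
D2 = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
      + np.diag(np.ones(n - 1), -1)) / h**2
L = np.diag(kappa**2) - D2

# For the alpha = 2 SPDE, the precision is Q = L^T L (up to mass-matrix details).
Q = L.T @ L
cov = np.linalg.inv(Q)

def corr_len(i):
    """Distance at which correlation with grid point i first drops below 1/e."""
    c = cov[i] / np.sqrt(cov[i, i] * np.diag(cov))
    return np.argmax(c[i:] < np.exp(-1)) * h

print(corr_len(50), corr_len(150))  # the small-kappa half has the longer range
```

In the full method Q stays sparse (the dense inverse here is only for inspecting correlations), which is what makes inference for tens of millions of observations tractable.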
  8. Abstract. Lagrangian particle tracking schemes allow a wide range of flow and transport processes to be simulated accurately, but a major challenge is numerically implementing the inter-particle interactions in an efficient manner. This article develops a multi-dimensional, parallelized domain decomposition (DDC) strategy for mass-transfer particle tracking (MTPT) methods in which particles exchange mass dynamically. We show that this can be efficiently parallelized by employing large numbers of CPU cores to accelerate run times. In order to validate the approach and our theoretical predictions, we focus our efforts on a well-known benchmark problem with pure diffusion, where analytical solutions in any number of dimensions are well established. In this work, we investigate different procedures for “tiling” the domain in two and three dimensions (2-D and 3-D), as this type of formal DDC construction is currently limited to 1-D. An optimal tiling is prescribed based on physical problem parameters and the number of available CPU cores, as each tiling provides distinct results in both accuracy and run time. We further extend the most efficient technique to 3-D for comparison, leading to an analytical discussion of the effect of dimensionality on strategies for implementing DDC schemes. Increasing computational resources (cores) within the DDC method produces a trade-off between inter-node communication and on-node work. For an optimally subdivided diffusion problem, the 2-D parallelized algorithm achieves nearly perfect linear speedup in comparison with the serial run, up to around 2700 cores, reducing a 5 h simulation to 8 s, while the 3-D algorithm maintains appreciable speedup up to 1700 cores.
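The tiling trade-off described in the abstract above can be sketched with a toy optimizer (illustrative only; the function and parameter names are hypothetical, and the paper's actual prescription also accounts for physical problem parameters): since inter-tile communication in 2-D scales with tile perimeter, a factorization px × py = n_cores that keeps tiles near-square for the given domain aspect ratio minimizes communication per tile.

```python
def best_tiling_2d(n_cores, Lx, Ly):
    """Pick (px, py) with px * py == n_cores minimizing the perimeter of each
    tile of size (Lx/px) x (Ly/py), a proxy for inter-tile communication."""
    best = None
    for px in range(1, n_cores + 1):
        if n_cores % px:
            continue
        py = n_cores // px
        perim = 2.0 * (Lx / px + Ly / py)
        if best is None or perim < best[0]:
            best = (perim, px, py)
    return best[1], best[2]

print(best_tiling_2d(64, 100.0, 100.0))  # square domain -> square tiling (8, 8)
print(best_tiling_2d(64, 400.0, 100.0))  # elongated domain -> more tiles along x
```

The same perimeter-vs-area reasoning extends to 3-D with tile surface area as the communication proxy, which is why dimensionality changes the optimal strategy as the abstract discusses.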