Accurate mapping of nearshore bathymetry is essential for coastal management, navigation, and environmental monitoring. Traditional bathymetric mapping methods such as sonar surveys and LiDAR are time-consuming and costly. This paper introduces BathyFormer, a novel vision transformer encoder-based deep learning model designed to estimate nearshore bathymetry from high-resolution multispectral satellite imagery. The methodology involves training BathyFormer on a dataset of satellite images and corresponding bathymetric data from the Continuously Updated Digital Elevation Model (CUDEM). The model learns to predict water depths by analyzing the spectral signatures and spatial patterns present in the multispectral imagery. Validation of the estimated bathymetry maps against independent hydrographic survey data yields a root mean squared error (RMSE) of 0.55 to 0.73 m at depths of 2 to 5 m across three Chesapeake Bay locations that were held out of the training set. The approach shows significant promise for large-scale, cost-effective mapping of shallow nearshore bathymetry, providing a valuable tool for coastal scientists, marine planners, and environmental managers.
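The RMSE reported above is straightforward to reproduce once model estimates and survey depths are co-located. The sketch below uses illustrative placeholder arrays, not the paper's data:

```python
# Minimal sketch of the validation metric used above: RMSE between
# model-estimated depths and independent hydrographic survey depths.
# The arrays are illustrative placeholders, not data from the paper.
import numpy as np

def rmse(predicted, observed):
    """Root mean squared error over co-located depth samples (meters)."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

est = np.array([2.4, 3.1, 4.8, 3.9])  # model-style depth estimates (m)
srv = np.array([2.0, 3.5, 4.2, 4.4])  # hydrographic survey depths (m)
print(f"RMSE = {rmse(est, srv):.2f} m")
```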
This content will become publicly available on August 1, 2026
Investigating cost-effective single-beam survey configurations for accurate river bathymetry construction
River bathymetry is needed to accurately simulate river hydrodynamics. Bathymetric data are typically collected through boat-mounted single- or multibeam echosounder surveys. The detailed bathymetry from multibeam surveys may exceed the requirements of standard 1D and 2D river hydraulic models, and compared to such data-intensive but expensive surveys, single-beam surveys are cost-effective: coupled with appropriate preprocessing and interpolation techniques, they can sufficiently inform river simulations. This study contrasts two survey patterns, the commonly used but under-studied zigzag pattern and the traditional cross-sectional pattern. Linear and anisotropic kriging interpolation, two widely used methods, are applied to construct bathymetric meshes from the different survey configurations. Results highlight efficient survey configurations for both cross-sectional and zigzag patterns that balance accuracy and cost. Notably, zigzag surveys approach the efficacy of cross-sectional surveys when line spacing falls below a certain threshold, but kriging interpolation performs poorly with sparse zigzag surveys. These findings bridge gaps in previous research by offering nuanced comparisons between survey configurations and interpolation methods, providing a comparative basis for more effective planning and use of single-beam surveys without advocating any specific survey pattern or interpolation technique.
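As a concrete illustration of the two interpolation families compared here, the sketch below interpolates synthetic single-beam soundings onto a regular mesh with both a linear (Delaunay-based) method and ordinary kriging. This is not the authors' code; the coordinates, variogram model, and anisotropy settings are illustrative assumptions, and the scipy and pykrige packages are assumed:

```python
import numpy as np
from scipy.interpolate import griddata
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(0)
x = rng.uniform(0, 100, 200)                     # across-channel coordinate (m)
y = rng.uniform(0, 500, 200)                     # along-channel coordinate (m)
z = 2.0 + 0.02 * y + rng.normal(0.0, 0.1, 200)   # synthetic depths (m)

# Target mesh for the bathymetry surface
gx = np.linspace(0, 100, 51)
gy = np.linspace(0, 500, 251)
GX, GY = np.meshgrid(gx, gy)

# 1) Linear (Delaunay-based) interpolation onto the mesh
z_lin = griddata((x, y), z, (GX, GY), method="linear")

# 2) Ordinary kriging; anisotropy_scaling/anisotropy_angle stretch the
#    coordinate system so correlation reaches farther along-channel
#    (values illustrative, not the study's calibrated settings)
ok = OrdinaryKriging(x, y, z, variogram_model="linear",
                     anisotropy_scaling=5.0, anisotropy_angle=90.0)
z_krig, krig_var = ok.execute("grid", gx, gy)
```

In pykrige, the anisotropy parameters rescale distances so that correlation extends farther along the channel than across it, which is the usual rationale for anisotropic kriging on rivers.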
- Award ID(s): 1948938
- PAR ID: 10621697
- Publisher / Repository: Elsevier Journals
- Date Published:
- Journal Name: Geomorphology
- Volume: 482
- Issue: C
- ISSN: 0169-555X
- Page Range / eLocation ID: 109803
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Abstract: Several studies have focused on the importance of river bathymetry (channel geometry) in hydrodynamic routing along individual reaches. However, its effect on other watershed processes such as infiltration and surface water (SW)-groundwater (GW) interactions has not been explored across large river networks. Surface and subsurface processes are interdependent; therefore, errors due to inaccurate representation of one watershed process can cascade across other hydraulic or hydrologic processes. This study hypothesizes that accurate bathymetric representation is not only essential for simulating channel hydrodynamics but also affects subsurface processes by impacting SW-GW interactions. Moreover, quantifying the effect of bathymetry on surface and subsurface hydrological processes across a river network can facilitate an improved understanding of how bathymetric characteristics affect these processes across large spatial domains. The study tests this hypothesis by developing physically based distributed models capable of bidirectional (SW-GW) coupling, with four configurations at progressively reduced levels of bathymetric representation. A comparison of hydrologic and hydrodynamic outputs shows that the change in channel geometry across the four configurations has a considerable effect on infiltration, lateral seepage, and the location of the water table across the entire river network. For example, when using bathymetry with inaccurate channel conveyance capacity but accurate channel depth, the peak lateral seepage rate exhibited 58% error. The results provide insights into the level of bathymetric detail required for accurately simulating flooding-related physical processes, while also highlighting potential issues with ignoring bathymetry across lower-order streams, such as spurious backwater flow, inaccurate water table elevations, and incorrect inundation extents.
- Spatial interpolation techniques play an important role in hydrology, as many point observations must be interpolated to create continuous surfaces. Despite the availability of several tools and methods for interpolating data, not all of them work consistently for hydrologic applications. One technique, the Laplace equation, which is used in hydrology for creating flownets, has rarely been used for data interpolation. The objective of this study is to examine the efficiency of the Laplace formulation (LF) in interpolating hydrologic data and to compare it with other widely used methods such as inverse distance weighting (IDW), natural neighbor, and ordinary kriging. The performance of LF relative to these methods is evaluated using quantitative measures, including root mean squared error (RMSE) and coefficient of determination (R²) for accuracy, visual assessment for surface quality, and computational cost for operational efficiency and speed. Data on surface elevation, river bathymetry, precipitation, temperature, and soil moisture are used for different areas in the United States. The RMSE and R² results show that LF is comparable to the other methods in accuracy. LF is easy to use, as it requires fewer input parameters than IDW and kriging, and it is faster than the other methods when the datasets are not large. Overall, LF offers a robust alternative to existing interpolation methods for various hydrologic data, although further work is required to improve its computational efficiency. (A minimal sketch of Laplace-grid interpolation appears after this list.)
- Abstract: Subglacial topography is an important feature in numerous ice-sheet analyses and can drive the routing of water at the bed. Bed topography is primarily measured with ice-penetrating radar, but significant gaps remain in data coverage that require interpolation. Topographic interpolations are typically made with kriging, as well as with mass conservation, where ice flow dynamics are used to constrain bed geometry. However, these techniques generate bed topography that is unrealistically smooth at small scales, which biases subglacial water flowpath models and makes it difficult to rigorously quantify uncertainty in subglacial drainage patterns. To address this challenge, we adapt a geostatistical simulation method with probabilistic modeling to stochastically simulate bed topography such that the interpolated topography retains the spatial statistics of the ice-penetrating radar data. We use this method to simulate subglacial topography with mass-conservation topography as a secondary constraint and apply a water routing model to each realization. Our results show that many of the flowpaths change significantly with each topographic realization, demonstrating that geostatistical simulation can be useful for assessing confidence in subglacial flowpaths. (A simplified sketch of the realization idea appears after this list.)
- Abstract: Gridded monthly rainfall estimates can be used for a number of research applications, including hydrologic modeling and weather forecasting. Automated interpolation algorithms, such as the "autoKrige" function in R, can produce gridded rainfall estimates that validate well but produce unrealistic spatial patterns. In this work, an optimized geostatistical kriging approach is used to interpolate relative rainfall anomalies, which are then combined with long-term means to develop the gridded estimates. The optimization consists of the following: 1) determining the most appropriate offset (constant) to use when log-transforming data; 2) eliminating poor-quality data prior to interpolation; 3) detecting erroneous maps using a machine learning algorithm; and 4) selecting the most appropriate parameterization scheme for fitting the model used in the interpolation. Results of this effort include a 30-yr (1990–2019), high-resolution (250-m) gridded monthly rainfall time series for the state of Hawai‘i. Leave-one-out cross validation (LOOCV) is performed using an extensive network of 622 observation stations. LOOCV results are in good agreement with observations (R² = 0.78; MAE = 55 mm month⁻¹, or 1.4%); however, predictions can underestimate high rainfall observations (bias = 34 mm month⁻¹; −1%) due to a well-known smoothing effect that occurs with kriging. This research highlights the fact that validation statistics should not be the sole source of error assessment and that default parameterizations for automated interpolation may need to be modified to produce realistic gridded rainfall surfaces. Data products can be accessed through the Hawai‘i Climate Data Portal (HCDP; http://www.hawaii.edu/climate-data-portal). Significance Statement: A new method is developed to map rainfall in Hawai‘i using an optimized geostatistical kriging approach. A machine learning technique is used to detect erroneous rainfall maps, and several conditions are implemented to select the optimal parameterization scheme for fitting the model used in the kriging interpolation. A key finding is that optimization of the interpolation approach is necessary because maps may validate well but have unrealistic spatial patterns. This approach demonstrates how, with a moderate amount of data, a low-level machine learning algorithm can be trained to evaluate and classify an unrealistic map output. (A sketch of the anomaly-kriging workflow appears after this list.)
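For the Laplace-formulation item above, the following is a minimal sketch (not the study's code) of Laplace-equation interpolation on a regular grid: observed cells are held fixed while unknown cells are relaxed toward the discrete Laplace solution with Jacobi updates. The edge handling, tolerance, and iteration cap are simplifying assumptions:

```python
# Minimal Laplace-equation interpolation sketch: unknown grid cells are
# relaxed toward the 4-neighbor average (Jacobi iteration) while cells
# holding observations stay fixed. Edge cells are not updated, and the
# tolerance/iteration limits are illustrative choices.
import numpy as np

def laplace_interpolate(grid, known_mask, tol=1e-4, max_iter=10_000):
    """grid: 2-D array with observed values; known_mask: True where observed."""
    z = grid.copy().astype(float)
    z[~known_mask] = grid[known_mask].mean()   # initial guess for unknowns
    for _ in range(max_iter):
        z_new = z.copy()
        z_new[1:-1, 1:-1] = 0.25 * (z[:-2, 1:-1] + z[2:, 1:-1] +
                                    z[1:-1, :-2] + z[1:-1, 2:])
        z_new[known_mask] = grid[known_mask]   # re-impose observations
        if np.max(np.abs(z_new - z)) < tol:    # converged
            return z_new
        z = z_new
    return z
```

Because the only inputs are the grid and the mask, this scheme needs no variogram or weighting exponent, which matches the item's point that LF requires fewer input parameters than kriging or IDW.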
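For the subglacial-topography item, the sketch below illustrates the realization idea in a deliberately simplified form: a smooth interpolated bed is perturbed with spatially correlated noise so that each ensemble member restores small-scale roughness. This filtered-white-noise stand-in is much cruder than the conditional geostatistical simulation the authors use, and every parameter shown is an assumption:

```python
# Simplified stand-in for stochastic bed-topography simulation: perturb a
# smooth interpolated bed with correlated noise and return an ensemble of
# realizations. Real geostatistical simulation would also condition on the
# radar picks; amplitude and correlation length here are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter

def bed_realizations(smooth_bed, n_real=10, sigma_m=20.0, corr_px=5.0, seed=0):
    rng = np.random.default_rng(seed)
    ensemble = []
    for _ in range(n_real):
        noise = gaussian_filter(rng.normal(size=smooth_bed.shape), corr_px)
        noise *= sigma_m / noise.std()   # rescale to target roughness (m)
        ensemble.append(smooth_bed + noise)
    return ensemble  # route water over each member to map flowpath spread
```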
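Finally, for the rainfall-mapping item, here is a hedged sketch of the anomaly-kriging workflow it describes: station totals become relative anomalies, which are log-transformed with an offset, kriged, back-transformed, and recombined with a long-term mean grid. The function name, offset value, and spherical variogram are placeholders rather than the paper's optimized choices (pykrige assumed):

```python
# Sketch of anomaly kriging for monthly rainfall: krige offset-log
# anomalies, then multiply the back-transformed anomaly surface by the
# long-term mean grid. Offset and variogram model are placeholders that
# the paper selects through its optimization steps.
import numpy as np
from pykrige.ok import OrdinaryKriging

def monthly_rainfall_grid(x, y, obs_mm, mean_at_sta, mean_grid, gx, gy,
                          offset=0.05):
    anom = obs_mm / mean_at_sta                    # relative anomaly
    t = np.log(anom + offset)                      # offset log-transform
    ok = OrdinaryKriging(x, y, t, variogram_model="spherical")
    t_grid, _ = ok.execute("grid", gx, gy)         # kriged transform
    anom_grid = np.exp(t_grid) - offset            # back-transform
    return anom_grid * mean_grid                   # mm for the month
```

Leave-one-out cross validation as described in the item then amounts to refitting this surface 622 times, each time withholding one station and comparing its prediction against the observation.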