Abstract Models of bathymetry derived from satellite radar altimetry are essential for modeling many marine processes. They are affected by uncertainties which require quantification. We propose an uncertainty model that assumes errors are caused by the lack of high‐wavenumber content within the altimetry data. The model is then applied to a tsunami hazard assessment. We build a bathymetry uncertainty model for northern Chile. Statistical properties of the altimetry‐predicted bathymetry error are obtained using multibeam data. We find that a von Kármán correlation function and a Laplacian marginal distribution can be used to define an uncertainty model based on a random field. We also propose a method for generating synthetic bathymetry samples conditioned on shipboard measurements. The method is further extended to account for interpolation uncertainties when bathymetry data resolution is finer than ∼10 km. We illustrate the usefulness of the method by quantifying the bathymetry‐induced uncertainty of a tsunami hazard estimate. We demonstrate that tsunami leading‐wave predictions at middle/near‐field tide gauges and buoys are insensitive to bathymetry uncertainties in Chile. This result implies that tsunami early warning approaches can take full advantage of altimetry‐predicted bathymetry in numerical simulations. Finally, we evaluate the feasibility of modeling uncertainties in regions without multibeam data by assessing the bathymetry error statistics of 15 globally distributed regions. We find that a general von Kármán correlation and a Laplacian marginal distribution can serve as a first‐order approximation. The standard deviation of the uncertainty random field model varies regionally and is estimated from a proposed scaling law.
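The random-field construction described above, a von Kármán correlation combined with a Laplacian marginal, can be sketched as spectral synthesis of a Gaussian field followed by a Gaussian copula transform. The function name, the 1-D setting, and all default parameter values below are illustrative assumptions, not the values fitted in the study:

```python
import numpy as np
from scipy.stats import norm, laplace

def von_karman_field(n, dx, corr_len, nu=0.5, sigma=1.0, seed=0):
    """Draw a 1-D stationary Gaussian field whose power spectrum follows a
    von Karman form, then map its marginal to a Laplace distribution via a
    Gaussian copula (illustrative sketch, not the study's code)."""
    rng = np.random.default_rng(seed)
    k = 2 * np.pi * np.fft.rfftfreq(n, d=dx)
    # 1-D von Karman power spectral density (unnormalized)
    psd = (1.0 + (k * corr_len) ** 2) ** (-(nu + 0.5))
    # spectral synthesis: complex white noise shaped by sqrt(PSD)
    noise = rng.normal(size=k.size) + 1j * rng.normal(size=k.size)
    g = np.fft.irfft(noise * np.sqrt(psd), n=n)
    g = (g - g.mean()) / g.std()               # standard Gaussian marginal
    # copula transform: Gaussian CDF -> uniform -> Laplace quantile
    u = norm.cdf(g)
    return laplace.ppf(u, loc=0.0, scale=sigma / np.sqrt(2.0))
```

A 2-D version would shape a 2-D white-noise spectrum the same way; conditioning on shipboard measurements would additionally require kriging-style constraints, which are omitted here.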
Statistical patterns of theory uncertainties
A comprehensive uncertainty estimation is vital for the precision program of the LHC. While experimental uncertainties are often described by stochastic processes and well-defined nuisance parameters, theoretical uncertainties lack such a description. We study uncertainty estimates for cross-section predictions based on scale variations across a large set of processes. We find patterns similar to a stochastic origin, with accurate uncertainties for processes mediated by the strong force, but a systematic underestimate for electroweak processes. We propose an improved scheme, based on the scale variation of reference processes, which reduces outliers in the mapping from leading order to next-to-leading order in perturbation theory.
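As a concrete sketch of the kind of estimate being studied, the conventional 7-point scale-variation envelope can be computed as below; the helper name and the toy cross-section function in the test are placeholders, not results from the paper:

```python
import itertools

def scale_variation_uncertainty(sigma, central=(1.0, 1.0)):
    """Conventional 7-point envelope: vary the renormalization and
    factorization scales by factors of 2 around the central choice,
    excluding the two opposite-extreme pairs (2, 1/2) and (1/2, 2).
    `sigma` maps (muR/mu0, muF/mu0) -> cross section."""
    points = [(r, f)
              for r, f in itertools.product([0.5, 1.0, 2.0], repeat=2)
              if (r / f) not in (4.0, 0.25)]   # drop opposite extremes
    values = [sigma(r, f) for r, f in points]
    s0 = sigma(*central)
    # asymmetric up/down uncertainty from the envelope
    return s0, max(values) - s0, s0 - min(values)
```

The paper's improved scheme replaces this per-process envelope with variations calibrated on reference processes; only the baseline prescription is sketched here.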
- Award ID(s): 2210283
- PAR ID: 10518407
- Publisher / Repository: SciPost
- Date Published:
- Journal Name: SciPost Physics Core
- Volume: 6
- Issue: 2
- ISSN: 2666-9366
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract Subgrid‐scale processes, such as atmospheric gravity waves (GWs), play a pivotal role in shaping the Earth's climate but cannot be explicitly resolved in climate models due to limitations on resolution. Instead, subgrid‐scale parameterizations are used to capture their effects. Recently, machine learning (ML) has emerged as a promising approach to learn parameterizations. In this study, we explore uncertainties associated with a ML parameterization for atmospheric GWs. Focusing on the uncertainties in the training process (parametric uncertainty), we use an ensemble of neural networks to emulate an existing GW parameterization. We estimate both offline uncertainties in raw NN output and online uncertainties in climate model output, after the neural networks are coupled. We find that online parametric uncertainty contributes a significant source of uncertainty in climate model output that must be considered when introducing NN parameterizations. This uncertainty quantification provides valuable insights into the reliability and robustness of ML‐based GW parameterizations, thus advancing our understanding of their potential applications in climate modeling.
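A minimal sketch of the parametric-uncertainty estimate: train an ensemble of identically configured networks that differ only in their initialization seed, then read off the spread of their predictions. The tiny NumPy network and the sine target below are stand-ins for the actual GW parameterization and emulator:

```python
import numpy as np

def train_mlp(x, y, hidden=16, epochs=300, lr=0.02, seed=0):
    """One-hidden-layer tanh network trained by full-batch gradient
    descent on the MSE loss; the seed controls only the weight
    initialization, which is the source of parametric uncertainty."""
    rng = np.random.default_rng(seed)
    w1 = rng.normal(0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
    w2 = rng.normal(0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        h = np.tanh(x @ w1 + b1)
        pred = h @ w2 + b2
        g = 2 * (pred - y) / len(x)            # dL/dpred for MSE
        gw2 = h.T @ g; gb2 = g.sum(0)
        gh = (g @ w2.T) * (1 - h ** 2)         # backprop through tanh
        gw1 = x.T @ gh; gb1 = gh.sum(0)
        w1 -= lr * gw1; b1 -= lr * gb1
        w2 -= lr * gw2; b2 -= lr * gb2
    return lambda xq: np.tanh(xq @ w1 + b1) @ w2 + b2

# ensemble of emulators differing only in initialization seed
x = np.linspace(-1, 1, 64)[:, None]
y = np.sin(3 * x)                              # stand-in "parameterization"
members = [train_mlp(x, y, seed=s) for s in range(8)]
preds = np.stack([m(x) for m in members])
mean, spread = preds.mean(0), preds.std(0)     # offline parametric uncertainty
```

In the coupled (online) setting, each ensemble member would instead be run inside the climate model and the spread measured on the model output.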
We propose a high-order stochastic–statistical moment closure model for efficient ensemble prediction of leading-order statistical moments and probability density functions in multiscale complex turbulent systems. The statistical moment equations are closed by a precise calibration of the high-order feedbacks using ensemble solutions of the consistent stochastic equations, suitable for modeling complex phenomena including non-Gaussian statistics and extreme events. To address challenges associated with closely coupled spatiotemporal scales in turbulent states and expensive large-ensemble simulation for high-dimensional systems, we introduce efficient computational strategies using the random batch method (RBM). This approach significantly reduces the required ensemble size while accurately capturing essential high-order structures. Only a small batch of small-scale fluctuation modes is used for each time update of the samples, and exact convergence to the full model statistics is ensured through frequent resampling of the batches during time evolution. Furthermore, we develop a reduced-order model to handle very high-dimensional systems by linking the large number of small-scale fluctuation modes to ensemble samples of dominant leading modes. The effectiveness of the proposed models is validated by numerical experiments on the one-layer and two-layer Lorenz '96 systems, which exhibit representative chaotic features and various statistical regimes. The full and reduced-order RBM models demonstrate uniformly high skill in capturing the time evolution of crucial leading-order statistics and non-Gaussian probability distributions, while achieving significantly lower computational cost compared to direct Monte Carlo approaches. The models provide effective tools for a wide range of real-world applications in prediction, uncertainty quantification, and data assimilation.
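The random-batch idea can be illustrated on a toy interacting-particle system: at each step the particles are reshuffled into small batches and interact only within their batch, reducing the O(N²) pairwise cost to O(pN). The mean-reverting interaction and all parameters below are assumed stand-ins, not the paper's turbulent-system equations:

```python
import numpy as np

def rbm_step(x, dt, p, rng, noise=0.1):
    """One random-batch update: draw a fresh random partition into
    batches of size p, then let each particle feel only its batch.
    Frequent reshuffling recovers full-interaction statistics on
    average (toy illustration, not the paper's code)."""
    n = len(x)
    idx = rng.permutation(n)
    for b in idx.reshape(-1, p):               # assumes p divides n
        xb = x[b]
        force = xb.mean() - xb                 # attraction to batch mean
        x[b] = xb + dt * force + np.sqrt(dt) * noise * rng.normal(size=p)
    return x

rng = np.random.default_rng(1)
x = rng.normal(size=128)
for _ in range(200):                           # time evolution with resampling
    x = rbm_step(x, dt=0.05, p=8, rng=rng)
```

Each step touches every particle once but evaluates only within-batch interactions, which is the source of the cost reduction described above.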
North Atlantic tropical cyclone (TC) activity under a high-emission scenario is projected using a statistical synthetic storm model coupled with nine Coupled Model Intercomparison Project Phase 6 (CMIP6) climate models. The ensemble projection shows that the annual frequency of TCs generated in the basin will decrease from 15.91 (1979-2014) to 12.16 (2075-2100), and TC activity will shift poleward and coastward. The mean lifetime maximum intensity will increase from 66.50 knots to 75.04 knots. Large discrepancies in TC frequency and intensity projections are found among the nine CMIP6 climate models. The uncertainty in the projection of wind shear is the leading cause of the discrepancies in the TC climatology projection, dominating the uncertainties in the projection of thermodynamic parameters such as potential intensity and saturation deficit. The uncertainty in the projection of wind shear may be related to the different projections of the horizontal gradient of vertically integrated temperature in the climate models, which can be induced by different parameterizations of physical processes including surface processes, sea ice, and cloud feedback. Informed by the uncertainty analysis, a surrogate model is developed to provide a first-order estimate of TC activity in climate models based on large-scale environmental features.
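The surrogate idea can be sketched as an ordinary least-squares fit of TC frequency against large-scale environmental predictors across climate models. All numbers below are synthetic placeholders, not CMIP6 diagnostics, and the three predictors are only labeled after the quantities named above:

```python
import numpy as np

# Illustrative predictors per climate model: [wind shear, potential
# intensity, saturation deficit], standardized. Synthetic data only.
rng = np.random.default_rng(0)
X = rng.normal(size=(9, 3))                    # 9 hypothetical climate models
freq = 14 - 1.5 * X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 0.3, 9)

A = np.column_stack([np.ones(9), X])           # design matrix with intercept
coef, *_ = np.linalg.lstsq(A, freq, rcond=None)
predicted = A @ coef                           # first-order frequency estimate
```

The fitted coefficients play the role of sensitivities of TC activity to each environmental feature; a real application would use diagnosed CMIP6 fields in place of the synthetic `X`.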
Abstract The solidification phenomenon is an integral part of metal manufacturing processes, where quantifying stochastic variations and manufacturing uncertainties is critically important. Accurate molecular dynamics (MD) simulations of metal solidification and the resulting properties require excessive computational expense for probabilistic stochastic analyses, where thousands of random realizations are necessary. The adoption of inadequate model sizes and time scales in MD simulations leads to inaccuracies in each random realization, causing a large cumulative statistical error in the probabilistic results obtained through Monte Carlo (MC) simulations. In this work, we present a machine learning (ML) approach, as a data-driven surrogate to MD simulations, which needs only a few MD simulations. This efficient yet high-fidelity ML approach enables MC simulations for full-scale probabilistic characterization of solidified metal properties, considering stochasticity in influencing factors such as temperature and strain rate. Unlike conventional ML models, the proposed hybrid polynomial correlated function expansion, being a Bayesian ML approach, is data efficient. Further, it can account for the effect of uncertainty in the training data by exploiting the mean and standard deviation of the MD simulations, which in principle addresses the issue of repeatability in stochastic simulations with low variance. Stochastic numerical results for solidified aluminum are presented based on complete probabilistic uncertainty quantification of mechanical properties such as Young's modulus, yield strength, and ultimate strength, illustrating that the proposed error-inclusive data-driven framework can reasonably predict the properties with a significant level of computational efficiency.
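The surrogate-plus-Monte-Carlo pattern described above can be sketched as follows: fit a cheap polynomial surrogate on a handful of expensive evaluations, then propagate stochastic inputs through the surrogate instead of rerunning MD. The functional form, units, and numbers are hypothetical stand-ins for MD data, and a plain quadratic least-squares fit is used in place of the paper's Bayesian correlated function expansion:

```python
import numpy as np

# Hypothetical few-sample "MD" evaluations of a property (e.g. yield
# strength, MPa) at training points in (temperature, strain rate).
rng = np.random.default_rng(0)
T = rng.uniform(300, 900, 12)                  # temperature, K
eps = rng.uniform(1e7, 1e9, 12)                # strain rate, 1/s
y = 500 - 0.2 * T + 15 * np.log10(eps) + rng.normal(0, 2, 12)

# cheap quadratic surrogate in (T, log10 strain rate)
Z = np.column_stack([np.ones(12), T, np.log10(eps), T**2, np.log10(eps)**2])
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)

# Monte Carlo over stochastic processing conditions, evaluated on the
# surrogate instead of thousands of MD realizations
Ts = rng.normal(600, 50, 100_000)
es = 10 ** rng.normal(8, 0.3, 100_000)
Zs = np.column_stack([np.ones_like(Ts), Ts, np.log10(es),
                      Ts**2, np.log10(es)**2])
samples = Zs @ coef
mu, sd = samples.mean(), samples.std()         # property mean and spread
```

The paper's approach additionally feeds the standard deviation of repeated MD runs into the surrogate's training, which a deterministic least-squares fit like this one cannot represent.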