Parameters in climate models are usually calibrated manually, exploiting only small subsets of the available data. This precludes both optimal calibration and quantification of uncertainties. Traditional Bayesian calibration methods that allow uncertainty quantification are too expensive for climate models; they are also not robust in the presence of internal climate variability. For example, Markov chain Monte Carlo (MCMC) methods typically require a prohibitively large number of model runs.
- NSF-PAR ID: 10227741
- Date Published:
- Journal Name: Journal of Verification, Validation and Uncertainty Quantification
- Volume: 6
- Issue: 1
- ISSN: 2377-2158
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract Markov chain Monte Carlo (MCMC) methods typically require a large number of model runs and are sensitive to internal variability noise, rendering them infeasible for climate models. Here we demonstrate an approach to model calibration and uncertainty quantification that requires only a relatively small number of model runs and can accommodate internal climate variability. The approach consists of three stages: (a) a calibration stage uses variants of ensemble Kalman inversion to calibrate a model by minimizing mismatches between model and data statistics; (b) an emulation stage emulates the parameter‐to‐data map with Gaussian processes (GP), using the model runs in the calibration stage for training; (c) a sampling stage approximates the Bayesian posterior distributions by sampling the GP emulator with MCMC. We demonstrate the feasibility and computational efficiency of this calibrate‐emulate‐sample (CES) approach in a perfect‐model setting. Using an idealized general circulation model, we estimate parameters in a simple convection scheme from synthetic data generated with the model. The CES approach generates probability distributions of the parameters that are good approximations of the Bayesian posteriors, at a fraction of the computational cost usually required to obtain them. Sampling from this approximate posterior allows the generation of climate predictions with quantified parametric uncertainties.
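To make the three stages concrete, here is a minimal Python sketch of a calibrate‐emulate‐sample loop for a generic forward map. The toy forward map G, the ensemble size, the scikit-learn GP emulator, and the random-walk MCMC settings are illustrative assumptions, not the configuration used in the study.

```python
# Calibrate-emulate-sample (CES) sketch: ensemble Kalman inversion (EKI),
# GP emulation of the parameter-to-data map, then MCMC on the emulator.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

def G(theta):
    """Toy forward map from two parameters to two data statistics."""
    return np.array([np.sin(theta[0]) + theta[1], theta[0] * theta[1]])

# Synthetic observations with known noise covariance (perfect-model setting).
theta_true = np.array([1.2, 0.5])
Gamma = 0.05 ** 2 * np.eye(2)
y = G(theta_true) + rng.multivariate_normal(np.zeros(2), Gamma)

# (a) Calibration: ensemble Kalman inversion, storing all runs for emulator training.
J, n_iter = 50, 10
ensemble = rng.normal(0.0, 1.0, size=(J, 2))
train_X, train_Y = [], []
for _ in range(n_iter):
    Gs = np.array([G(th) for th in ensemble])
    train_X.append(ensemble.copy())
    train_Y.append(Gs.copy())
    th_mean, G_mean = ensemble.mean(axis=0), Gs.mean(axis=0)
    C_tg = (ensemble - th_mean).T @ (Gs - G_mean) / J   # parameter-output covariance
    C_gg = (Gs - G_mean).T @ (Gs - G_mean) / J          # output-output covariance
    K = C_tg @ np.linalg.inv(C_gg + Gamma)              # Kalman gain
    y_perturbed = y + rng.multivariate_normal(np.zeros(2), Gamma, size=J)
    ensemble = ensemble + (y_perturbed - Gs) @ K.T      # EKI update of each member

# (b) Emulation: one GP per output component, trained on the EKI runs.
X, Y = np.vstack(train_X), np.vstack(train_Y)
gps = [GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True).fit(X, Y[:, i])
       for i in range(Y.shape[1])]

def emulate(theta):
    return np.array([gp.predict(theta[None, :])[0] for gp in gps])

# (c) Sampling: random-walk Metropolis on the emulated posterior (standard normal prior).
def log_post(theta):
    r = y - emulate(theta)
    return -0.5 * r @ np.linalg.solve(Gamma, r) - 0.5 * theta @ theta

theta = ensemble.mean(axis=0)
lp = log_post(theta)
samples = []
for _ in range(5000):
    prop = theta + 0.05 * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
print("approximate posterior mean:", np.mean(samples, axis=0))
```

Note that the EKI iterations serve double duty here: they drive the ensemble toward the data while generating training points for the emulator in the region where the posterior mass lies.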
-
Abstract Computer model calibration typically operates by fine-tuning parameter values in a computer model so that the model output faithfully predicts reality. By using performance targets in place of observed data, we show that calibration techniques can be repurposed for solving multi-objective design problems. Our approach allows us to consider all relevant sources of uncertainty as an integral part of the design process. We demonstrate our proposed approach both through simulation and by fine-tuning material design settings to meet performance targets for a wind turbine blade.
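The core idea of reusing a calibration objective for design can be illustrated with a short sketch: performance targets take the place of observations in the likelihood, and the "posterior" concentrates on design settings whose predicted performance meets those targets. The simulator, targets, and tolerances below are invented placeholders, not the wind-turbine application or the authors' method.

```python
# Design-by-calibration sketch: performance targets play the role of data.
import numpy as np

rng = np.random.default_rng(1)

def simulator(x):
    """Hypothetical cheap model mapping two design settings to two performance metrics."""
    stiffness = 2.0 * x[0] + 0.5 * x[1] ** 2
    mass = x[0] + 2.0 * x[1]
    return np.array([stiffness, mass])

targets = np.array([3.0, 2.5])   # desired performance values (playing the role of "data")
tol = np.array([0.1, 0.2])       # acceptable deviation for each objective

def log_post(x):
    # Gaussian "likelihood" centered on the targets plus a weak prior over the design space.
    r = (simulator(x) - targets) / tol
    return -0.5 * r @ r - 0.05 * x @ x

# Random-walk Metropolis over design settings: samples concentrate on designs whose
# predicted performance meets the targets within tolerance, with uncertainty retained.
x = np.array([1.0, 1.0])
lp = log_post(x)
samples = []
for _ in range(20000):
    prop = x + 0.05 * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        x, lp = prop, lp_prop
    samples.append(x)
print("design settings meeting the targets (posterior mean):", np.mean(samples, axis=0))
```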
-
Liquefaction under cyclic loads can be predicted through advanced (liquefaction-capable) material constitutive models. However, such constitutive models have several input parameters whose values are often unknown or imprecisely known, requiring calibration via lab/in-situ test data. This study proposes a Bayesian updating framework that integrates probabilistic calibration of the soil model and probabilistic prediction of lateral spreading due to seismic liquefaction. In particular, the framework consists of three main parts: (1) Parametric study based on global sensitivity analysis, (2) Bayesian calibration of the primary input parameters of the constitutive model, and (3) Forward uncertainty propagation through a computational model simulating the response of a soil column under earthquake loading. For demonstration, the PM4Sand model is adopted, and cyclic strength data of Ottawa F-65 sand from cyclic direct simple shear tests are utilized to calibrate the model. The three main uncertainty analyses are performed using quoFEM, a SimCenter open-source software application for uncertainty quantification and optimization in the field of natural hazard engineering. The results demonstrate the potential of the framework linked with quoFEM to perform calibration and uncertainty propagation using sophisticated simulation models that can be part of a performance-based design workflow.
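The three-part workflow (sensitivity screen, Bayesian calibration, forward propagation) can be sketched generically as follows. The functions and data here are invented stand-ins; this is not the quoFEM interface, the PM4Sand constitutive model, or the global sensitivity method used in the study.

```python
# Generic sensitivity -> calibration -> propagation sketch with toy models.
import numpy as np

rng = np.random.default_rng(2)

def cyclic_strength(params, n_cycles):
    """Toy stand-in for a constitutive-model prediction of cyclic resistance."""
    a, b = params
    return a * n_cycles ** (-b)

def lateral_spread(params):
    """Toy stand-in for a site-response simulation driven by the same parameters."""
    a, b = params
    return 0.4 * a + 1.5 * b

# Synthetic "lab data": cyclic stress ratio vs. number of cycles to liquefaction.
n_obs = np.array([3.0, 10.0, 30.0])
csr_obs = cyclic_strength((0.3, 0.25), n_obs) + 0.01 * rng.normal(size=3)
sigma = 0.01

# (1) Crude one-at-a-time sensitivity screen around a nominal parameter point.
nominal = np.array([0.25, 0.2])
for i, name in enumerate(["a", "b"]):
    lo, hi = nominal.copy(), nominal.copy()
    lo[i], hi[i] = 0.9 * nominal[i], 1.1 * nominal[i]
    swing = np.abs(cyclic_strength(hi, n_obs) - cyclic_strength(lo, n_obs)).max()
    print(f"output swing for +/-10% change in {name}: {swing:.4f}")

# (2) Bayesian calibration on a coarse grid with uniform priors over plausible ranges.
a_grid = np.linspace(0.1, 0.6, 101)
b_grid = np.linspace(0.05, 0.5, 101)
A, B = np.meshgrid(a_grid, b_grid, indexing="ij")
resid = csr_obs[None, None, :] - A[..., None] * n_obs[None, None, :] ** (-B[..., None])
log_like = -0.5 * np.sum((resid / sigma) ** 2, axis=-1)
post = np.exp(log_like - log_like.max())
post /= post.sum()

# (3) Forward propagation: resample the posterior and push it through the response model.
idx = rng.choice(post.size, size=2000, p=post.ravel())
samples = np.column_stack([A.ravel()[idx], B.ravel()[idx]])
spread = np.array([lateral_spread(p) for p in samples])
q05, q95 = np.quantile(spread, [0.05, 0.95])
print(f"lateral spreading: mean {spread.mean():.3f}, 90% interval ({q05:.3f}, {q95:.3f})")
```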
-
Abstract Motivated by a computer model calibration problem from the oil and gas industry, involving the design of a honeycomb seal, we develop a new Bayesian methodology to cope with limitations in the canonical apparatus stemming from several factors. We propose a new strategy of on‐site design and surrogate modeling for a computer simulator acting on a high‐dimensional input space that, although relatively speedy, is prone to numerical instabilities, missing data, and nonstationary dynamics. Our aim is to strike a balance between data‐faithful modeling and computational tractability in a calibration framework—tailoring the computer model to a limited field experiment. Situating our on‐site surrogates within the canonical calibration apparatus requires updates to that framework. We describe a novel yet intuitive Bayesian setup that carefully decomposes otherwise prohibitively large matrices by exploiting the sparse blockwise structure. Empirical illustrations demonstrate that this approach performs well on toy data and our motivating honeycomb example.
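The computational payoff of on‐site surrogates comes from the blockwise structure: with an independent surrogate per field-input site, the joint covariance across sites is block-diagonal, so one prohibitively large solve factors into many small per-site solves. Below is a minimal sketch of that idea; the kernel, site count, and design sizes are arbitrary choices for illustration, not the honeycomb setup.

```python
# Block-diagonal covariance: per-site solves reproduce the full joint solve.
import numpy as np
from scipy.linalg import block_diag, cho_factor, cho_solve

rng = np.random.default_rng(3)
n_sites, n_per_site = 20, 40           # 20 sites x 40 runs each = 800 runs total

def rbf_cov(x, length=0.3, nugget=1e-6):
    """Squared-exponential covariance for one site's design points."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / length) ** 2) + nugget * np.eye(len(x))

site_designs = [np.sort(rng.uniform(0, 1, n_per_site)) for _ in range(n_sites)]
site_obs = [np.sin(6 * x) + 0.05 * rng.normal(size=len(x)) for x in site_designs]

# Naive route: one big (800 x 800) covariance and a single large Cholesky solve.
K_big = block_diag(*[rbf_cov(x) for x in site_designs])
y_big = np.concatenate(site_obs)
alpha_big = cho_solve(cho_factor(K_big), y_big)

# Blockwise route: 20 independent (40 x 40) solves, one per site.
alpha_blocks = np.concatenate(
    [cho_solve(cho_factor(rbf_cov(x)), y) for x, y in zip(site_designs, site_obs)]
)

# Same result, but the blockwise route never forms or factors the large matrix.
print("max difference between routes:", np.max(np.abs(alpha_big - alpha_blocks)))
```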
-
Background. Published data on a disease do not always correspond directly to the parameters needed to simulate natural history. Several calibration methods have been applied to computer-based disease models to extract needed parameters that make a model’s output consistent with available data. Objective. To assess 3 calibration methods and evaluate their performance in a real-world application. Methods. We calibrated a model of cholera natural history in Bangladesh, where a lack of active surveillance biases available data. We built a cohort state-transition cholera natural history model that includes case hospitalization to reflect the passive surveillance data-generating process. We applied 3 calibration techniques: incremental mixture importance sampling, sampling importance resampling, and random search with rejection sampling. We adapted these techniques to the context of wide prior uncertainty and many degrees of freedom. We evaluated the resulting posterior parameter distributions using a range of metrics and compared predicted cholera burden estimates. Results. All 3 calibration techniques produced posterior distributions with a higher likelihood and better fit to calibration targets as compared with prior distributions. Incremental mixture importance sampling resulted in the highest likelihood and largest number of unique parameter sets to better inform joint parameter uncertainty. Compared with naïve uncalibrated parameter sets, calibrated models of cholera in Bangladesh project substantially more cases, many of which are not detected by passive surveillance, and fewer deaths. Limitations. Calibration cannot completely overcome poor data quality, which can leave some parameters less well informed than others. Calibration techniques may perform differently under different circumstances. Conclusions. Incremental mixture importance sampling, when adapted to the context of high uncertainty, performs well. By accounting for biases in data, calibration can improve model projections of disease burden.
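Of the three techniques named in the last abstract, sampling importance resampling is the simplest to sketch: draw parameter sets from wide priors, weight them by the likelihood of the calibration targets, and resample in proportion to those weights. The toy disease model, targets, and prior ranges below are invented for illustration and do not reproduce the cholera application.

```python
# Sampling importance resampling (SIR) calibration sketch with a toy model.
import numpy as np

rng = np.random.default_rng(4)

def model(params, population=1_000_000):
    """Toy natural-history stand-in: incidence and reporting probability -> observed counts."""
    incidence, p_report = params
    true_cases = incidence * population
    reported_cases = true_cases * p_report        # passive surveillance sees only a fraction
    deaths = true_cases * 0.005                   # fixed toy case-fatality ratio
    return np.array([reported_cases, deaths])

targets = np.array([4000.0, 60.0])                # hypothetical surveillance targets
target_sd = np.array([400.0, 15.0])               # assumed uncertainty around the targets

# 1) Draw parameter sets from wide priors.
n_draws = 50_000
prior_draws = np.column_stack([
    rng.uniform(0.001, 0.05, n_draws),            # annual incidence proportion
    rng.uniform(0.05, 0.8, n_draws),              # probability a case is reported
])

# 2) Weight each draw by the likelihood of the calibration targets.
outputs = np.array([model(p) for p in prior_draws])
log_w = -0.5 * np.sum(((outputs - targets) / target_sd) ** 2, axis=1)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# 3) Resample with replacement to approximate the joint posterior.
idx = rng.choice(n_draws, size=5000, p=w)
posterior = prior_draws[idx]
print("posterior means (incidence, reporting prob.):", posterior.mean(axis=0))
print("effective sample size:", 1.0 / np.sum(w ** 2))
```

The effective sample size printed at the end is the usual diagnostic for importance-sampling schemes; incremental mixture importance sampling refines the same weight-and-resample idea by iteratively adapting the sampling distribution toward regions of high weight.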