Climate models are generally calibrated manually by comparing selected climate statistics, such as the global top-of-atmosphere energy balance, to observations. This manual tuning targets only a limited subset of observational data and parameters. Bayesian calibration can estimate climate model parameters and their uncertainty using a larger fraction of the available data while exploring the parameter space more broadly and automatically. In Bayesian learning, it is natural to exploit the seasonal cycle, which in many climate statistics has a large amplitude compared with anthropogenic climate change. In this study, we develop methods for the calibration and uncertainty quantification (UQ) of model parameters that exploit the seasonal cycle, and we demonstrate a proof of concept with an idealized general circulation model (GCM). UQ is performed using the calibrate-emulate-sample approach, which combines stochastic optimization and machine-learning emulation to speed up Bayesian learning. The methods are demonstrated in a perfect-model setting through the calibration and UQ of a convective parameterization in an idealized GCM with a seasonal cycle. Compared with annually averaged climate statistics, calibration and UQ based on seasonally averaged statistics reduce the calibration error by up to an order of magnitude and narrow the spread of the non-Gaussian posterior distributions by factors of two to five, depending on the variables used for UQ. This reduction in the spread of the parameter posterior distribution leads, in turn, to reduced uncertainty in climate model predictions.
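As a hedged illustration of the calibrate-emulate-sample (CES) workflow described above, the sketch below calibrates a two-parameter toy forward map standing in for seasonally averaged GCM statistics: an ensemble-Kalman-inversion-style update (calibrate), a Gaussian-process surrogate trained on the ensemble evaluations (emulate), and random-walk Metropolis sampling of the emulated misfit (sample). The toy model, parameter ranges, and function names are assumptions for illustration only, not the study's code or configuration.

```python
# Toy calibrate-emulate-sample (CES) sketch in a perfect-model setting.
# The forward map and all settings are illustrative stand-ins, not the paper's.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def toy_forward_map(theta):
    """Stand-in for seasonally averaged climate statistics G(theta)."""
    amp, offset = theta
    months = np.arange(12)
    return amp * np.sin(2.0 * np.pi * months / 12.0) + offset

theta_true = np.array([2.0, 0.5])
noise_std = 0.1
y_obs = toy_forward_map(theta_true) + noise_std * rng.normal(size=12)

# --- Calibrate: a few ensemble-Kalman-inversion-style updates ---
ens = rng.uniform([0.0, -1.0], [4.0, 2.0], size=(60, 2))   # prior ensemble
for _ in range(5):
    G_ens = np.array([toy_forward_map(t) for t in ens])
    dth = ens - ens.mean(axis=0)
    dG = G_ens - G_ens.mean(axis=0)
    C_tg = dth.T @ dG / (len(ens) - 1)                     # parameter-data cross-covariance
    C_gg = dG.T @ dG / (len(ens) - 1)                      # data covariance
    K = C_tg @ np.linalg.inv(C_gg + noise_std**2 * np.eye(12))
    perturbed_obs = y_obs + noise_std * rng.normal(size=G_ens.shape)
    ens = ens + (perturbed_obs - G_ens) @ K.T

# --- Emulate: Gaussian-process surrogate trained on ensemble evaluations ---
G_final = np.array([toy_forward_map(t) for t in ens])
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6,
                              normalize_y=True).fit(ens, G_final)

# --- Sample: random-walk Metropolis on the emulated data misfit ---
def log_post(theta):
    g = gp.predict(theta[None, :])[0]
    return -0.5 * np.sum((y_obs - g) ** 2) / noise_std**2

theta, lp = ens.mean(axis=0), -np.inf
samples = []
for _ in range(5000):
    prop = theta + 0.02 * rng.normal(size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[1000:])
print("posterior mean:", samples.mean(axis=0), " truth:", theta_true)
```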
- Award ID(s): 2008970
- PAR ID: 10356229
- Date Published:
- Journal Name: ESAIM: Mathematical Modelling and Numerical Analysis
- Volume: 55
- Issue: 1
- ISSN: 0764-583X
- Page Range / eLocation ID: 131 to 169
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract: Uncertainty quantification (UQ) is an important part of mathematical modeling and simulation, quantifying the impact of parametric uncertainty on model predictions. This paper presents an efficient polynomial chaos expansion (PCE) based UQ method for biological systems. For PCE, the key step is the stochastic Galerkin (SG) projection, which yields a family of deterministic models for the PCE coefficients that describes the original stochastic system. When a system involves nonpolynomial terms and many uncertainties, SG-based PCE is computationally prohibitive because it often involves high-dimensional integrals. To address this, a generalized dimension reduction method (gDRM) is coupled with quadrature rules to convert each high-dimensional integral in the SG projection into a few lower-dimensional ones that can be solved rapidly. The performance of the algorithm is validated with two examples describing the dynamic behavior of cells. Compared with other UQ techniques (e.g., nonintrusive PCE), the results show the potential of the algorithm to tackle UQ in more complicated biological systems.
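A minimal PCE sketch for orientation: it expands a scalar toy model of one standard-normal parameter in probabilists' Hermite polynomials, estimating the coefficients with Gauss-Hermite quadrature. This nonintrusive construction illustrates the expansion itself, not the paper's intrusive stochastic Galerkin projection or the gDRM integral decomposition; the toy model and truncation order are assumptions.

```python
# Nonintrusive polynomial chaos expansion for y = f(xi), xi ~ N(0, 1).
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

def model(xi):
    """Toy nonpolynomial response with one uncertain rate constant (illustrative)."""
    return np.tanh(1.0 + 0.3 * xi)

order, nq = 6, 40
nodes, weights = hermegauss(nq)          # Gauss-Hermite nodes/weights, weight exp(-x^2/2)
weights = weights / sqrt(2.0 * pi)       # normalize so sums approximate E[.] under N(0,1)

# PCE coefficients: c_k = E[f(xi) He_k(xi)] / E[He_k(xi)^2], with E[He_k^2] = k!
f_nodes = model(nodes)
coeffs = np.empty(order + 1)
for k in range(order + 1):
    basis = hermeval(nodes, [0.0] * k + [1.0])   # He_k evaluated at quadrature nodes
    coeffs[k] = np.sum(weights * f_nodes * basis) / factorial(k)

# Mean and variance follow directly from the coefficients.
mean = coeffs[0]
var = sum(coeffs[k] ** 2 * factorial(k) for k in range(1, order + 1))

# Cross-check against plain Monte Carlo.
xi = np.random.default_rng(0).normal(size=200_000)
print("PCE mean, var:", mean, var)
print("MC  mean, var:", model(xi).mean(), model(xi).var())
```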
-
We revisit the variational characterization of conservative diffusion as entropic gradient flow and provide for it a probabilistic interpretation based on stochastic calculus. It was shown by Jordan, Kinderlehrer, and Otto that, for diffusions of Langevin–Smoluchowski type, the Fokker–Planck probability density flow maximizes the rate of relative entropy dissipation, as measured by the distance traveled in the ambient space of probability measures with finite second moments, in terms of the quadratic Wasserstein metric. We obtain novel, stochastic-process versions of these features, valid along almost every trajectory of the diffusive motion in the backwards direction of time, using a very direct perturbation analysis. By averaging our trajectorial results with respect to the underlying measure on path space, we establish the maximal rate of entropy dissipation along the Fokker–Planck flow and measure exactly the deviation from this maximum that corresponds to any given perturbation. A bonus of our trajectorial approach is that it derives the HWI inequality relating relative entropy (H), Wasserstein distance (W), and relative Fisher information (I).
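For orientation, a standard statement of the entropic gradient-flow relationships discussed above, written in common notation (Ψ for the potential, π ∝ e^{-Ψ} for the invariant measure); this is the textbook entropy-dissipation identity, not the paper's trajectorial refinement of it.

```latex
% Langevin--Smoluchowski diffusion and its Fokker--Planck flow
\[
  \mathrm{d}X_t = -\nabla \Psi(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}W_t,
  \qquad \pi(x) \propto e^{-\Psi(x)},
\]
\[
  \partial_t \rho_t
    = \nabla \cdot \bigl(\rho_t \nabla \Psi\bigr) + \Delta \rho_t
    = \nabla \cdot \Bigl(\rho_t \,\nabla \log \tfrac{\rho_t}{\pi}\Bigr).
\]
% Relative entropy decays at the rate of the relative Fisher information
\[
  \frac{\mathrm{d}}{\mathrm{d}t}\, H(\rho_t \,|\, \pi)
    = -\int \Bigl|\nabla \log \tfrac{\rho_t}{\pi}\Bigr|^{2} \rho_t \,\mathrm{d}x
    = -\, I(\rho_t \,|\, \pi),
  \qquad
  H(\rho \,|\, \pi) := \int \rho \log \tfrac{\rho}{\pi}\,\mathrm{d}x .
\]
```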
-
Abstract: Thermodynamic speed limits are a set of classical uncertainty relations that, so far, place global bounds on the stochastic dissipation of energy as heat and the production of entropy. Here, instead of constraints on these thermodynamic costs, we derive integral speed limits that are upper and lower bounds on a thermodynamic benefit: the minimum time for an amount of mechanical work to be done on or by a system. In the short-time limit, we show how this extrinsic timescale relates to an intrinsic timescale for work, recovering the intrinsic timescales in differential speed limits from these integral speed limits and turning the first law of stochastic thermodynamics into a first law of speeds. As physical examples, we consider the work done by a flashing Brownian ratchet and the work done on a particle in a potential well subject to external driving.
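As a schematic numerical illustration of stochastic work under external driving (not the paper's derivation or its speed limits), the sketch below simulates an overdamped particle in a harmonic trap whose center is dragged at constant speed and accumulates the work as W = ∫ (∂U/∂t) dt along each path, the usual stochastic-thermodynamics convention. All parameter values are arbitrary.

```python
# Work done on an overdamped particle in a dragged harmonic trap (Euler-Maruyama).
import numpy as np

rng = np.random.default_rng(1)
k, gamma, kBT = 1.0, 1.0, 1.0          # trap stiffness, friction, temperature
v = 0.5                                 # speed of the trap center
dt, n_steps, n_paths = 1e-3, 5000, 2000

x = np.zeros(n_paths)                   # particles start at the initial trap center
W = np.zeros(n_paths)                   # accumulated work per trajectory
for i in range(n_steps):
    lam = v * (i * dt)                  # trap center lambda(t) = v t
    # Work increment: dW = (dU/dt) dt = -k (x - lam) * (dlam/dt) dt
    W += -k * (x - lam) * v * dt
    # Overdamped Langevin step: dx = -(k/gamma)(x - lam) dt + sqrt(2 kBT dt / gamma) * xi
    x += (-(k / gamma) * (x - lam) * dt
          + np.sqrt(2.0 * kBT * dt / gamma) * rng.normal(size=n_paths))

print("mean work <W>:", W.mean())
# Jarzynski check: <exp(-W/kBT)> should be near 1, since the free-energy change
# of a dragged harmonic trap is zero.
print("<exp(-W/kBT)>:", np.exp(-W / kBT).mean())
```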
-
Abstract: The Mass Spectrometer and Incoherent Scatter radar (MSIS) model family has been developed and improved since the early 1970s. The most recent version is the Naval Research Laboratory (NRL) MSIS 2.0 empirical atmospheric model. NRLMSIS 2.0 provides species density, mass density, and temperature estimates as a function of location and space weather conditions. MSIS models have long been a popular choice of thermosphere model in the research and operations communities alike but, like many models, do not provide uncertainty estimates. In this work, we develop an exospheric temperature model based on machine learning that can be used with NRLMSIS 2.0 to calibrate it against high-fidelity satellite density estimates directly through the exospheric temperature parameter. Instead of providing point estimates, our model (called MSIS-UQ) outputs a distribution, which is assessed using a metric called the calibration error score. We show that MSIS-UQ debiases NRLMSIS 2.0, reducing the difference between model and satellite density by 25%, and that it is 11% closer to satellite density than the Space Force's High Accuracy Satellite Drag Model. We also show the model's uncertainty estimation capabilities by generating altitude profiles of species density, mass density, and temperature, which explicitly demonstrates how exospheric temperature probabilities affect density and temperature profiles within NRLMSIS 2.0. A further analysis shows improved representation of post-storm overcooling relative to NRLMSIS 2.0 alone, broadening the phenomena the model can capture.
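As a rough, generic illustration of how a predictive distribution can be scored for calibration (a stand-in for the idea, not the paper's definition of the calibration error score), the sketch below compares nominal and empirical coverage of central Gaussian prediction intervals on synthetic "temperature" data; all values and names are placeholders, and no NRLMSIS 2.0 interface is used.

```python
# Generic coverage-based calibration check for a Gaussian probabilistic predictor.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 5000
truth = rng.normal(1000.0, 80.0, size=n)             # synthetic "observed" temperatures [K]
pred_mean = truth + rng.normal(0.0, 30.0, size=n)     # synthetic predictive means
pred_std = np.full(n, 25.0)                           # predictive std devs (deliberately
                                                      # overconfident, so a gap appears)

levels = np.linspace(0.05, 0.95, 19)                  # nominal central coverage levels
gaps = []
for p in levels:
    half_width = stats.norm.ppf(0.5 + p / 2.0) * pred_std
    inside = np.abs(truth - pred_mean) <= half_width  # hits within the central interval
    gaps.append(abs(inside.mean() - p))               # |empirical - nominal| coverage

print("mean |empirical - nominal| coverage gap:", np.mean(gaps))
```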