Abstract This work proposes a general framework for analyzing noise-driven transitions in spatially extended non-equilibrium systems and for explaining the emergence of coherent patterns beyond the instability onset. The framework relies on stochastic parameterization formulas that reduce the complexity of the original equations while preserving the essential dynamical effects of unresolved scales. The approach is flexible and operates for both Gaussian noise and non-Gaussian noise with jumps. Our stochastic parameterization formulas offer two key advantages. First, they can approximate stochastic invariant manifolds when these manifolds exist. Second, even when such manifolds break down, the formulas can be adapted through a simple optimization of their constitutive parameters. This allows us to handle scenarios with weak time-scale separation in which the system has undergone multiple transitions, resulting in large-amplitude solutions not captured by invariant manifolds or other time-scale separation methods. The optimized stochastic parameterizations then capture how small-scale noise impacts larger scales through the system’s nonlinear interactions. This effect is achieved by the very fabric of our parameterizations, which incorporate non-Markovian (memory-dependent) coefficients into the reduced equation. These coefficients account for the noise’s past influence, not just its current value, using a finite memory length selected for optimal performance. The specific memory function, which determines how this past influence is weighted, depends on both the strength of the noise and how it interacts with the system’s nonlinearities. Remarkably, training our theory-guided reduced models on a single noise path effectively learns the optimal memory length for out-of-sample predictions. Indeed, this approach retains good accuracy in predicting noise-induced transitions, including rare events, when tested against a large ensemble of different noise paths.
This success stems from our hybrid approach, which combines analytical understanding with data-driven learning. This combination avoids a key limitation of purely data-driven methods: their struggle to generalize to unseen scenarios, also known as the ‘extrapolation problem.’
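To make the memory-dependent coefficients concrete, here is a minimal numerical sketch, not the paper's actual formulas: the coefficient at each time is a discrete convolution of the recent noise history with an exponential memory kernel truncated at a finite memory length `tau`. The kernel shape, the decay rate `alpha`, and all parameter names are illustrative assumptions.

```python
import numpy as np

def memory_coefficient(noise_path, tau, alpha=1.0, dt=0.01):
    """Weight the recent noise history with an exponential memory kernel
    truncated at the finite memory length tau (illustrative choice)."""
    n_mem = max(1, int(tau / dt))
    kernel = np.exp(-alpha * dt * np.arange(n_mem))
    kernel /= kernel.sum()                      # normalize the weights
    out = np.empty(len(noise_path))
    for k in range(len(noise_path)):
        # Most recent noise values first, truncated to the memory window.
        window = noise_path[max(0, k - n_mem + 1):k + 1][::-1]
        out[k] = np.dot(kernel[:len(window)], window)
    return out

rng = np.random.default_rng(0)
noise = rng.standard_normal(1000)               # one synthetic noise path
coeff = memory_coefficient(noise, tau=0.5)      # memory length: 0.5 time units
```

In this caricature, a longer `tau` weights more of the noise's past into the reduced equation's coefficient, which is the role the memory length optimization plays in the text.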
Optimal parameterizing manifolds for anticipating tipping points and higher-order critical transitions
A general, variational approach to derive low-order reduced models from possibly non-autonomous systems is presented. The approach is based on the concept of optimal parameterizing manifold (OPM), which substitutes for more classical notions of invariant or slow manifolds when the breakdown of “slaving” occurs, i.e., when the unresolved variables can no longer be expressed as an exact functional of the resolved ones. The OPM provides, within a given class of parameterizations of the unresolved variables, the manifold that optimally averages out these variables as conditioned on the resolved ones. The class of parameterizations retained here is that of continuous deformations of parameterizations rigorously valid near the onset of instability. These deformations are produced through the integration of auxiliary backward–forward systems built from the model’s equations and lead to analytic formulas for parameterizations. In this modus operandi, the backward integration time is the key parameter to select, per scale/variable to parameterize, in order to derive the relevant parameterizations, which are no longer exact away from the instability onset due to the breakdown of slaving typically encountered, e.g., in chaotic regimes. The selection is then made through data-informed minimization of a least-squares parameterization defect. It is thus shown, through optimization of the backward integration time per scale/variable to parameterize, that skillful OPM reduced systems can be derived for accurately predicting higher-order critical transitions or catastrophic tipping phenomena, even though the parameterization formulas are trained on regimes prior to these transitions.
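The selection step described above can be illustrated schematically: given training data for the resolved and unresolved variables, one scans the backward integration time `tau` and keeps the value minimizing the least-squares parameterization defect. The one-mode parameterization `phi`, its coefficients `lam` and `coupling`, and the synthetic data below are hypothetical stand-ins, not the article's actual formulas.

```python
import numpy as np

def phi(y, tau, lam=2.0, coupling=0.8):
    """Hypothetical one-mode parameterization of the unresolved variable:
    the backward integration time tau deforms the leading-order slaving
    coefficient coupling/lam through the factor (1 - exp(-lam*tau))."""
    return coupling * (1.0 - np.exp(-lam * tau)) / lam * y**2

def defect(tau, y, z):
    """Least-squares parameterization defect on the training data."""
    return np.mean((z - phi(y, tau)) ** 2)

# Synthetic training data: the unresolved variable z is only
# approximately slaved to the resolved variable y.
rng = np.random.default_rng(1)
y = rng.standard_normal(500)
z = 0.3 * y**2 + 0.05 * rng.standard_normal(500)

# Data-informed selection of the backward integration time.
taus = np.linspace(0.01, 5.0, 200)
defects = [defect(t, y, z) for t in taus]
tau_star = taus[int(np.argmin(defects))]
```

The scan over `tau` is a one-dimensional minimization per scale/variable, which is what makes the selection criterion cheap to apply.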
- Award ID(s):
- 2108856
- PAR ID:
- 10588608
- Publisher / Repository:
- American Institute of Physics
- Date Published:
- Journal Name:
- Chaos: An Interdisciplinary Journal of Nonlinear Science
- Volume:
- 33
- Issue:
- 9
- ISSN:
- 1054-1500
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
The problems of identifying the slow component (e.g., for weather forecast initialization) and of characterizing slow–fast interactions are central to geophysical fluid dynamics. In this study, the related rectification problem of slow manifold closures is addressed for regimes in which slow-to-fast-scale deterministic parameterizations break down due to the explosive emergence of fast oscillations on the slow, geostrophic motion. For such regimes, it is shown on the Lorenz 80 model that if 1) the underlying manifold provides a good approximation of the optimal nonlinear parameterization that averages out the fast variables and 2) the residual dynamics off this manifold is mainly orthogonal to it, then no memory terms are required in the full Mori–Zwanzig closure. Instead, the noise term is the key term to resolve, and it is shown in this case to be well modeled by a state-independent noise obtained by means of networks of stochastic nonlinear oscillators. This stochastic parameterization allows, in turn, for rectifying the momentum-balanced slow manifold and for accurate recovery of the multiscale dynamics. The approach is promising for the closure of other, more complex slow–fast systems in strongly coupled regimes.
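As a schematic of a state-independent noise model, the sketch below sums the outputs of a small network of damped, white-noise-driven oscillators; for brevity the oscillators here are linear, whereas the study uses nonlinear ones, and all parameter values are illustrative.

```python
import numpy as np

def oscillator_noise(n_steps, n_osc=4, dt=0.01, gamma=0.5, sigma=1.0, seed=0):
    """Sum the positions of a small network of damped oscillators, each
    driven by white noise (Euler-Maruyama stepping); the output does not
    depend on the resolved state, i.e., it is state-independent noise."""
    rng = np.random.default_rng(seed)
    omegas = 1.0 + rng.random(n_osc)            # distinct natural frequencies
    x = np.zeros(n_osc)
    v = np.zeros(n_osc)
    out = np.empty(n_steps)
    for k in range(n_steps):
        dW = rng.standard_normal(n_osc) * np.sqrt(dt)
        v = v + (-gamma * v - omegas**2 * x) * dt + sigma * dW
        x = x + v * dt
        out[k] = x.sum()
    return out

xi = oscillator_noise(2000)                     # one realization of the noise
```

Because each oscillator has its own frequency and damping, the summed signal is temporally correlated rather than white, which is the qualitative feature such networks provide.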
-
Physical parameterizations (or closures) are used as representations of unresolved subgrid processes within weather and global climate models or coarse-scale turbulent models, whose resolutions are too coarse to resolve small-scale processes. These parameterizations are typically grounded in physically based, yet empirical, representations of the underlying small-scale processes. Machine learning-based parameterizations have recently been proposed as an alternative and have shown great promise for reducing the uncertainties associated with the parameterization of small-scale processes. Yet, those approaches still show important mismatches that are often attributed to the stochasticity of the considered process. This stochasticity can be due to coarse temporal resolution, unresolved variables, or simply the inherent chaotic nature of the process. To address these issues, we propose a new type of parameterization (closure), built with memory-based neural networks, that accounts for the non-instantaneous response of the closure and enhances its stability and prediction accuracy. We apply the proposed memory-based parameterization, with a differentiable solver, to the Lorenz ’96 model in the presence of a coarse temporal resolution and show its capacity to produce skillful forecasts of the resolved variables over a long time horizon, compared to instantaneous parameterizations. This approach paves the way for the use of memory-based parameterizations in closure problems.
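A minimal caricature of a memory-based closure for the Lorenz ’96 resolved variables: the neural network of the study is replaced by a simple linear combination of the last few resolved states with placeholder weights, and a forward Euler step stands in for the differentiable solver.

```python
import numpy as np
from collections import deque

def l96_tendency(x, forcing=8.0):
    """Resolved-variable tendency of the Lorenz '96 model, closure omitted:
    dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def memory_closure(history, weights):
    """Toy memory-based closure: a linear combination of the most recent
    resolved states stands in for the trained neural network."""
    return sum(w * h for w, h in zip(weights, reversed(history)))

dt = 0.005
weights = [-0.1, -0.05]                         # placeholder "learned" weights
x = 8.0 * np.ones(8)
x[0] += 0.01                                    # small initial perturbation
history = deque([x.copy(), x.copy()], maxlen=len(weights))
for _ in range(200):                            # forward Euler for brevity
    x = x + dt * (l96_tendency(x) + memory_closure(history, weights))
    history.append(x.copy())
```

The `deque` with a fixed `maxlen` is the memory buffer: the closure sees the last few resolved states rather than only the instantaneous one.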
-
Ocean mesoscale eddies are often poorly represented in climate models, and therefore their effects on the large-scale circulation must be parameterized. Traditional parameterizations, which represent the bulk effect of the unresolved eddies, can be improved with new subgrid models learned directly from data. Zanna and Bolton (ZB20) applied an equation-discovery algorithm to reveal an interpretable expression parameterizing the subgrid momentum fluxes by mesoscale eddies through the components of the velocity-gradient tensor. In this work, we implement the ZB20 parameterization in the primitive-equation GFDL MOM6 ocean model and test it in two idealized configurations with significantly different dynamical regimes and topography. The original parameterization was found to generate excessive numerical noise near the grid scale. We propose two filtering approaches that avoid the numerical issues and additionally enhance the strength of large-scale energy backscatter. The filtered ZB20 parameterizations led to improved climatological mean state and energy distributions compared to the current state-of-the-art energy backscatter parameterizations. The filtered ZB20 parameterizations are scale-aware and, consequently, can be used with a single value of the non-dimensional scaling coefficient across a range of resolutions. The successful application of the filtered ZB20 parameterizations to mesoscale eddies in two idealized configurations offers a promising opportunity to reduce long-standing biases in global ocean simulations in future studies.
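The ingredients of such a parameterization can be sketched as follows: the components of the velocity-gradient tensor (relative vorticity and the two deformation components) computed by centered differences on a doubly periodic grid, together with a simple separable 1-2-1 filter of the kind used to suppress grid-scale noise. The actual ZB20 flux expression is not reproduced here; this only shows the building blocks.

```python
import numpy as np

def velocity_gradients(u, v, dx=1.0):
    """Vorticity and deformation components from centered differences on a
    doubly periodic grid (axis 0 is y, axis 1 is x)."""
    du_dx = (np.roll(u, -1, 1) - np.roll(u, 1, 1)) / (2 * dx)
    du_dy = (np.roll(u, -1, 0) - np.roll(u, 1, 0)) / (2 * dx)
    dv_dx = (np.roll(v, -1, 1) - np.roll(v, 1, 1)) / (2 * dx)
    dv_dy = (np.roll(v, -1, 0) - np.roll(v, 1, 0)) / (2 * dx)
    zeta = dv_dx - du_dy          # relative vorticity
    d_shear = du_dy + dv_dx       # shearing deformation
    d_stretch = du_dx - dv_dy     # stretching deformation
    return zeta, d_shear, d_stretch

def filter121(f):
    """Separable 1-2-1 smoothing: a simple grid-scale filter of the kind
    used to tame numerical noise in the subgrid fluxes."""
    f = 0.25 * np.roll(f, 1, 0) + 0.5 * f + 0.25 * np.roll(f, -1, 0)
    return 0.25 * np.roll(f, 1, 1) + 0.5 * f + 0.25 * np.roll(f, -1, 1)

rng = np.random.default_rng(3)
u = rng.standard_normal((32, 32))               # synthetic velocity field
v = rng.standard_normal((32, 32))
zeta, d_shear, d_stretch = velocity_gradients(u, v)
zeta_smooth = filter121(zeta)                   # grid-scale noise attenuated
```

The 1-2-1 stencil damps the highest resolvable wavenumbers while leaving large scales nearly untouched, which is the property a filter needs to preserve backscatter at large scales.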
-
Abstract This work integrates machine learning into an atmospheric parameterization to target uncertain mixing processes while maintaining interpretable, predictive, and well-established physical equations. We adopt an eddy-diffusivity mass-flux (EDMF) parameterization for the unified modeling of various convective and turbulent regimes. To avoid the drift and instability that plague offline-trained machine learning parameterizations subsequently coupled with climate models, we frame learning as an inverse problem: data-driven models are embedded within the EDMF parameterization and trained online in a one-dimensional vertical global climate model (GCM) column. Training is performed against output from large-eddy simulations (LES) forced with GCM-simulated large-scale conditions in the Pacific. Rather than optimizing subgrid-scale tendencies, our framework directly targets climate variables of interest, such as the vertical profiles of entropy and liquid water path. Specifically, we use ensemble Kalman inversion to simultaneously calibrate both the EDMF parameters and the parameters governing data-driven lateral mixing rates. The calibrated parameterization outperforms existing EDMF schemes, particularly in tropical and subtropical locations of the present climate, and maintains high fidelity in simulating shallow cumulus and stratocumulus regimes under the increased sea surface temperatures of AMIP4K experiments. The results showcase the advantage of physically constraining data-driven models and directly targeting relevant variables through online learning to build robust and stable machine learning parameterizations.
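A bare-bones version of an ensemble Kalman inversion update, illustrated on a toy linear forward map rather than the EDMF/LES setting of the study; the ensemble size, the map `A`, and the observation covariance `gamma` are all illustrative assumptions.

```python
import numpy as np

def eki_step(thetas, g_thetas, y_obs, gamma):
    """One ensemble Kalman inversion update: nudge each parameter-ensemble
    member toward matching the observed targets y_obs, using ensemble
    covariances in place of derivatives of the forward map."""
    dth = thetas - thetas.mean(axis=0)
    dg = g_thetas - g_thetas.mean(axis=0)
    n = thetas.shape[0]
    c_tg = dth.T @ dg / (n - 1)                 # parameter-output covariance
    c_gg = dg.T @ dg / (n - 1)                  # output covariance
    gain = c_tg @ np.linalg.inv(c_gg + gamma)   # Kalman-type gain
    return thetas + (y_obs - g_thetas) @ gain.T

# Toy linear forward map in place of the GCM column + LES targets.
A = np.array([[1.0, 0.0], [0.0, 2.0]])
theta_true = np.array([0.5, -1.0])
y_obs = A @ theta_true
rng = np.random.default_rng(2)
thetas = rng.standard_normal((20, 2))           # initial parameter ensemble
gamma = 0.01 * np.eye(2)                        # observation covariance
for _ in range(15):
    thetas = eki_step(thetas, thetas @ A.T, y_obs, gamma)
theta_hat = thetas.mean(axis=0)                 # calibrated parameters
```

The update is derivative-free: only forward evaluations `g_thetas` are needed, which is what makes the method attractive for calibrating parameters inside a full parameterization.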
