Title: Explainable Offline‐Online Training of Neural Networks for Parameterizations: A 1D Gravity Wave‐QBO Testbed in the Small‐Data Regime
Abstract There are different strategies for training neural networks (NNs) as subgrid‐scale parameterizations. Here, we use a 1D model of the quasi‐biennial oscillation (QBO) and gravity wave (GW) parameterizations as testbeds. A 12‐layer convolutional NN that predicts GW forcings for given wind profiles, when trained offline in a big‐data regime (100‐year), produces realistic QBOs once coupled to the 1D model. In contrast, offline training of this NN in a small‐data regime (18‐month) yields unrealistic QBOs. However, online re‐training of just two layers of this NN using ensemble Kalman inversion and only time‐averaged QBO statistics leads to parameterizations that yield realistic QBOs. Fourier analysis of these three NNs' kernels suggests why/how re‐training works and reveals that these NNs primarily learn low‐pass, high‐pass, and a combination of band‐pass filters, potentially related to the local and non‐local dynamics in GW propagation and dissipation. These findings/strategies generally apply to data‐driven parameterizations of other climate processes.
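The kernel-spectrum analysis mentioned in the abstract can be sketched in a few lines: take the Fourier transform of a learned 1D convolution kernel and inspect where its energy sits. The example below is an illustrative toy, not the paper's procedure; the sample kernels, the padding length, and the 2x energy-ratio threshold are our own assumptions.

```python
import numpy as np

def kernel_response(kernel, n_freq=64):
    """Magnitude of the frequency response of a 1D convolution kernel,
    zero-padded to n_freq points before the FFT."""
    padded = np.zeros(n_freq)
    padded[:len(kernel)] = kernel
    return np.abs(np.fft.rfft(padded))

def classify_filter(kernel, n_freq=64):
    """Crudely label a kernel as low-, high-, or band-pass by comparing
    the spectral energy in the lower and upper halves of its spectrum."""
    resp = kernel_response(kernel, n_freq)
    half = len(resp) // 2
    low, high = resp[:half].sum(), resp[half:].sum()
    if low > 2 * high:
        return "low-pass"
    if high > 2 * low:
        return "high-pass"
    return "band-pass"

# A smoothing (averaging) kernel passes mostly low frequencies...
print(classify_filter(np.ones(5) / 5))          # low-pass
# ...while a finite-difference kernel emphasizes high frequencies.
print(classify_filter(np.array([-1.0, 1.0])))   # high-pass
```

Applied to trained kernels instead of these hand-built ones, the same spectral view is what distinguishes low-pass, high-pass, and band-pass behavior.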
Award ID(s):
2005123 2004512
PAR ID:
10558227
Author(s) / Creator(s):
; ;
Publisher / Repository:
AGU
Date Published:
Journal Name:
Geophysical Research Letters
Volume:
51
Issue:
2
ISSN:
0094-8276
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract Neural networks (NNs) are increasingly used for data‐driven subgrid‐scale parameterizations in weather and climate models. While NNs are powerful tools for learning complex non‐linear relationships from data, there are several challenges in using them for parameterizations. Three of these challenges are (a) data imbalance related to learning rare, often large‐amplitude, samples; (b) uncertainty quantification (UQ) of the predictions to provide an accuracy indicator; and (c) generalization to other climates, for example, those with different radiative forcings. Here, we examine the performance of methods for addressing these challenges using NN‐based emulators of the Whole Atmosphere Community Climate Model (WACCM) physics‐based gravity wave (GW) parameterizations as a test case. WACCM has complex, state‐of‐the‐art parameterizations for orography‐, convection‐, and front‐driven GWs. Convection‐ and orography‐driven GWs have significant data imbalance due to the absence of convection or orography in most grid points. We address data imbalance using resampling and/or weighted loss functions, enabling the successful emulation of parameterizations for all three sources. We demonstrate that three UQ methods (Bayesian NNs, variational auto‐encoders, and dropouts) provide ensemble spreads that correspond to accuracy during testing, offering criteria for identifying when an NN gives inaccurate predictions. Finally, we show that the accuracy of these NNs decreases for a warmer climate (4 × CO2). However, their performance is significantly improved by applying transfer learning, for example, re‐training only one layer using ∼1% new data from the warmer climate. The findings of this study offer insights for developing reliable and generalizable data‐driven parameterizations for various processes, including (but not limited to) GWs. 
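The weighted-loss remedy for data imbalance described in the abstract above can be sketched as follows. The amplitude binning and inverse-frequency weighting are illustrative assumptions, not the study's exact recipe:

```python
import numpy as np

def weighted_mse(y_true, y_pred, weights):
    """Mean-squared error where each sample's contribution is scaled by a
    weight, so rare large-amplitude samples are not drowned out."""
    return np.mean(weights * (y_true - y_pred) ** 2)

def amplitude_weights(y, n_bins=10):
    """Inverse-frequency weights: samples in sparsely populated amplitude
    bins get proportionally larger weights (normalized to mean 1)."""
    counts, edges = np.histogram(y, bins=n_bins)
    idx = np.clip(np.digitize(y, edges[1:-1]), 0, n_bins - 1)
    w = 1.0 / np.maximum(counts[idx], 1)
    return w / w.mean()
```

With uniform weights this reduces to the ordinary MSE; with inverse-frequency weights, the few large-amplitude samples (e.g., grid points with active convection) dominate the gradient in proportion to their rarity.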
  2. Abstract Gravity waves (GWs) make crucial contributions to the middle atmospheric circulation. Yet, their climate model representation remains inaccurate, leading to key circulation biases. This study introduces a set of three neural networks (NNs) that learn to predict GW fluxes (GWFs) from multiple years of high‐resolution ERA5 reanalysis. The three NNs (an ANN, an ANN‐CNN, and an Attention UNet) embed different levels of horizontal nonlocality in their architecture and are capable of representing nonlocal GW effects that are missing from current operational GW parameterizations. The NNs are evaluated offline on both time‐averaged statistics and time‐evolving flux variability. All NNs, especially the Attention UNet, accurately recreate the global GWF distribution in both the troposphere and the stratosphere. Moreover, the Attention UNet most skillfully predicts the transient evolution of GWFs over prominent orographic and nonorographic hotspots, with the model being a close second. Since even ERA5 does not resolve a substantial portion of GWFs, this deficiency is compensated by subsequently applying transfer learning on the ERA5‐trained ML models for GWFs from a 1.4 km global climate model. It is found that the re‐trained models both (a) preserve their learning from ERA5, and (b) learn to appropriately scale the predicted fluxes to account for ERA5's limited resolution. Our results highlight the importance of embedding nonlocal information for a more accurate GWF prediction and establish strategies to complement abundant reanalysis data with limited high‐resolution data to develop machine learning‐driven parameterizations for missing mesoscale processes in climate models.
  3. Abstract Model instability remains a core challenge for data‐driven parameterizations, especially those developed with supervised algorithms, and rigorous methods to address it are lacking. Here, by integrating machine learning (ML) theory with climate physics, we demonstrate the importance of learning spatially non‐local dynamics using a 1D quasi‐biennial oscillation model with parameterized gravity waves (GW) as a testbed. While common offline metrics fail to identify shortcomings in learning non‐local dynamics, we show that the receptive field (RF) can identify instability a priori. We find that neural network‐based parameterizations, though predicting GW forcings from wind profiles with 99% accuracy, lead to unstable simulations when RFs are too small to capture non‐local dynamics. Additionally, we demonstrate that learning non‐local dynamics is crucial for the stability of a data‐driven spatiotemporal emulator of the zonal wind field. This work underscores the need to integrate ML theory with physics in designing data‐driven algorithms for climate modeling.
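The receptive-field diagnostic used in the abstract above can be computed in closed form from the architecture alone, before any training. A minimal sketch follows; the example layer configuration (twelve stride-1 layers with kernel size 3) is an assumption for illustration, not necessarily the networks studied:

```python
def receptive_field(layers):
    """Receptive field (in grid points) of stacked 1D convolutions.

    `layers` is a list of (kernel_size, stride, dilation) tuples, applied
    input-to-output. Uses the standard recurrence: each layer widens the
    RF by (kernel_size - 1) * dilation * (product of earlier strides).
    """
    rf, jump = 1, 1
    for kernel_size, stride, dilation in layers:
        rf += (kernel_size - 1) * dilation * jump
        jump *= stride
    return rf

# Twelve stride-1 layers with kernel size 3 see only 25 grid points
# around each output level:
print(receptive_field([(3, 1, 1)] * 12))  # 25
```

If the non-local dynamics span more vertical levels than this number, no amount of offline accuracy can rescue the coupled simulation, which is the a priori instability check the abstract describes.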
  4. Abstract Subgrid‐scale processes, such as atmospheric gravity waves (GWs), play a pivotal role in shaping the Earth's climate but cannot be explicitly resolved in climate models due to limitations on resolution. Instead, subgrid‐scale parameterizations are used to capture their effects. Recently, machine learning (ML) has emerged as a promising approach to learn parameterizations. In this study, we explore uncertainties associated with a ML parameterization for atmospheric GWs. Focusing on the uncertainties in the training process (parametric uncertainty), we use an ensemble of neural networks to emulate an existing GW parameterization. We estimate both offline uncertainties in raw NN output and online uncertainties in climate model output, after the neural networks are coupled. We find that online parametric uncertainty contributes a significant source of uncertainty in climate model output that must be considered when introducing NN parameterizations. This uncertainty quantification provides valuable insights into the reliability and robustness of ML‐based GW parameterizations, thus advancing our understanding of their potential applications in climate modeling. 
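The parametric-uncertainty estimate described above, an ensemble of independently trained networks, can be mimicked with a toy regression ensemble. Here bootstrap cubic fits stand in for NNs trained from different random initializations; every number (ensemble size, noise level, polynomial degree) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 40)
y = np.sin(2 * x) + 0.1 * rng.standard_normal(x.size)

# Toy "ensemble of trainings": refit a cubic on bootstrap resamples of
# the data, one member per seed.
members = []
for seed in range(20):
    r = np.random.default_rng(seed)
    idx = r.integers(0, x.size, x.size)
    members.append(np.polyval(np.polyfit(x[idx], y[idx], 3), x))

members = np.array(members)
# Ensemble mean is the prediction; spread (std across members) is the
# parametric-uncertainty estimate at each point.
mean, spread = members.mean(axis=0), members.std(axis=0)
```

The offline/online distinction in the abstract corresponds to computing this spread on raw model output versus on the statistics of a coupled climate simulation.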
  5. Yortsos, Yannis (Ed.)
    Abstract Transfer learning (TL), which enables neural networks (NNs) to generalize out-of-distribution via targeted re-training, is becoming a powerful tool in scientific machine learning (ML) applications such as weather/climate prediction and turbulence modeling. Effective TL requires knowing (1) how to re-train NNs and (2) what physics are learned during TL. Here, we present novel analyses and a framework addressing (1)–(2) for a broad range of multi-scale, nonlinear, dynamical systems. Our approach combines spectral (e.g. Fourier) analyses of such systems with spectral analyses of convolutional NNs, revealing physical connections between the systems and what the NN learns (a combination of low-, high-, and band-pass filters and Gabor filters). Integrating these analyses, we introduce a general framework that identifies the best re-training procedure for a given problem based on physics and NN theory. As a test case, we explain the physics of TL in subgrid-scale modeling of several setups of 2D turbulence. Furthermore, these analyses show that in these cases, the shallowest convolution layers are the best to re-train, which is consistent with our physics-guided framework but runs against the common wisdom guiding TL in the ML literature. Our work provides a new avenue for optimal and explainable TL, and a step toward fully explainable NNs, for wide-ranging applications in science and engineering, such as climate change modeling.
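Re-training only selected layers, as in the layer-wise TL discussed above, amounts to masking which parameters an optimizer may update. Below is a minimal sketch with a hypothetical three-layer toy "network"; the names W1–W3, the layer sizes, and the plain SGD step are our assumptions, not any paper's setup:

```python
import numpy as np

# Hypothetical 3-layer toy network: y = W3 @ relu(W2 @ relu(W1 @ x)).
rng = np.random.default_rng(0)
params = {f"W{i}": rng.standard_normal((4, 4)) * 0.1 for i in (1, 2, 3)}

def forward(params, x):
    h = np.maximum(params["W1"] @ x, 0.0)
    h = np.maximum(params["W2"] @ h, 0.0)
    return params["W3"] @ h

def sgd_step(params, grads, trainable, lr=1e-2):
    """Update only the layers named in `trainable`; all other layers stay
    frozen, mimicking TL that re-trains, e.g., the shallowest layer."""
    return {name: (w - lr * grads[name] if name in trainable else w)
            for name, w in params.items()}
```

Passing `trainable={"W1"}` re-trains the shallowest layer only, which is the configuration the abstract argues is best for these turbulence test cases; in a deep-learning framework the same effect is typically achieved by disabling gradients on the frozen layers.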