Note: When clicking on a Digital Object Identifier (DOI) link, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the embargo (administrative interval).
Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.
- 
Abstract: The quasi-biennial oscillation (QBO) is the dominant mode of variability in the equatorial stratosphere. It is characterized by alternating descending easterly and westerly jets with a period of approximately 28 months. It has long been known that interactions between the QBO and the annual cycle, e.g., through variations in tropical upwelling, lead to variations in the descent rate of the jets and, consequently, in the QBO period. Understanding these interactions, however, has been hindered by the fact that conventional measures of the QBO convolve them. Koopman formalism, derived from dynamical systems theory, allows one to decompose spatiotemporal datasets (or nonlinear systems) into spatial modes that evolve coherently with distinct frequencies. We use a data-driven approximation of the Koopman operator on zonal-mean zonal wind to find modes that correspond to the annual cycle, the QBO, and the nonlinear interactions between the two. From these modes, we establish a data-driven index for a "pure" QBO that is independent of the annual cycle and investigate how the annual cycle modulates the QBO. We begin with what is already known, quantifying the Holton–Tan effect, a nonlinear interaction between the QBO and the annual cycle of the polar stratospheric vortex. We then use the pure QBO to do something new: quantifying how the annual cycle changes the descent rate of the QBO, revealing annual variations with amplitudes comparable to the 30 m day⁻¹ mean descent rate. We compare these results to the annual variation in tropical upwelling and interpret them with a simple model.
Significance Statement: The quasi-biennial oscillation (QBO) is a periodic cycle of winds in the tropical atmosphere with a period of 28 months. The phase of the QBO is known to influence other aspects of the atmosphere, including the polar vortex, but the magnitude of its effects and its behavior depend on the season. In this study, we use a data-driven method (a Koopman decomposition) to quantify annual changes in the QBO and investigate their causes. We show that seasonal variations in stratospheric upwelling play an important but incomplete role.
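The data-driven Koopman approximation mentioned above can be illustrated with dynamic mode decomposition (DMD), a common route to approximate Koopman modes from snapshot data. The sketch below is a minimal, hypothetical example on a synthetic oscillation, not the paper's actual decomposition of zonal-mean zonal wind; the function name and toy signal are our own.

```python
import numpy as np

def dmd_modes(X, rank=2):
    """Dynamic mode decomposition: a data-driven approximation of the
    Koopman operator. Columns of X are snapshots in time. Returns
    eigenvalues (whose phases give mode frequencies) and spatial modes.
    Illustrative only; the full Koopman decomposition is more involved."""
    A, B = X[:, :-1], X[:, 1:]                 # consecutive snapshot pairs
    U, s, Vh = np.linalg.svd(A, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    # Project the one-step propagator onto the leading POD subspace
    Atilde = U.conj().T @ B @ Vh.conj().T / s
    eigvals, W = np.linalg.eig(Atilde)
    modes = B @ Vh.conj().T / s @ W            # exact DMD modes
    return eigvals, modes

# Toy data: one spatial pattern oscillating with a known 12-step period
t = np.arange(100)
space = np.linspace(0, np.pi, 16)
X = np.outer(np.sin(space), np.cos(2 * np.pi * t / 12))
X += np.outer(np.cos(space), np.sin(2 * np.pi * t / 12))
eigvals, _ = dmd_modes(X, rank=2)
period = 2 * np.pi / abs(np.angle(eigvals[0]))
print(round(period, 2))  # → 12.0, the period is recovered from the data
```

A real application would include many modes (annual, QBO, and their interaction frequencies); the eigenvalue phases separate those timescales cleanly, which is what enables an annual-cycle-free "pure" QBO index.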
- 
Abstract: An intermediate-complexity general circulation model is used to disentangle changes in the large-scale zonally asymmetric circulation in response to rising greenhouse gases. Particular focus is on the anomalous ridge that develops over the Mediterranean in future climate projections, directly associated with reduced winter precipitation over the region. Specifically, we examine changes in stationary waves forced by land–sea contrast, horizontal oceanic heat fluxes, and orography, following a quadrupling of CO2. The stationary waves associated with these three drivers depend strongly on the climatological state, precluding a linear decomposition of their responses to warming. However, our modeling framework still allows a process-oriented approach to quantify the key drivers and mechanisms of the response. A combination of three similarly important mechanisms is found responsible for the rain-suppressing ridge. The first is part of a global response to warming: elongation of intermediate-scale stationary waves in response to strengthened subtropical winds aloft, previously found to account for hydroclimatic changes in southwestern North America. The second is regional: a downstream response to the North Atlantic warming hole and enhanced warming of the Eurasian landmass relative to the Atlantic Ocean. A third contribution to the Mediterranean ridge is a phase shift of planetary wave 3, primarily associated with an altered circulation response to orographic forcing. Reduced land–sea contrast in the Mediterranean basin, previously thought to contribute substantially to Mediterranean drying, has a negligible effect in our integrations. This work offers a mechanistic analysis of the large-scale processes governing projected Mediterranean drying, lending increased understanding and credibility to climate model projections.
Free, publicly accessible full text available October 1, 2026.
- 
Abstract: In response to rising CO2, chemistry-climate models (CCMs) project that extratropical stratospheric ozone will increase, except around 10 and 17 km. We call the muted increases or reductions at these altitudes the "double dip." The double dip results from surface warming (not stratospheric cooling). Using an idealized photochemical-transport model, surface warming is found to produce the double dip via tropospheric expansion, which converts ozone-rich stratospheric air into ozone-poor tropospheric air. The lower dip results from expansion of the extratropical troposphere, as previously understood. The upper dip results from expansion of the tropical troposphere, whose low-ozone anomalies are then transported into the extratropics. Large seasonality in the double dip in CCMs can be explained, at least in part, by seasonality in the stratospheric overturning circulation. The remote effects of the tropical tropopause on extratropical ozone complicate the use of (local) tropopause-following coordinates to remove the effects of global warming.
Free, publicly accessible full text available May 16, 2026.
- 
Abstract: Atmospheric blocking is characterized by persistent anticyclones that "block" the midlatitude jet stream, causing temperature and precipitation extremes. The traffic jam theory posits that blocking events occur when the Local Wave Activity flux, a measure of storm activity, exceeds the carrying capacity of the jet stream, leading to a pileup. The theory's efficacy for prediction is tested with atmospheric reanalysis by defining "exceedance events," the time and location where wave activity first exceeds the flow capacity. The theory captures the Northern Hemisphere winter blocking climatology, with strong spatial correlation between exceedance and blocking events. Both kinds of events are favored not only by low carrying capacity (narrow roads) but also by a downstream reduction in capacity (lane closures causing a bottleneck). The theory fails, however, to accurately predict blocking events in time. Exceedance events are not a useful predictor of an imminent block, suggesting that confounding factors explain their shared climatological structure.
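The "exceedance event" definition, the first time and place where wave activity surpasses the jet's carrying capacity, amounts to a simple threshold scan. The arrays and the `first_exceedance` helper below are hypothetical stand-ins for the reanalysis diagnostics used in the study.

```python
import numpy as np

def first_exceedance(flux, capacity):
    """Return the (time, longitude) index of the first point where wave
    activity flux exceeds the local carrying capacity, or None if it
    never does. `flux` has shape (time, longitude); `capacity` has shape
    (longitude,). A toy sketch, not the paper's full event definition."""
    over = flux > capacity[None, :]
    if not over.any():
        return None
    t, x = np.argwhere(over)[0]   # argwhere scans time-major, so this
    return int(t), int(x)         # is the earliest exceedance

# Toy example: capacity dips at longitude 3 (a "lane closure")
capacity = np.array([5.0, 5.0, 5.0, 2.0, 5.0])
flux = np.full((4, 5), 1.0)
flux[2, 3] = 3.0                  # flux exceeds the reduced capacity here
print(first_exceedance(flux, capacity))  # → (2, 3)
```

Comparing the locations of such exceedances against a blocking index yields the spatial correlation described above, while comparing their times tests the (unsuccessful) temporal prediction.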
- 
Abstract: Blocking events are an important cause of extreme weather, especially long-lasting blocking events that trap weather systems in place. The duration of blocking events is, however, underestimated in climate models. Explainable artificial intelligence (XAI) methods are a class of data-analysis techniques that can help identify physical causes of prolonged blocking events and diagnose model deficiencies. We demonstrate this approach on an idealized quasigeostrophic (QG) model developed by Marshall and Molteni (1993), https://doi.org/10.1175/1520-0469(1993)050<1792:taduop>2.0.co;2. We train a convolutional neural network (CNN) and subsequently build a sparse predictive model for the persistence of Atlantic blocking, conditioned on an initial high-pressure anomaly. SHapley Additive exPlanations (SHAP) analysis reveals that high-pressure anomalies in the American Southeast and the North Atlantic, separated by a trough over Atlantic Canada, contribute significantly to the prediction of sustained blocking events in the Atlantic region. This agrees with previous work that identified precursors in the same regions via wave-train analysis. When we apply the same CNN to blocking events in the ERA5 atmospheric reanalysis, there are insufficient data to accurately predict persistent blocks. We partially overcome this limitation by pre-training the CNN on the plentiful data of the Marshall–Molteni model and then using transfer learning (TL) to achieve better predictions than direct training. SHAP analysis before and after TL allows a comparison between the predictive features in the reanalysis and the QG model, quantifying dynamical biases in the idealized model. This work demonstrates the potential for machine-learning methods to extract meaningful precursors of extreme weather events and achieve better predictions using limited observational data.
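SHAP attributions approximate Shapley values from cooperative game theory. For a model with only a handful of inputs, Shapley values can be computed exactly by enumerating feature coalitions, which illustrates the quantity SHAP estimates for the CNN. The `shapley_values` helper below is an illustrative sketch of the definition, not the paper's analysis pipeline.

```python
import itertools
import math
import numpy as np

def shapley_values(f, x, baseline):
    """Exact Shapley attributions for a prediction f(x) relative to a
    baseline input. Features absent from a coalition are set to their
    baseline values. Exponential in the number of features, so this is
    only a conceptual sketch of what SHAP approximates at scale."""
    n = len(x)
    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in itertools.combinations(others, k):
                # Standard Shapley weight for a coalition of size k
                w = math.factorial(k) * math.factorial(n - k - 1) / math.factorial(n)
                with_i = baseline.copy()
                with_i[list(S) + [i]] = x[list(S) + [i]]
                without_i = baseline.copy()
                without_i[list(S)] = x[list(S)]
                phi[i] += w * (f(with_i) - f(without_i))
    return phi

# Sanity check on a linear model, where Shapley values reduce to
# coefficient * (x - baseline)
coef = np.array([2.0, -1.0, 0.5])
f = lambda z: float(coef @ z)
x = np.array([1.0, 2.0, 4.0])
base = np.zeros(3)
print(shapley_values(f, x, base))  # → [ 2. -2.  2.]
```

The attributions sum to f(x) − f(baseline) (the "efficiency" property), which is what makes SHAP maps over pressure anomalies interpretable as a decomposition of the CNN's prediction.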
- 
Abstract: We train random and boosted forests, two machine-learning architectures based on regression trees, to emulate a physics-based parameterization of atmospheric gravity wave momentum transport. We compare the forests to a neural network benchmark, evaluating both offline errors and online performance when coupled to an atmospheric model under the present-day climate and in 800 and 1,200 ppm CO2 global warming scenarios. Offline, the boosted forest exhibits similar skill to the neural network, while the random forest scores significantly lower. Both forest models couple stably to the atmospheric model, and control-climate integrations with the boosted forest exhibit lower biases than those with the neural network. Integrations with all three data-driven emulators successfully capture the Quasi-Biennial Oscillation (QBO) and sudden stratospheric warmings, key modes of stratospheric variability, with the boosted forest more accurate than the random forest in replicating their statistics across our range of carbon dioxide perturbations. The boosted forest and neural network capture the sign of the QBO period response to increased CO2, though both struggle with the magnitude of this response under the more extreme 1,200 ppm scenario. To investigate the connection between performance in the control climate and the ability to generalize, we use techniques from interpretable machine learning to understand how the data-driven methods use physical information. We leverage this understanding to develop a retraining procedure that improves the coupled performance of the boosted forest in the control climate and under the 800 ppm CO2 scenario.
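The boosted-forest idea, sequentially fitting trees to the residual of the current ensemble, can be sketched with one-dimensional regression stumps. This toy construction is our own, not the paper's emulator: it shows gradient boosting for squared loss driving training error down on a smooth target, the mechanism that distinguishes boosted from random forests.

```python
import numpy as np

def fit_stump(x, y):
    """Best single-split regression stump on 1-D inputs (least squares)."""
    best = None
    for s in np.unique(x)[:-1]:
        left, right = y[x <= s], y[x > s]
        err = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if best is None or err < best[0]:
            best = (err, s, left.mean(), right.mean())
    _, s, lo, hi = best
    return lambda q: np.where(q <= s, lo, hi)

def boost(x, y, n_rounds=100, lr=0.3):
    """Gradient boosting for squared loss: each new stump is fit to the
    residual of the current ensemble, shrunk by the learning rate."""
    pred = np.zeros_like(y)
    stumps = []
    for _ in range(n_rounds):
        h = fit_stump(x, y - pred)
        stumps.append(h)
        pred = pred + lr * h(x)
    return lambda q: lr * sum(h(q) for h in stumps)

x = np.linspace(0, 1, 64)
y = np.sin(2 * np.pi * x)        # smooth stand-in for a parameterization
model = boost(x, y)
rmse = float(np.sqrt(np.mean((model(x) - y)**2)))
print(round(rmse, 3))            # small relative to the unit-amplitude signal
```

A random forest instead averages trees fit independently to bootstrap samples; because its trees do not correct each other's errors, its offline skill on smooth targets is typically lower, consistent with the comparison reported above.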
- 
Abstract: Neural networks (NNs) are increasingly used for data-driven subgrid-scale parameterizations in weather and climate models. While NNs are powerful tools for learning complex nonlinear relationships from data, there are several challenges in using them for parameterizations. Three of these challenges are (a) data imbalance related to learning rare, often large-amplitude, samples; (b) uncertainty quantification (UQ) of the predictions to provide an accuracy indicator; and (c) generalization to other climates, for example, those with different radiative forcings. Here, we examine the performance of methods for addressing these challenges using NN-based emulators of the Whole Atmosphere Community Climate Model (WACCM) physics-based gravity wave (GW) parameterizations as a test case. WACCM has complex, state-of-the-art parameterizations for orography-, convection-, and front-driven GWs. Convection- and orography-driven GWs have significant data imbalance due to the absence of convection or orography at most grid points. We address data imbalance using resampling and/or weighted loss functions, enabling the successful emulation of parameterizations for all three sources. We demonstrate that three UQ methods (Bayesian NNs, variational autoencoders, and dropout) provide ensemble spreads that correspond to accuracy during testing, offering criteria for identifying when an NN gives inaccurate predictions. Finally, we show that the accuracy of these NNs decreases for a warmer climate (4×CO2). However, their performance is significantly improved by applying transfer learning, for example, by retraining only one layer using ~1% new data from the warmer climate. The findings of this study offer insights for developing reliable and generalizable data-driven parameterizations for various processes, including (but not limited to) GWs.
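Of the three UQ methods, dropout is the simplest to sketch: the dropout masks are kept active at inference, and the spread of repeated stochastic forward passes serves as an uncertainty estimate (Monte Carlo dropout). The tiny untrained network below is a hypothetical illustration, not a WACCM emulator.

```python
import numpy as np

rng = np.random.default_rng(0)

def mc_dropout_predict(weights, x, p_drop=0.2, n_samples=200):
    """Monte Carlo dropout for a one-hidden-layer network: randomly drop
    hidden units on each forward pass and treat the spread of the
    resulting predictions as an uncertainty estimate. Weights here are
    random, so only the mechanism (not the skill) is illustrated."""
    W1, W2 = weights
    preds = []
    for _ in range(n_samples):
        mask = rng.random(W1.shape[0]) > p_drop          # drop hidden units
        h = np.maximum(W1 @ x, 0.0) * mask / (1 - p_drop)  # inverted dropout
        preds.append(float(W2 @ h))
    preds = np.array(preds)
    return preds.mean(), preds.std()

W1 = rng.standard_normal((32, 4))
W2 = rng.standard_normal(32) / np.sqrt(32)
mean, spread = mc_dropout_predict((W1, W2), np.ones(4))
print(spread > 0.0)  # → True: nonzero ensemble spread is the UQ signal
```

In the study's setting, the key property is that this spread correlates with prediction error on held-out data, so a large spread can flag inputs where the emulator should not be trusted.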
- 
Abstract: Two key challenges in the development of data-driven gravity-wave parameterizations are generalization, how to ensure that a data-driven scheme trained on the present-day climate will continue to work in a new climate regime, and calibration, how to account for biases in the "host" climate model. Both problems depend fundamentally on the response to inputs that lie outside the training dataset, and the two goals often conflict: the ability to generalize to new climate regimes often goes hand in hand with sensitivity to model biases. To probe these challenges, we employ a one-dimensional (1D) quasi-biennial oscillation (QBO) model with a stochastic source term that represents convectively generated gravity waves in the tropics with randomly varying strengths and spectra. We employ an array of machine-learning models consisting of a fully connected feed-forward neural network, a dilated convolutional neural network, an encoder-decoder, a boosted forest, and a support-vector regression model. Our results demonstrate that data-driven schemes trained on "observations" can be critically sensitive to model biases in the wave sources. While able to accurately emulate the stochastic source term on which they were trained, all of our schemes fail to fully simulate the expected QBO period or amplitude, even with the slightest perturbation to the wave sources. The main takeaway is that some measures will always be required to ensure the proper response to climate change and to account for model biases. We examine one approach based on the ideas of optimal transport, where the wave sources in the model are first remapped to the observed ones before applying the data-driven scheme. This approach is agnostic to the data-driven method and guarantees that the model adheres to the observational constraints, ensuring that it yields the right results for the right reasons.
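In one dimension, optimal transport between two empirical distributions of equal size reduces to quantile mapping: each model value is moved to the observed value of the same rank. The sketch below illustrates that idea on made-up samples; the paper's remapping of full wave-source spectra is of course richer.

```python
import numpy as np

def quantile_remap(model_samples, obs_samples):
    """1-D empirical optimal transport: replace each model value with
    the observed value of the same rank, removing the model's bias in
    the source distribution before a data-driven scheme is applied.
    Assumes equal sample sizes; a toy stand-in for the paper's method."""
    ranks = np.argsort(np.argsort(model_samples))  # rank of each model value
    return np.sort(obs_samples)[ranks]

# Biased "model" wave sources (too strong, wrong spread) vs "observations"
model = np.array([6.0, 0.0, 4.0, 2.0])
obs = np.array([1.0, 1.5, 2.0, 2.5])
print(quantile_remap(model, obs))  # → [2.5 1.  2.  1.5]
```

The remapped values preserve the model's ordering (its internal variability) while exactly matching the observed distribution, which is how the approach guarantees adherence to the observational constraints regardless of the downstream data-driven scheme.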
- 
Abstract: While a poleward shift of the near-surface jet and storm track in response to increased greenhouse gases appears to be robust, the magnitude of this change is uncertain and differs across models, and the mechanisms for this change are poorly constrained. An intermediate-complexity GCM is used in this study to explore the factors governing the magnitude of the poleward shift and the mechanisms involved. The degree to which parameterized subgrid-scale convection is inhibited has a leading-order effect on the poleward shift: a simulation with more convection (and less large-scale precipitation) produces a significantly weaker shift, and eventually no shift at all if convection is strongly preferred over large-scale precipitation. Many of the physical processes proposed to drive the poleward shift are equally active in all simulations (even those with no poleward shift). Hence, we conclude that these mechanisms are not of leading-order significance for the poleward shift in any of the simulations. The thermodynamic budget, however, provides useful insight into differences in the jet and storm track responses among the simulations, identifying midlatitude moisture and latent heat release as a crucial differentiator. These results have implications for intermodel spread in the jet, hydrological cycle, and storm track responses to increased greenhouse gases in intermodel comparison projects.
- 
Abstract: Extreme weather events have significant consequences, dominating the impact of climate on society. While high-resolution weather models can forecast many types of extreme events on synoptic timescales, long-term climatological risk assessment is an altogether different problem. A once-in-a-century event takes, on average, 100 years of simulation time to appear just once, far beyond the typical integration length of a weather forecast model. This task is therefore left to cheaper, but less accurate, low-resolution or statistical models. But there is untapped potential in weather model output: despite being short in duration, weather forecast ensembles are produced multiple times a week. Integrations are launched with independent perturbations, causing them to spread apart over time and broadly sample phase space. Collectively, these integrations add up to thousands of years of data. We establish methods to extract climatological information from these short weather simulations. Using ensemble hindcasts by the European Centre for Medium-Range Weather Forecasts (ECMWF) archived in the subseasonal-to-seasonal (S2S) database, we characterize sudden stratospheric warming (SSW) events with multi-centennial return times. Consistent results are found between alternative methods, including basic counting strategies and Markov state modeling. By carefully combining trajectories, we obtain estimates of SSW frequencies and their seasonal distributions that are consistent with reanalysis-derived estimates for moderately rare events, but with much tighter uncertainty bounds, and which can be extended to events of unprecedented severity that have not yet been observed historically. These methods hold potential for assessing extreme events throughout the climate system, beyond this example of stratospheric extremes.
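The basic counting strategy can be sketched directly: pool the short, independently perturbed integrations and divide the total simulated time by the number of events observed. The numbers below are hypothetical, not the S2S-derived SSW statistics.

```python
import numpy as np

def return_period_years(event_counts, years_per_member):
    """Empirical return period from many short, independent ensemble
    integrations: total simulated years divided by the number of events
    observed. A bare-bones version of the counting strategies applied
    to the hindcast archive; Markov state modeling refines this."""
    total_years = len(event_counts) * years_per_member
    n_events = int(np.sum(event_counts))
    return total_years / n_events if n_events else np.inf

# 1,000 hypothetical members of 2 simulated years each (2,000 years
# total), with 8 events observed across the pool
counts = np.zeros(1000, dtype=int)
counts[:8] = 1
print(return_period_years(counts, years_per_member=2.0))  # → 250.0
```

Because the members are independent, the event count is approximately Poisson, so uncertainty bounds on the return period shrink with the pooled record length, which is why thousands of hindcast years beat a single century of reanalysis.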
An official website of the United States government