An exponential rise in the atmospheric vapour pressure deficit (VPD) is among the most consequential impacts of climate change in terrestrial ecosystems. Rising VPD has negative and cascading effects on nearly all aspects of plant function including photosynthesis, water status, growth and survival. These responses are exacerbated by land–atmosphere interactions that couple VPD to soil water and govern the evolution of drought, affecting a range of ecosystem services including carbon uptake, biodiversity, the provisioning of water resources and crop yields. However, despite the global nature of this phenomenon, research on how to incorporate these impacts into resilient management regimes is largely in its infancy, due in part to the entanglement of VPD trends with those of other co-evolving climate drivers. Here, we review the mechanistic bases of VPD impacts at a range of spatial scales, paying particular attention to the independent and interactive influence of VPD in the context of other environmental changes. We then evaluate the consequences of these impacts within key management contexts, including water resources, croplands, wildfire risk mitigation and management of natural grasslands and forests. We conclude with recommendations describing how management regimes could be altered to mitigate the otherwise highly deleterious consequences of rising VPD.
Free, publicly-accessible full text available September 1, 2025
Global storm-resolving models (GSRMs) have gained widespread interest because of the unprecedented detail with which they resolve the global climate. However, it remains difficult to quantify objective differences in how GSRMs resolve complex atmospheric formations. This lack of comprehensive tools for comparing model similarities is a problem in many disparate fields that involve simulation tools for complex data. To address this challenge we develop methods to estimate distributional distances based on both nonlinear dimensionality reduction and vector quantization. Our approach automatically learns physically meaningful notions of similarity from low-dimensional latent data representations that the different models produce. This enables an intercomparison of nine GSRMs based on their high-dimensional simulation data (2D vertical velocity snapshots) and reveals that only six are similar in their representation of atmospheric dynamics. Furthermore, we uncover signatures of the convective response to global warming in a fully unsupervised way. Our study provides a path toward evaluating future high-resolution simulation data more objectively.
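The pipeline sketched in the abstract, reduce each model's snapshots to a shared latent space, vector-quantize that space, and compare the models' code-usage distributions, can be illustrated with a minimal NumPy stand-in. This is not the paper's implementation: it substitutes linear PCA for the nonlinear reduction, a naive k-means codebook for the learned quantizer, and Jensen-Shannon divergence as the distributional distance; all function names are hypothetical.

```python
import numpy as np

def codebook_distance(snaps_a, snaps_b, n_components=2, n_codes=8, seed=0):
    """Illustrative sketch: (1) project two models' flattened snapshots
    onto shared PCA components, (2) vector-quantize the latent points
    with a naive k-means codebook, (3) compare the models' code-usage
    histograms via Jensen-Shannon divergence."""
    rng = np.random.default_rng(seed)
    X = np.vstack([snaps_a, snaps_b]).astype(float)
    X -= X.mean(axis=0)
    # PCA via SVD: shared low-dimensional latent representation
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    Z = X @ Vt[:n_components].T
    # Naive k-means codebook fit on the pooled latent points
    codes = Z[rng.choice(len(Z), n_codes, replace=False)]
    for _ in range(50):
        assign = np.argmin(((Z[:, None] - codes[None]) ** 2).sum(-1), axis=1)
        for k in range(n_codes):
            if np.any(assign == k):
                codes[k] = Z[assign == k].mean(axis=0)
    # Per-model histograms over codebook usage
    za, zb = assign[:len(snaps_a)], assign[len(snaps_a):]
    ha = np.bincount(za, minlength=n_codes) / len(za)
    hb = np.bincount(zb, minlength=n_codes) / len(zb)
    m = 0.5 * (ha + hb)
    def kl(p, q):
        mask = p > 0
        return np.sum(p[mask] * np.log(p[mask] / q[mask]))
    return 0.5 * kl(ha, m) + 0.5 * kl(hb, m)  # Jensen-Shannon divergence
```

Two models drawing snapshots from the same regime yield a near-zero distance, while models occupying different regions of the latent space populate different codebook entries and score higher.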
Projecting climate change is a generalization problem: We extrapolate the recent past using physical models across past, present, and future climates. Current climate models require representations of processes that occur at scales smaller than model grid size, which have been the main source of model projection uncertainty. Recent machine learning (ML) algorithms hold promise to improve such process representations but tend to extrapolate poorly to climate regimes that they were not trained on. To get the best of the physical and statistical worlds, we propose a framework, termed “climate-invariant” ML, incorporating knowledge of climate processes into ML algorithms, and show that it can maintain high offline accuracy across a wide range of climate conditions and configurations in three distinct atmospheric models. Our results suggest that explicitly incorporating physical knowledge into data-driven models of Earth system processes can improve their consistency, data efficiency, and generalizability across climate regimes.
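One way to make the "climate-invariant" idea concrete is to transform raw thermodynamic inputs into quantities whose range stays roughly fixed as the climate warms, for instance mapping specific humidity (which grows sharply with temperature) to relative humidity. The sketch below is illustrative only, using the Tetens approximation; the paper's exact input rescalings may differ, and the function names are hypothetical.

```python
import numpy as np

def saturation_vapor_pressure(T_kelvin):
    """Tetens approximation for saturation vapor pressure (Pa) over water."""
    T_c = T_kelvin - 273.15
    return 610.78 * np.exp(17.27 * T_c / (T_c + 237.3))

def climate_invariant_humidity(q, T, p):
    """Map specific humidity q (kg/kg) at temperature T (K) and pressure
    p (Pa) to relative humidity: unlike q, its range stays roughly fixed
    across warm and cold climates, so an ML model trained on it can
    extrapolate more safely to unseen climate regimes."""
    e = q * p / (0.622 + 0.378 * q)   # vapor pressure from specific humidity
    return e / saturation_vapor_pressure(T)
```

The same moisture content maps to a lower relative humidity in a warmer atmosphere, which is exactly the property that keeps the transformed feature in-distribution across climates.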
Free, publicly-accessible full text available February 7, 2025
Physical parameterizations (or closures) are used as representations of unresolved subgrid processes within weather and global climate models or coarse-scale turbulent models, whose resolutions are too coarse to resolve small-scale processes. These parameterizations are typically grounded on physically based, yet empirical, representations of the underlying small-scale processes. Machine-learning-based parameterizations have recently been proposed as an alternative and have shown great promise in reducing uncertainties associated with the parameterization of small-scale processes. Yet these approaches still show important mismatches that are often attributed to the stochasticity of the considered process. This stochasticity can be due to coarse temporal resolution, unresolved variables, or simply the inherent chaotic nature of the process. To address these issues, we propose a new type of parameterization (closure) built with memory-based neural networks, which accounts for the non-instantaneous response of the closure and enhances its stability and prediction accuracy. We apply the proposed memory-based parameterization, with a differentiable solver, to the Lorenz '96 model at coarse temporal resolution and show its capacity to produce skillful long-horizon forecasts of the resolved variables compared to instantaneous parameterizations. This approach paves the way for the use of memory-based parameterizations for closure problems.
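The key structural difference from an instantaneous closure is a hidden state that carries the history of the resolved variable. A minimal recurrent-cell sketch makes this concrete; the class name is hypothetical and the weights here are random placeholders, whereas in the paper's setting they would be trained end-to-end through a differentiable solver.

```python
import numpy as np

class MemoryClosure:
    """Minimal sketch of a memory-based closure: a small recurrent cell
    whose hidden state lets the predicted subgrid tendency depend on the
    history of the resolved state, not only its instantaneous value."""
    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.Wx = rng.normal(0, 0.5, (n_hidden, n_in))
        self.Wh = rng.normal(0, 0.5, (n_hidden, n_hidden))
        self.Wo = rng.normal(0, 0.5, (n_in, n_hidden))
        self.h = np.zeros(n_hidden)

    def step(self, x):
        # The hidden state accumulates memory of past resolved states,
        # so repeated identical inputs yield different tendencies.
        self.h = np.tanh(self.Wx @ x + self.Wh @ self.h)
        return self.Wo @ self.h  # predicted subgrid tendency
```

Feeding the same resolved state twice returns two different tendencies, which is exactly the non-instantaneous behavior that a memoryless map cannot represent.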
Free, publicly-accessible full text available June 12, 2025
We provide MetaFlux, a global, long-term carbon-flux dataset of gross primary production and ecosystem respiration generated using meta-learning. The idea behind meta-learning stems from the need to learn efficiently from sparse data: by learning how to learn broad features across tasks, a model can better infer other, poorly sampled ones. Using a meta-trained ensemble of deep models, we generate global carbon products at daily and monthly timescales and a 0.25-degree spatial resolution from 2001 to 2021, through a combination of reanalysis and remote-sensing products. Site-level validation finds that MetaFlux ensembles have 5–7% lower validation error than their non-meta-trained counterparts. In addition, they are more robust to extreme observations, with 4–24% lower errors. We also checked the seasonality, interannual variability, and correlation to solar-induced fluorescence of the upscaled product and found that MetaFlux outperformed other machine-learning-based carbon products, especially in the tropics and semi-arid regions, by 10–40%. Overall, MetaFlux can be used to study a wide range of biogeochemical processes.
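The "learning how to learn across tasks" idea can be sketched with a Reptile-style meta-training loop, which is a stand-in, not necessarily the paper's algorithm. Each task below is a small (X, y) regression problem (think: flux estimation at one site), and the meta-weights are nudged toward weights adapted on each sampled task so that a few gradient steps suffice on a new, sparsely sampled site. The function name and the linear-model setting are illustrative assumptions.

```python
import numpy as np

def reptile_meta_train(tasks, n_meta=100, inner_steps=5,
                       lr=0.05, meta_lr=0.5, seed=0):
    """Illustrative Reptile-style meta-learning on linear regression
    tasks. Each task is an (X, y) pair; the meta-weights w are moved
    toward the task-adapted weights after a few inner SGD steps."""
    rng = np.random.default_rng(seed)
    d = tasks[0][0].shape[1]
    w = np.zeros(d)
    for _ in range(n_meta):
        X, y = tasks[rng.integers(len(tasks))]  # sample one task (site)
        w_task = w.copy()
        for _ in range(inner_steps):            # inner-loop adaptation
            grad = X.T @ (X @ w_task - y) / len(y)
            w_task -= lr * grad
        w += meta_lr * (w_task - w)             # Reptile meta-update
    return w
```

When the tasks share underlying structure (here, a common weight vector plus per-task perturbations), the meta-trained initialization lands near that shared solution, which is what makes adaptation to a poorly sampled new task cheap.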