Abstract Global climate models (GCMs) and Earth system models (ESMs) exhibit biases, and their resolutions are too coarse to capture the local variability needed for fine-scale, reliable drought and climate impact assessment. However, conventional bias correction approaches can produce implausible climate change signals because they represent spatial and intervariable dependences unrealistically. While purely data-driven deep learning approaches have achieved significant progress in improving climate and Earth system simulations and predictions, they cannot reliably learn conditions (e.g., extremes) that are largely unseen in the historical climate but are likely to become more frequent in the future climate (i.e., climate non-stationarity). This study presents an integrated trend-preserving deep learning approach that addresses both the spatial and intervariable dependence issues and climate non-stationarity when downscaling and bias correcting GCMs/ESMs. Here we combine the super-resolution deep residual network (SRDRN) with trend-preserving quantile delta mapping (QDM) to downscale and bias correct six primary climate variables at once (daily precipitation, maximum temperature, minimum temperature, relative humidity, solar radiation, and wind speed) from five state-of-the-art GCMs/ESMs in the Coupled Model Intercomparison Project Phase 6 (CMIP6). We found that the SRDRN-QDM approach greatly reduced GCM/ESM biases in spatial and intervariable dependences while reducing biases in extremes significantly better than purely data-driven deep learning. Drought estimates based on the six bias-corrected and downscaled variables captured the observed drought intensity and frequency and outperformed state-of-the-art multivariate bias correction approaches, demonstrating the approach's capability to correct GCM/ESM biases in spatial and multivariable dependences and in extremes.
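QDM, the trend-preserving component named above, is a published algorithm (Cannon et al., 2015): each future model value is mapped to the observed value at the same quantile, then the model's own projected change at that quantile is re-applied. A minimal univariate sketch of that logic (not the authors' implementation):

```python
import numpy as np

def qdm(obs_hist, mod_hist, mod_fut, kind="+"):
    """Univariate quantile delta mapping (after Cannon et al., 2015)."""
    n = len(mod_fut)
    # Non-exceedance probability of each future value within the future distribution
    tau = (np.argsort(np.argsort(mod_fut)) + 0.5) / n
    obs_q = np.quantile(obs_hist, tau)    # observed value at the same quantile
    hist_q = np.quantile(mod_hist, tau)   # historical model value at the same quantile
    if kind == "*":
        # Multiplicative delta preserves relative changes (e.g., precipitation;
        # zeros in hist_q would need special handling in practice)
        return obs_q * mod_fut / hist_q
    # Additive delta preserves absolute changes (e.g., temperature)
    return obs_q + (mod_fut - hist_q)
```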
Scikit-downscale: an open source Python package for scalable climate downscaling
Climate data from Earth System Models are increasingly being used to study the impacts of climate change on a broad range of biogeophysical systems (forest fires, fisheries, etc.) and human systems (reservoir operations, urban heat waves, etc.). Before these data can be used to study many of these systems, post-processing steps commonly referred to as bias correction and statistical downscaling must be performed. “Bias correction” is used to correct persistent biases in climate model output, and “statistical downscaling” is used to increase the spatiotemporal resolution of the model output (e.g., from 1 deg to 1/16th deg grid boxes). For our purposes, we’ll refer to both parts as “downscaling”. In the past few decades, the applications community has developed a plethora of downscaling methods. Many of these methods are ad hoc collections of post-processing routines, while others target very specific applications. The proliferation of downscaling methods has left the climate applications community with an overwhelming body of research to sort through, without much in the way of synthesis to guide method selection or applicability. Motivated by the pressing socio-environmental challenges of climate change, and with the lessons from previous downscaling efforts in mind, we have begun working on a community-centered open framework for climate downscaling: scikit-downscale. We believe the community will benefit from a well-designed open source downscaling toolbox with standard interfaces, alongside a repository of benchmark data to test and evaluate new and existing downscaling methods. In this notebook, we provide an overview of the scikit-downscale project, detailing how it can be used to downscale a range of surface climate variables such as air temperature and precipitation. We also highlight how the scikit-downscale framework is being used to compare existing methods and how it can be extended to support the development of new downscaling methods.
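The "standard interfaces" the abstract refers to follow scikit-learn's fit/predict pattern. A sketch of that workflow for one grid point; the import path and class name follow the project's early examples but should be treated as assumptions rather than a stable API reference, and the file names are hypothetical:

```python
import pandas as pd
from skdownscale.pointwise_models import BcsdTemperature  # assumed import path

# Coarse model output (X) and co-located observations (y), daily tmax at one point
X_train = pd.read_csv("gcm_tmax_hist.csv", index_col=0, parse_dates=True)
y_train = pd.read_csv("obs_tmax.csv", index_col=0, parse_dates=True)
X_future = pd.read_csv("gcm_tmax_future.csv", index_col=0, parse_dates=True)

model = BcsdTemperature()             # one method behind a common estimator interface
model.fit(X_train, y_train)           # calibrate over the shared historical period
downscaled = model.predict(X_future)  # apply the learned mapping to new data
```

Because every method sits behind the same estimator interface, swapping methods for benchmarking is a one-line change.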
- Award ID(s): 1937136
- PAR ID: 10284442
- Date Published:
- Journal Name: 2020 EarthCube Annual Meeting
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract—Numerical simulation of weather is resolution-constrained due to the high computational cost of integrating the coupled PDEs that govern atmospheric motion. For example, the most highly resolved numerical weather prediction models are limited to approximately 3 km. However, many weather and climate impacts occur over much finer scales, especially in urban areas and regions with high topographic complexity such as mountains or coastal regions. Thus, several statistical methods have been developed in the climate community to downscale numerical model output to finer resolutions. This is conceptually similar to image super-resolution (SR) [1], and in this work we report the results of applying SR methods to the downscaling problem. In particular, we test the extent to which an SR method based on a Generative Adversarial Network (GAN) can recover a grid of wind speed from an artificially downsampled version, compared against a standard bicubic upsampling approach and another machine learning based approach, SR-CNN [1]. We use ESRGAN [2] to learn to downscale wind speeds by a factor of 4 from a coarse grid. We find that we can recover spatial details with higher fidelity than bicubic upsampling or SR-CNN. The bicubic and SR-CNN methods perform better than ESRGAN on coarse metrics such as MSE. However, the high-frequency power spectrum is captured remarkably well by the ESRGAN, virtually identical to that of the real data, while the bicubic and SR-CNN fidelity drops significantly at high frequency. This indicates that GAN-based SR is considerably better at matching the higher-order statistics of the dataset, consistent with the observation that the generated images are of superior visual quality compared with SR-CNN.
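The spectral diagnostic described here is typically computed as a radially averaged FFT power spectrum, so that the spectra of the reference and super-resolved fields can be compared wavenumber by wavenumber. A minimal sketch, with `true_wind` and `sr_wind` as hypothetical 2D arrays:

```python
import numpy as np

def radial_power_spectrum(field):
    """Radially averaged 2D power spectrum: a common check of whether
    super-resolved output reproduces the high-wavenumber (fine-scale)
    variability of the reference field."""
    ny, nx = field.shape
    power = np.abs(np.fft.fftshift(np.fft.fft2(field))) ** 2
    ky, kx = np.indices((ny, nx))
    k = np.hypot(kx - nx // 2, ky - ny // 2).astype(int)  # radial wavenumber bin
    sums = np.bincount(k.ravel(), weights=power.ravel())
    counts = np.maximum(np.bincount(k.ravel()), 1)        # guard empty bins
    return sums / counts

# e.g., compare radial_power_spectrum(true_wind) against
# radial_power_spectrum(sr_wind) at high wavenumbers
```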
-
Abstract. Systematic biases and coarse resolutions are major limitations of current precipitation datasets. Many deep learning (DL)-based studies have been conducted for precipitation bias correction and downscaling. However, it is still challenging for current approaches to handle the complex features of hourly precipitation, leaving them unable to reproduce small-scale features such as extreme events. This study developed a customized DL model that incorporates customized loss functions, multitask learning, and physically relevant covariates to bias correct and downscale hourly precipitation data. We designed six scenarios to systematically evaluate the added value of weighted loss functions, multitask learning, and atmospheric covariates relative to regular DL and statistical approaches. The models were trained and tested using the Modern-Era Retrospective analysis for Research and Applications version 2 (MERRA-2) reanalysis and Stage IV radar observations over the northern coastal region of the Gulf of Mexico on an hourly time scale. We found that all scenarios with weighted loss functions performed notably better than the scenarios with conventional loss functions and a quantile mapping-based approach at hourly, daily, and monthly time scales as well as for extremes. Multitask learning improved the capture of fine features of extreme events, and accounting for atmospheric covariates greatly improved model performance at hourly and aggregated time scales, although the improvement was not as large as that from weighted loss functions. We show that the customized DL model can better downscale and bias correct hourly precipitation datasets and provide improved precipitation estimates at fine spatial and temporal resolutions where regular DL and statistical methods struggle.
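One plausible form of the weighted losses described here is an intensity-weighted MSE, where heavier observed rainfall receives a larger weight so that rare extremes are not drowned out by the abundant dry and drizzle pixels. A sketch under that assumption (the paper's exact weighting scheme may differ):

```python
import torch

def weighted_mse(pred, target, gamma=0.5):
    """Intensity-weighted MSE: up-weights heavy-rain pixels so the loss is
    not dominated by the far more numerous near-zero values. gamma controls
    how quickly the weight grows with observed intensity (an assumption)."""
    weights = 1.0 + target.clamp(min=0.0) ** gamma
    return (weights * (pred - target) ** 2).mean()
```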
-
The impacts of climate change are felt by most critical systems, such as infrastructure, ecological systems, and power plants. However, contemporary Earth System Models (ESMs) are run at spatial resolutions too coarse to assess such localized effects. Local-scale projections can be obtained using statistical downscaling, a technique which uses historical climate observations to learn a low-resolution to high-resolution mapping. The spatio-temporal nature of the climate system motivates the adaptation of super-resolution image processing techniques to statistical downscaling. In our work, we present DeepSD, a generalized stacked super-resolution convolutional neural network (SRCNN) framework with multi-scale input channels for statistical downscaling of climate variables. We compare DeepSD to four state-of-the-art methods for downscaling daily precipitation from 1 degree (~100 km) to 1/8 degree (~12.5 km) over the continental United States. Furthermore, we discuss a framework using the NASA Earth Exchange (NEX) platform for downscaling more than 20 ESM models with multiple emission scenarios.
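The stacked, multi-channel design can be pictured as a chain of SRCNN stages, each refining an interpolated coarse field with high-resolution auxiliary channels such as elevation. A sketch of one such stage (the layer sizes follow the original SRCNN; this is an illustration, not the authors' exact architecture):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SRStage(nn.Module):
    """One stage of a stacked SRCNN: coarse precipitation is interpolated to
    the next grid and refined using high-resolution elevation as an extra
    input channel."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 64, 9, padding=4), nn.ReLU(),  # 2 channels: precip + elevation
            nn.Conv2d(64, 32, 1), nn.ReLU(),
            nn.Conv2d(32, 1, 5, padding=2),
        )

    def forward(self, precip_lr, elev_hr):
        # Upsample coarse precipitation to the elevation grid, then refine
        x = F.interpolate(precip_lr, size=elev_hr.shape[-2:], mode="bilinear",
                          align_corners=False)
        return self.net(torch.cat([x, elev_hr], dim=1))

# Stacking three 2x stages takes 1 degree down to 1/8 degree, each stage
# conditioned on elevation at its own target resolution.
```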
-
Abstract The decline in snowpack across the western United States is one of the most pressing threats posed by climate change to regional economies and livelihoods. Earth system models are important tools for exploring past and future snowpack variability, yet their coarse spatial resolutions distort local topography and bias spatial patterns of accumulation and ablation. Here, we explore pattern-based statistical downscaling for spatially continuous interannual snowpack estimates. We find that a few leading patterns capture the majority of snowpack variability across the western US in observations, reanalyses, and free-running simulations. Pattern-based downscaling methods yield accurate, high-resolution maps that correct mean and variance biases in domain-wide simulated snowpack. Methods that use large-scale patterns as both predictors and predictands perform better than those that do not, and all are superior to an interpolation-based “delta change” approach. These findings suggest that pattern-based methods are appropriate for downscaling interannual snowpack variability and that using physically meaningful large-scale patterns is more important than the details of any particular downscaling method.
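One simple realization of the pattern-based idea is to treat the leading PCA patterns of observed snowpack as predictands: regress their amplitudes on the coarse simulation, then reconstruct high-resolution maps from the patterns. A sketch under placeholder data (random arrays standing in for real fields; not the authors' specific method):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Hypothetical placeholder data: rows are years, columns are grid cells
rng = np.random.default_rng(0)
obs_swe = rng.random((40, 5000))       # high-resolution observed snowpack
model_coarse = rng.random((40, 200))   # coarse simulated predictors

pca = PCA(n_components=4).fit(obs_swe)           # a few patterns carry most variance
pcs = pca.transform(obs_swe)                     # observed pattern amplitudes
reg = LinearRegression().fit(model_coarse, pcs)  # coarse fields -> pattern amplitudes
swe_hi = pca.inverse_transform(reg.predict(model_coarse))  # high-resolution maps
```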

