
Title: Generating High Resolution Climate Change Projections through Single Image Super-Resolution: An Abridged Version
The impacts of climate change are felt by most critical systems, such as infrastructure, ecological systems, and power plants. However, contemporary Earth System Models (ESMs) are run at spatial resolutions too coarse to assess effects at such local scales. Local-scale projections can be obtained using statistical downscaling, a technique which uses historical climate observations to learn a low-resolution to high-resolution mapping. The spatio-temporal nature of the climate system motivates the adaptation of super-resolution image-processing techniques to statistical downscaling. In our work, we present DeepSD, a generalized stacked super-resolution convolutional neural network (SRCNN) framework with multi-scale input channels for statistical downscaling of climate variables. We compare DeepSD to four state-of-the-art methods for downscaling daily precipitation from 1 degree (~100 km) to 1/8 degree (~12.5 km) over the Continental United States. Furthermore, a framework using the NASA Earth Exchange (NEX) platform is discussed for downscaling more than 20 ESMs with multiple emission scenarios.
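The stacked-SR idea can be sketched at a high level: each stage upsamples the coarse precipitation field, stacks it with a static high-resolution auxiliary channel, and applies convolutions; stages are then chained to reach the target resolution. A minimal, untrained NumPy sketch follows — the kernel size, the random (untrained) weights, and the use of elevation as the auxiliary channel are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

def upsample(field, factor):
    # nearest-neighbor upsampling as a simple stand-in for interpolation
    return np.repeat(np.repeat(field, factor, axis=0), factor, axis=1)

def conv2d(x, kernel):
    # 'same' 2D convolution of a single channel via zero padding
    kh, kw = kernel.shape
    xp = np.pad(x, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * kernel)
    return out

def srcnn_stage(lr_precip, elevation, factor=2, seed=0):
    # one stacked-SR stage: upsample precipitation, stack it with an
    # elevation channel at the target resolution, and apply one
    # (untrained, randomly initialized) convolutional "layer"
    up = upsample(lr_precip, factor)
    channels = np.stack([up, elevation])          # multi-scale inputs
    rng = np.random.default_rng(seed)
    kernels = rng.normal(size=(2, 3, 3)) * 0.1    # untrained weights
    return sum(conv2d(c, k) for c, k in zip(channels, kernels))
```

In a trained system, several such stages would be stacked (e.g., two 2x stages for a 4x resolution increase), with the weights learned from paired low- and high-resolution fields.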
Award ID(s):
1735505
PAR ID:
10111432
Author(s) / Creator(s):
Date Published:
Journal Name:
International Joint Conferences on Artificial Intelligence Organization
Page Range / eLocation ID:
5389 to 5393
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Climate data from Earth System Models are increasingly being used to study the impacts of climate change on a broad range of biogeophysical (forest fires, fisheries, etc.) and human systems (reservoir operations, urban heat waves, etc.). Before these data can be used to study many of these systems, post-processing steps commonly referred to as bias correction and statistical downscaling must be performed. “Bias correction” is used to correct persistent biases in climate model output and “statistical downscaling” is used to increase the spatiotemporal resolution of the model output (e.g., from 1° to 1/16° grid boxes). For our purposes, we’ll refer to both parts as “downscaling”. In the past few decades, the applications community has developed a plethora of downscaling methods. Many of these methods are ad hoc collections of post-processing routines, while others target very specific applications. The proliferation of downscaling methods has left the climate applications community with an overwhelming body of research to sort through, without much in the way of synthesis to guide method selection or applicability. Motivated by the pressing socio-environmental challenges of climate change, and with the learnings from previous downscaling efforts in mind, we have begun working on a community-centered open framework for climate downscaling: scikit-downscale. We believe that the community will benefit from a well-designed open-source downscaling toolbox with standard interfaces, alongside a repository of benchmark data to test and evaluate new and existing downscaling methods. In this notebook, we provide an overview of the scikit-downscale project, detailing how it can be used to downscale a range of surface climate variables such as air temperature and precipitation. We also highlight how the scikit-downscale framework is being used to compare existing methods and how it can be extended to support the development of new downscaling methods.
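As a concrete illustration of the bias-correction half of the workflow described above, here is a minimal quantile-mapping corrector with a scikit-learn-style fit/predict interface. This is an illustrative sketch of the general technique, not the actual scikit-downscale API:

```python
import numpy as np

class QuantileMapper:
    """Minimal quantile-mapping bias correction (illustrative sketch)."""

    def __init__(self, n_quantiles=100):
        self.q = np.linspace(0, 1, n_quantiles)

    def fit(self, model_hist, obs_hist):
        # estimate the empirical quantiles of model output and observations
        # over a common historical period
        self.model_q = np.quantile(model_hist, self.q)
        self.obs_q = np.quantile(obs_hist, self.q)
        return self

    def predict(self, model_values):
        # map each model value through its quantile onto the observed
        # distribution (np.interp clamps values outside the fitted range)
        return np.interp(model_values, self.model_q, self.obs_q)
```

A toolbox with this kind of standard interface makes it straightforward to swap methods and benchmark them against one another on shared data.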
  2. Abstract Extreme winds associated with tropical cyclones (TCs) can cause significant loss of life and economic damage globally, highlighting the need for accurate, high-resolution modeling and forecasting of TC winds. However, due to their coarse horizontal resolution, most global climate and weather models suffer from chronic underprediction of TC wind speeds, limiting their use for impact analysis and energy modeling. In this study, we introduce a cascading deep learning framework designed to downscale high-resolution TC wind fields given low-resolution data. Our approach maps 85 TC events from ERA5 data (0.25° resolution) to high-resolution (0.05° resolution) observations at 6-hr intervals. The initial component is a debiasing neural network designed to model accurate wind speed observations using ERA5 data. The second component employs a generative super-resolution strategy based on a conditional denoising diffusion probabilistic model (DDPM) to enhance the spatial resolution and to produce ensemble estimates. The framework accurately captures intensity and produces realistic radial profiles and fine-scale spatial structures of wind fields, with a percentage mean bias of −3.74% compared to the high-resolution observations. Our downscaling framework enables the prediction of high-resolution wind fields using widely available low-resolution and intensity wind data, allowing for the modeling of past events and the assessment of future TC risks.
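The cascade's two-stage structure (debias, then stochastically super-resolve into an ensemble) can be sketched schematically. Below, a least-squares linear correction stands in for the debiasing network, and noisy copies of a naively upsampled field stand in for the DDPM's ensemble of super-resolved estimates; both stand-ins, the factor-of-5 upsampling (0.25° to 0.05°), and the ensemble size are illustrative assumptions, not the paper's method:

```python
import numpy as np

def fit_debias(model_winds, obs_winds):
    # stage 1 stand-in: a per-domain linear correction fit by least
    # squares (the paper uses a neural network; this is only structural)
    A = np.vstack([model_winds.ravel(), np.ones(model_winds.size)]).T
    coef, *_ = np.linalg.lstsq(A, obs_winds.ravel(), rcond=None)
    return coef  # (slope, intercept)

def downscale_ensemble(lr_wind, coef, factor=5, n_members=4, seed=0):
    # stage 2 stand-in: upsample the debiased field and perturb it to
    # mimic a generative model's ensemble of high-resolution estimates
    slope, intercept = coef
    debiased = slope * lr_wind + intercept
    hr = np.kron(debiased, np.ones((factor, factor)))  # naive upsampling
    rng = np.random.default_rng(seed)
    return hr + rng.normal(scale=0.5, size=(n_members, *hr.shape))
```

The key design point the sketch preserves is that debiasing and resolution enhancement are separate, composable stages, and that the second stage is stochastic, yielding an ensemble rather than a single estimate.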
  3. Abstract The decline in snowpack across the western United States is one of the most pressing threats posed by climate change to regional economies and livelihoods. Earth system models are important tools for exploring past and future snowpack variability, yet their coarse spatial resolutions distort local topography and bias spatial patterns of accumulation and ablation. Here, we explore pattern-based statistical downscaling for spatially-continuous interannual snowpack estimates. We find that a few leading patterns capture the majority of snowpack variability across the western US in observations, reanalyses, and free-running simulations. Pattern-based downscaling methods yield accurate, high resolution maps that correct mean and variance biases in domain-wide simulated snowpack. Methods that use large-scale patterns as both predictors and predictands perform better than those that do not and all are superior to an interpolation-based “delta change” approach. These findings suggest that pattern-based methods are appropriate for downscaling interannual snowpack variability and that using physically meaningful large-scale patterns is more important than the details of any particular downscaling method. 
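The pattern-based idea — that a few leading spatial patterns capture most interannual snowpack variability — can be illustrated with an EOF-style decomposition: extract leading patterns from a year-by-gridcell anomaly matrix via SVD, then reconstruct high-resolution maps from pattern amplitudes. This is a generic sketch of the approach, not the specific downscaling methods compared in the paper:

```python
import numpy as np

def leading_patterns(snow_anomalies, k=3):
    # EOF-style decomposition: rows are years, columns are grid cells;
    # the rows of Vt are the leading spatial patterns
    U, S, Vt = np.linalg.svd(snow_anomalies, full_matrices=False)
    return Vt[:k]

def pattern_downscale(pattern_amplitudes, hr_patterns):
    # reconstruct high-resolution maps as a linear combination of
    # high-resolution patterns weighted by large-scale amplitudes
    return pattern_amplitudes @ hr_patterns
```

In practice the amplitudes would come from projecting coarse, free-running simulations onto the patterns, while the patterns themselves are learned at high resolution from observations or reanalyses — which is what lets the reconstruction correct mean and variance biases.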
  4. Climate and weather data such as precipitation derived from Global Climate Models (GCMs) and satellite observations are essential for global and local hydrological assessment. However, most popular precipitation products (with spatial resolutions coarser than 10 km) are too coarse for local impact studies and require “downscaling” to obtain higher resolutions. Traditional precipitation downscaling methods, such as statistical and dynamic downscaling, require additional meteorological variables as input, and very few are applicable for downscaling hourly precipitation to higher spatial resolution. Based on dynamic dictionary learning, we propose a new downscaling method, PreciPatch, to address this challenge by producing spatially distributed higher-resolution precipitation fields with only precipitation input from GCMs, at hourly temporal resolution and over a large geographical extent. Using aggregated Integrated Multi-satellitE Retrievals for GPM (IMERG) data, an experiment was conducted to evaluate the performance of PreciPatch in comparison with bicubic interpolation, RainFARM (a stochastic downscaling method), and DeepSD (a Super-Resolution Convolutional Neural Network (SRCNN)-based downscaling method). PreciPatch demonstrates better performance than the other methods for downscaling short-duration precipitation events (using historical data from 2014 to 2017 as the training set to estimate high-resolution hourly events in 2018).
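The core mechanic of dictionary-based super-resolution is a paired lookup: low-resolution patches are matched against a dictionary of low-resolution atoms, and each match emits the paired high-resolution atom. A minimal nearest-neighbor sketch of that mechanic (PreciPatch's dynamic dictionary learning is more sophisticated; the dictionary shapes and the plain Euclidean match here are illustrative assumptions):

```python
import numpy as np

def match_patch(lr_patch, lr_dict, hr_dict):
    # nearest-neighbor patch lookup: find the low-resolution dictionary
    # atom closest (in Euclidean distance) to the input patch and return
    # its paired high-resolution atom
    dists = np.linalg.norm(lr_dict - lr_patch.ravel(), axis=1)
    return hr_dict[np.argmin(dists)]
```

A full reconstruction would tile the coarse field into overlapping patches, look each one up, and blend the returned high-resolution atoms back into a continuous field.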
  5. Abstract Cities need climate information to develop resilient infrastructure and to make adaptation decisions. The information desired is at scales orders of magnitude finer than what is typically available from climate analyses and future projections. Urban downscaling refers to developing such climate information at the city (order of 1 – 10 km) and neighborhood (order of 0.1 – 1 km) resolutions from coarser climate products. Developing these higher-resolution (finer grid spacing) data needed for assessments, which typically cover multiyear climatologies of past data and future projections, is complex and computationally expensive for traditional physics-based dynamical models. In this study, we develop and adopt a novel approach for urban downscaling by generating a general-purpose operator using deep learning. This ‘DownScaleBench’ tool can aid the process of downscaling to any location. The DownScaleBench has been generalized for both in situ (ground-based) and satellite or reanalysis gridded data. The algorithm employs an iterative super-resolution convolutional neural network (Iterative SRCNN) over the city. We apply this to the development of a high-resolution gridded precipitation product (300 m) from a relatively coarse (10 km) satellite-based product (JAXA GSMaP). The high-resolution gridded precipitation dataset is compared against in situ observations for past heavy rain events over Austin, Texas, and shows marked improvement over the coarser dataset and over cubic interpolation as a baseline. The creation of the DownScaleBench has implications for generating high-resolution gridded urban meteorological datasets and aiding the planning process for climate-ready cities.
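The "iterative" part of an iterative SRCNN is simply a loop: apply a small fixed upsampling factor repeatedly until the grid spacing approaches the target (here, roughly 10 km down toward 300 m). A structural sketch, where naive upsampling stands in for each trained SRCNN stage and the step factor and stopping rule are illustrative assumptions:

```python
import numpy as np

def iterative_downscale(field, coarse_km=10.0, target_km=0.3, step=2):
    # iteratively refine the grid by a small factor per pass, stopping
    # before the spacing would drop below the target; each pass would
    # normally apply a trained SRCNN stage (np.kron is only a stand-in)
    res = coarse_km
    while res / step >= target_km:
        field = np.kron(field, np.ones((step, step)))
        res /= step
    return field, res
```

Chaining small-factor stages, rather than jumping straight from 10 km to 300 m in one step, keeps each stage's learning problem tractable and is a common design choice in super-resolution pipelines.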