The impacts of climate change are felt by most critical systems, such as infrastructure, ecological systems, and power plants. However, contemporary Earth System Models (ESMs) are run at spatial resolutions too coarse to assess effects this localized. Local-scale projections can be obtained using statistical downscaling, a technique that uses historical climate observations to learn a low-resolution to high-resolution mapping. The spatio-temporal nature of the climate system motivates adapting super-resolution image processing techniques to statistical downscaling. In this work, we present DeepSD, a generalized stacked super-resolution convolutional neural network (SRCNN) framework with multi-scale input channels for statistical downscaling of climate variables. We compare DeepSD to four state-of-the-art methods on downscaling daily precipitation from 1 degree (~100 km) to 1/8 degree (~12.5 km) over the continental United States. Furthermore, we discuss a framework using the NASA Earth Exchange (NEX) platform for downscaling more than 20 ESMs under multiple emission scenarios.
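To make the stacked-SRCNN idea concrete, here is a minimal, illustrative NumPy sketch of one super-resolution stage applied twice (upsample, then a few convolutions). The kernels are random placeholders for trained weights, nearest-neighbor upsampling stands in for bicubic, and DeepSD's auxiliary multi-scale channels (e.g. elevation) are omitted; this is a shape-level sketch, not the published implementation.

```python
import numpy as np

def upsample(x, factor):
    """Nearest-neighbor upsampling of a 2-D field (stand-in for bicubic)."""
    return np.kron(x, np.ones((factor, factor)))

def conv2d(x, kernel):
    """'Same' 2-D convolution with zero padding (naive loops for clarity)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * kernel)
    return out

def srcnn_stage(coarse, kernels, factor=2):
    """One SRCNN stage: upsample, then patch-extraction / mapping /
    reconstruction convolutions with ReLU on the hidden layers."""
    x = upsample(coarse, factor)
    for k, kernel in enumerate(kernels):
        x = conv2d(x, kernel)
        if k < len(kernels) - 1:
            x = np.maximum(x, 0.0)  # ReLU
    return x

rng = np.random.default_rng(0)
coarse = rng.random((8, 8))  # toy coarse-resolution precipitation tile
kernels = [rng.standard_normal((3, 3)) * 0.1 for _ in range(3)]
hi = srcnn_stage(srcnn_stage(coarse, kernels), kernels)  # two stacked x2 stages -> x4
print(hi.shape)  # (32, 32)
```

Stacking stages this way is what lets the framework bridge large resolution gaps with a sequence of modest (e.g. x2) super-resolution steps.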
- Award ID(s):
- 1735505
- NSF-PAR ID:
- 10111432
- Date Published:
- Journal Name:
- International Joint Conferences on Artificial Intelligence Organization
- Page Range / eLocation ID:
- 5389 to 5393
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Climate data from Earth System Models are increasingly being used to study the impacts of climate change on a broad range of biogeophysical systems (forest fires, fisheries, etc.) and human systems (reservoir operations, urban heat waves, etc.). Before these data can be used to study many of these systems, post-processing steps commonly referred to as bias correction and statistical downscaling must be performed. “Bias correction” corrects persistent biases in climate model output, and “statistical downscaling” increases the spatiotemporal resolution of the model output (e.g. from 1-degree to 1/16th-degree grid boxes). For our purposes, we refer to both steps as “downscaling”. In the past few decades, the applications community has developed a plethora of downscaling methods. Many of these methods are ad hoc collections of post-processing routines, while others target very specific applications. The proliferation of downscaling methods has left the climate applications community with an overwhelming body of research to sort through, without much synthesis to guide method selection or applicability. Motivated by the pressing socio-environmental challenges of climate change, and with the lessons from previous downscaling efforts in mind, we have begun working on a community-centered open framework for climate downscaling: scikit-downscale. We believe the community will benefit from a well-designed open-source downscaling toolbox with standard interfaces, alongside a repository of benchmark data for testing and evaluating new and existing downscaling methods. In this notebook, we provide an overview of the scikit-downscale project, detailing how it can be used to downscale a range of surface climate variables such as air temperature and precipitation.
We also highlight how the scikit-downscale framework is being used to compare existing methods and how it can be extended to support the development of new downscaling methods.
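As an illustration of the bias-correction step described above (this is a generic sketch of empirical quantile mapping, not scikit-downscale's actual API), one can map each model value to the observed value at the same quantile of the historical distributions:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping: replace each future model value with the
    observed value at the same quantile of the historical distributions."""
    # Quantile of each future value within the historical model distribution
    q = np.searchsorted(np.sort(model_hist), model_fut) / len(model_hist)
    q = np.clip(q, 0.0, 1.0)
    # Map that quantile onto the observed historical distribution
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(1)
obs = rng.gamma(2.0, 3.0, size=1000)  # synthetic "observed" precipitation
model = obs * 1.5 + 1.0               # model with a persistent wet bias
corrected = quantile_map(model, obs, model)
print(abs(corrected.mean() - obs.mean()) < 0.5)  # bias largely removed
```

A toolbox with a standard fit/predict-style interface would wrap exactly this kind of transform so that alternative bias-correction methods can be swapped in and benchmarked against each other.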
-
Earth system models (ESMs) are the primary tool used to understand and project changes to the climate system. ESM projections underpin analyses of human dimensions of the climate issue, yet little is known about how ESMs are used in human dimensions research. Such foundational information is necessary for future critical assessments of ESMs. We review applications of a leading ESM, the National Center for Atmospheric Research (NCAR) Community Earth System Model (CESM), to human dimensions topics since 2004. We find that this research has grown substantially over this period, twice as fast as CESM research overall. Although many studies have primarily addressed long‐term impacts on physical systems with societal relevance, applications to managed, societal, and ecological systems have grown quickly and now make up more than half of CESM human dimensions work. CESM applications focused nearly equally on global and regional analyses, most often using multimodel ensembles, although the use of single simulations remains prevalent. Downscaling and bias correction of output was infrequent and most common for regional studies. U.S.‐based, university‐affiliated authors primarily drove human dimensions work using CESM, with only 12% of authors based at NCAR. Our findings identify important questions that warrant further investigation, such as reasons for the infrequent use of downscaling and bias correction techniques; motivations to continue to use older model versions after newer model versions have been released; and model development needs for improved human dimensions applications. Additionally, our synthesis provides a baseline and framework that enables continued tracking of CESM and other ESMs.
This article is categorized under:
Assessing Impacts of Climate Change > Evaluating Future Impacts of Climate Change
-
Climate and weather data such as precipitation derived from Global Climate Models (GCMs) and satellite observations are essential for global and local hydrological assessment. However, most popular precipitation products (with spatial resolutions coarser than 10 km) are too coarse for local impact studies and require “downscaling” to obtain higher resolutions. Traditional precipitation downscaling methods, both statistical and dynamical, require additional meteorological variables as input, and very few are applicable for downscaling hourly precipitation to higher spatial resolution. Based on dynamic dictionary learning, we propose a new downscaling method, PreciPatch, to address this challenge by producing spatially distributed higher-resolution precipitation fields using only precipitation input from GCMs, at hourly temporal resolution and over a large geographical extent. Using aggregated Integrated Multi-satellitE Retrievals for GPM (IMERG) data, an experiment was conducted to evaluate the performance of PreciPatch against bicubic interpolation, RainFARM (a stochastic downscaling method), and DeepSD (a Super-Resolution Convolutional Neural Network (SRCNN) based downscaling method). PreciPatch demonstrates better performance than the other methods for downscaling short-duration precipitation events (using historical data from 2014 to 2017 as the training set to estimate high-resolution hourly events in 2018).
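The dynamic dictionary learning in PreciPatch is more involved, but the core patch-dictionary idea can be sketched simply: pair training coarse patches with their fine-resolution counterparts, then downscale a new coarse patch by looking up its nearest neighbor in the dictionary. All names, patch sizes, and the nearest-neighbor lookup here are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def build_dictionary(coarse_fields, fine_fields):
    """Pair each training coarse patch with its fine-resolution counterpart."""
    return list(zip(coarse_fields, fine_fields))

def downscale_patch(coarse_patch, dictionary):
    """Return the fine patch whose coarse counterpart best matches the input."""
    errs = [np.sum((coarse_patch - c) ** 2) for c, _ in dictionary]
    return dictionary[int(np.argmin(errs))][1]

rng = np.random.default_rng(2)
fine_train = [rng.random((8, 8)) for _ in range(50)]
# Coarse patches obtained by 4x block-averaging the fine patches
coarse_train = [f.reshape(2, 4, 2, 4).mean(axis=(1, 3)) for f in fine_train]
D = build_dictionary(coarse_train, fine_train)

query = coarse_train[7] + rng.normal(0, 1e-3, (2, 2))  # slightly perturbed input
print(downscale_patch(query, D).shape)  # (8, 8)
```

The training-period/evaluation-period split in the experiment (2014–2017 vs. 2018) corresponds to building the dictionary from historical pairs and querying it on unseen events.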
-
Abstract The decline in snowpack across the western United States is one of the most pressing threats posed by climate change to regional economies and livelihoods. Earth system models are important tools for exploring past and future snowpack variability, yet their coarse spatial resolutions distort local topography and bias spatial patterns of accumulation and ablation. Here, we explore pattern-based statistical downscaling for spatially continuous interannual snowpack estimates. We find that a few leading patterns capture the majority of snowpack variability across the western US in observations, reanalyses, and free-running simulations. Pattern-based downscaling methods yield accurate, high-resolution maps that correct mean and variance biases in domain-wide simulated snowpack. Methods that use large-scale patterns as both predictors and predictands perform better than those that do not, and all are superior to an interpolation-based “delta change” approach. These findings suggest that pattern-based methods are appropriate for downscaling interannual snowpack variability and that using physically meaningful large-scale patterns is more important than the details of any particular downscaling method.
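The pattern-based approach described above can be illustrated with a toy EOF-plus-regression example: extract the leading pattern of a coarse field via SVD, then regress each high-resolution grid cell onto the leading mode's time series. The rank-1 synthetic data below are an assumption made purely so the reconstruction is exact; real snowpack fields have many modes and noise.

```python
import numpy as np

rng = np.random.default_rng(3)
years, n_coarse, n_fine = 40, 25, 400

# Synthetic snowpack: one shared large-scale mode drives both resolutions
pc = rng.standard_normal(years)                       # leading-mode time series
coarse = np.outer(pc, rng.standard_normal(n_coarse))  # coarse-grid anomalies
fine = np.outer(pc, rng.standard_normal(n_fine))      # high-res anomalies

# Leading pattern of the coarse field via SVD (EOF analysis)
u, s, vt = np.linalg.svd(coarse, full_matrices=False)
pc_hat = u[:, 0] * s[0]                               # projected time series

# Regress each fine grid cell onto the leading mode's time series
beta = fine.T @ pc_hat / (pc_hat @ pc_hat)            # per-cell regression map
fine_hat = np.outer(pc_hat, beta)                     # downscaled reconstruction

err = np.linalg.norm(fine_hat - fine) / np.linalg.norm(fine)
print(err < 1e-6)  # exact up to round-off for rank-1 data
```

Because the large-scale mode, not the local grid, carries the predictive information, the same machinery applies whether the predictand is a station record or a continuous high-resolution map.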
-
Abstract Cities need climate information to develop resilient infrastructure and to support adaptation decisions. The information desired is orders of magnitude finer in scale than what is typically available from climate analyses and future projections. Urban downscaling refers to developing such climate information at the city (order of 1–10 km) and neighborhood (order of 0.1–1 km) resolutions from coarser climate products. Developing the higher-resolution (finer grid spacing) data needed for assessments, which typically cover a multiyear climatology of past data and future projections, is complex and computationally expensive for traditional physics-based dynamical models. In this study, we develop and adopt a novel approach for urban downscaling by generating a general-purpose operator using deep learning. This ‘DownScaleBench’ tool can aid the process of downscaling to any location. The DownScaleBench has been generalized for both in situ (ground-based) and satellite or reanalysis gridded data. The algorithm employs an iterative super-resolution convolutional neural network (Iterative SRCNN) over the city. We apply this to develop a high-resolution gridded precipitation product (300 m) from a relatively coarse (10 km) satellite-based product (JAXA GSMaP). The high-resolution gridded precipitation dataset is compared against in situ observations for past heavy rain events over Austin, Texas, and shows marked improvement over the coarser dataset and over cubic interpolation as a baseline. The creation of this DownScaleBench has implications for generating high-resolution gridded urban meteorological datasets and aiding the planning process for climate-ready cities.
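A hedged sketch of the control flow behind an iterative super-resolution scheme: resolution is doubled repeatedly until the target grid spacing is reached, with a learned refinement step applied at each doubling. The `refine` step below is an identity placeholder standing in for a trained SRCNN, and the grid sizes and spacings are illustrative, not the paper's configuration.

```python
import numpy as np

def refine(x):
    """A trained Iterative SRCNN would apply its learned network here;
    identity is used as a placeholder."""
    return x

def iterative_downscale(field, src_km, dst_km):
    """Repeatedly double resolution until the target grid spacing is reached."""
    km = src_km
    while km > dst_km:
        field = refine(np.kron(field, np.ones((2, 2))))  # x2 per iteration
        km /= 2.0
    return field, km

coarse = np.ones((4, 4))                 # toy 10 km coarse grid
hi, km = iterative_downscale(coarse, 10.0, 0.3)
print(hi.shape)  # (256, 256) after six x2 iterations
```

Going from ~10 km to ~300 m is roughly a 32x refinement, which is why an iterative chain of modest x2 steps is attractive compared with a single-shot super-resolution jump.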