Precipitation, especially convective precipitation, is highly associated with hydrological disasters (e.g., floods and droughts) that have negative impacts on agricultural productivity, society, and the environment. To mitigate these negative impacts, it is crucial to monitor the precipitation status in real time. The new Advanced Baseline Imager (ABI) onboard the GOES-16 satellite provides such a precipitation product at higher spatiotemporal and spectral resolutions, especially during the daytime. This research proposes a deep neural network (DNN) method to classify rainy and non-rainy clouds based on the brightness temperature differences (BTDs) and reflectances (Ref) derived from ABI. Convective and stratiform rain clouds are also separated using similar spectral parameters expressing the characteristics of cloud properties. The precipitation events used for training and validation are obtained from the IMERG V05B data, covering the southeastern coast of the U.S. during the 2018 rainy season. The performance of the proposed method is compared with traditional machine learning methods, including support vector machines (SVMs) and random forest (RF). For rainy area detection, the DNN method outperformed the other methods, with a critical success index (CSI) of 0.71 and a probability of detection (POD) of 0.86. For convective precipitation delineation, the DNN models also showed better performance, with a CSI of 0.58 and a POD of 0.72. This automatic cloud classification system could be deployed for extreme rainfall event detection, real-time forecasting, and decision-making support in rainfall-related disasters.
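The skill scores quoted above (CSI and POD) follow from a standard 2x2 contingency table of predicted versus observed rain. As a minimal, generic illustration of how such scores are computed from paired rain/no-rain masks (not the authors' code), consider:

```python
import numpy as np

def rain_skill_scores(predicted, observed):
    """POD and CSI from binary rain masks (True = rainy pixel).

    Generic illustration of the verification scores quoted in the
    abstract; not the authors' implementation.
    """
    hits = np.sum(predicted & observed)           # rain predicted and observed
    misses = np.sum(~predicted & observed)        # rain observed but missed
    false_alarms = np.sum(predicted & ~observed)  # rain predicted but not observed

    pod = hits / (hits + misses)                  # probability of detection
    csi = hits / (hits + misses + false_alarms)   # critical success index
    return pod, csi
```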
Low Cloud Detection in Multilayer Scenes Using Satellite Imagery with Machine Learning Methods
Abstract The detection of multilayer clouds in the atmosphere can be particularly challenging from passive visible and infrared imaging radiometers since cloud boundary information is limited primarily to the topmost cloud layer. Yet detection of low clouds in the atmosphere is important for a number of applications, including aviation nowcasting and general weather forecasting. In this work, we develop pixel-based machine learning–based methods of detecting low clouds, with a focus on improving detection in multilayer cloud situations and specific attention given to improving the Cloud Cover Layers (CCL) product, which assigns cloudiness in a scene into vertical bins. The random forest (RF) and neural network (NN) implementations use inputs from a variety of sources, including GOES Advanced Baseline Imager (ABI) visible radiances, infrared brightness temperatures, auxiliary information about the underlying surface, and relative humidity (which holds some utility as a cloud proxy). Training and independent validation enlist near-global, actively sensed cloud boundaries from the radar and lidar systems on board the CloudSat and CALIPSO satellites. We find that the RF and NN models have similar performances. The probability of detection (PoD) of low cloud increases from 0.685 to 0.815 when using the RF technique instead of the CCL methodology, while the false alarm ratio decreases. The improved PoD of low cloud is particularly notable for scenes that appear to be cirrus from an ABI perspective, increasing from 0.183 to 0.686. Various extensions of the model are discussed, including a nighttime-only algorithm and expansion to other satellite sensors. Significance Statement: Using satellites to detect the heights of clouds in the atmosphere is important for a variety of weather applications, including aviation weather forecasting. However, detecting low clouds can be challenging if there are other clouds above them. To address this, we have developed machine learning–based models that can be used with passive satellite instruments. These models use satellite observations at visible and infrared wavelengths, an estimate of relative humidity in the atmosphere, and geographic and surface-type information to predict whether low clouds are present. Our results show that these models have significant skill at predicting low clouds, even in the presence of higher cloud layers.
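A pixel-based detector of the kind described can be sketched with scikit-learn. The input file names, predictor set (ABI reflectances and brightness temperatures, a relative humidity proxy, surface-type information), and hyperparameters below are illustrative assumptions; the low-cloud labels are assumed to come from collocated CloudSat/CALIPSO cloud boundaries. This is a sketch of the general approach, not the paper's implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix

# Hypothetical inputs: one row per ABI pixel, with columns such as visible
# reflectance, IR brightness temperatures, relative humidity, and a
# surface-type flag; labels are 1 where active sensors report low cloud.
X = np.load("abi_pixel_features.npy")   # shape (n_pixels, n_features)
y = np.load("low_cloud_labels.npy")     # shape (n_pixels,)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

rf = RandomForestClassifier(n_estimators=200, n_jobs=-1, random_state=0)
rf.fit(X_train, y_train)

# Verification against the held-out pixels.
tn, fp, fn, tp = confusion_matrix(y_test, rf.predict(X_test)).ravel()
pod = tp / (tp + fn)   # probability of detection of low cloud
far = fp / (tp + fp)   # false alarm ratio
print(f"PoD = {pod:.3f}, FAR = {far:.3f}")
```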
- Award ID(s): 2019758
- PAR ID: 10512805
- Publisher / Repository: AMS Journals
- Date Published:
- Journal Name: Journal of Atmospheric and Oceanic Technology
- Volume: 39
- Issue: 3
- ISSN: 0739-0572
- Page Range / eLocation ID: 319 to 334
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract The terrestrial carbon cycle varies dynamically on hourly to weekly scales, making it difficult to observe. Geostationary ("weather") satellites like the Geostationary Operational Environmental Satellite-R Series (GOES-R) deliver near-hemispheric imagery at a ten-minute cadence. The Advanced Baseline Imager (ABI) aboard GOES-R measures visible and near-infrared spectral bands that can be used to estimate land surface properties and carbon dioxide flux. However, GOES-R data are designed for real-time dissemination and are difficult to link with eddy covariance time series of land-atmosphere carbon dioxide exchange. We compiled three-year time series of GOES-R land surface attributes including visible and near-infrared reflectances, land surface temperature (LST), and downwelling shortwave radiation (DSR) at 314 ABI fixed grid pixels containing eddy covariance towers. We demonstrate how best to combine satellite and in situ datasets and show how ABI attributes useful for ecosystem monitoring vary across space and time. By connecting observation networks that infer rapid changes to the carbon cycle, we can gain a richer understanding of the processes that control it.
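Pairing ABI-derived attributes with eddy covariance towers essentially amounts to extracting, for each tower, the time series of the grid cell that contains it. The sketch below does this with xarray over a hypothetical set of regridded attribute files; the file pattern, variable names, and tower list are assumptions for illustration, not the compiled dataset described above.

```python
import pandas as pd
import xarray as xr

# Hypothetical tower coordinates (site IDs and locations are placeholders).
towers = pd.DataFrame({
    "site": ["tower_a", "tower_b"],
    "lat": [41.2, 35.7],
    "lon": [-96.5, -83.5],
})

# Assume ABI attributes (NIR reflectance, LST, DSR) were already regridded
# to a regular lat/lon grid with a time dimension, one file per day.
ds = xr.open_mfdataset("abi_attributes_*.nc", combine="by_coords")

records = []
for _, row in towers.iterrows():
    # Nearest-neighbor extraction of the pixel containing the tower.
    pixel = ds.sel(lat=row["lat"], lon=row["lon"], method="nearest")
    df = pixel[["reflectance_nir", "lst", "dsr"]].to_dataframe()
    df["site"] = row["site"]
    records.append(df)

timeseries = pd.concat(records)  # long-format table: one row per site and time step
```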
-
Abstract. An ability to accurately detect convective regions is essential for initializing models for short-term precipitation forecasts. Radar data are commonly used to detect convection, but radars that provide high-temporal-resolution data are mostly available over land, and the quality of the data tends to degrade over mountainous regions. On the other hand, geostationary satellite data are available nearly anywhere and in near-real time. Current operational geostationary satellites, the Geostationary Operational Environmental Satellite-16 (GOES-16) and Satellite-17, provide high-spatial- and high-temporal-resolution data but only of cloud-top properties; 1 min data, however, allow us to observe convection from visible and infrared data even without vertical information of the convective system. Existing detection algorithms using visible and infrared data look for static features of convective clouds, such as overshooting tops or a lumpy cloud-top surface, or for cloud growth that occurs over periods of 30 min to an hour. This study represents a proof of concept that artificial intelligence (AI) is able, when given high-spatial- and high-temporal-resolution data from GOES-16, to learn physical properties of convective clouds and automate the detection process. A neural network model with convolutional layers is proposed to identify convection from the high-temporal-resolution GOES-16 data. The model takes five temporal images from channels 2 (0.65 µm) and 14 (11.2 µm) as inputs and produces a map of convective regions. In order to provide products comparable to the radar products, it is trained against Multi-Radar Multi-Sensor (MRMS), which is a radar-based product that uses a rather sophisticated method to classify precipitation types. The two channels from GOES-16, related to cloud optical depth (channel 2) and cloud-top height (channel 14), are expected to best represent features of convective clouds: high reflectance, a lumpy cloud-top surface, and low cloud-top temperature. The model has correctly learned those features of convective clouds and resulted in a reasonably low false alarm ratio (FAR) and high probability of detection (POD). However, FAR and POD can vary depending on the threshold, and a proper threshold needs to be chosen based on the purpose.
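The architecture described, five time steps from two ABI channels mapped to a per-pixel convection mask, can be sketched as a small fully convolutional network. The layer widths below are illustrative assumptions rather than the authors' configuration; the sketch only shows how the stacked temporal/spectral input and map-valued output fit together.

```python
import torch
import torch.nn as nn

class ConvectionNet(nn.Module):
    """Toy fully convolutional detector: five temporal images from two
    GOES-16 channels (stacked into 10 input channels) mapped to a
    per-pixel probability of convection. Layer widths are illustrative."""

    def __init__(self, n_times=5, n_channels=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_times * n_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 1, kernel_size=1),   # per-pixel logit
        )

    def forward(self, x):
        # x: (batch, n_times * n_channels, height, width)
        return torch.sigmoid(self.net(x))      # convection probability map

# Example: a batch of 4 scenes, 5 time steps x 2 channels, 256 x 256 pixels.
model = ConvectionNet()
probs = model(torch.randn(4, 10, 256, 256))    # -> (4, 1, 256, 256)
```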
-
Abstract. Environmental science is increasingly reliant on remotely sensed observations of the Earth's surface and atmosphere. Observations from polar-orbiting satellites have long supported investigations on land cover change, ecosystem productivity, hydrology, climate, the impacts of disturbance, and more, and are critical for extrapolating (upscaling) ground-based measurements to larger areas. However, the limited temporal frequency at which polar-orbiting satellites observe the Earth limits our understanding of rapidly evolving ecosystem processes, especially in areas with frequent cloud cover. Geostationary satellites have observed the Earth's surface and atmosphere at high temporal frequency for decades, and their imagers now have spectral resolutions in the visible and near-infrared regions that are comparable to commonly used polar-orbiting sensors like the Moderate Resolution Imaging Spectroradiometer (MODIS), Visible Infrared Imaging Radiometer Suite (VIIRS), or Landsat. These advances extend applications of geostationary Earth observations from weather monitoring to multiple disciplines in ecology and environmental science. We review a number of existing applications that use data from geostationary platforms and present upcoming opportunities for observing key ecosystem properties using high-frequency observations from the Advanced Baseline Imagers (ABI) on the Geostationary Operational Environmental Satellites (GOES), which routinely observe the Western Hemisphere every 5–15 min. Many of the existing applications in environmental science from ABI are focused on estimating land surface temperature, solar radiation, evapotranspiration, and biomass burning emissions, along with detecting rapid drought development and wildfire. Ongoing work in estimating vegetation properties and phenology from other geostationary platforms demonstrates the potential to expand ABI observations to estimate vegetation greenness, moisture, and productivity at a high temporal frequency across the Western Hemisphere. Finally, we present emerging opportunities to address the relatively coarse resolution of ABI observations through multisensor fusion to resolve landscape heterogeneity and to leverage observations from ABI to study the carbon cycle and ecosystem function at unprecedented temporal frequency.
-
The launch of the National Oceanic and Atmospheric Administration (NOAA)/National Aeronautics and Space Administration (NASA) Suomi National Polar-orbiting Partnership (S-NPP) and its follow-on NOAA Joint Polar Satellite Systems (JPSS) satellites marks the beginning of a new era of operational satellite observations of the Earth and atmosphere for environmental applications with high spatial resolution and sampling rate. The S-NPP and JPSS are equipped with five instruments, each with advanced design in Earth sampling, including the Advanced Technology Microwave Sounder (ATMS), the Cross-track Infrared Sounder (CrIS), the Ozone Mapping and Profiler Suite (OMPS), the Visible Infrared Imaging Radiometer Suite (VIIRS), and the Clouds and the Earth's Radiant Energy System (CERES). Among them, the ATMS is the new generation of microwave sounder measuring temperature profiles from the surface to the upper stratosphere and moisture profiles from the surface to the upper troposphere, while CrIS is the first of a series of advanced operational hyperspectral sounders providing more accurate atmospheric and moisture sounding observations with higher vertical resolution for weather and climate applications. The OMPS instrument measures solar backscattered ultraviolet to provide information on the concentrations of ozone in the Earth's atmosphere, and VIIRS provides global observations of a variety of essential environmental variables over the land, atmosphere, cryosphere, and ocean with visible and infrared imagery. The CERES instrument measures the solar energy reflected by the Earth, the longwave radiative emission from the Earth, and the role of cloud processes in the Earth's energy balance. Presently, observations from several instruments on S-NPP and JPSS-1 (re-named NOAA-20 after launch) provide near real-time monitoring of the environmental changes and improve weather forecasting by assimilation into numerical weather prediction models. Envisioning the need for consistencies in satellite retrievals, improving climate reanalyses, development of climate data records, and improving numerical weather forecasting, the NOAA/Center for Satellite Applications and Research (STAR) has been reprocessing the S-NPP observations for ATMS, CrIS, OMPS, and VIIRS through their life cycle. This article provides a summary of the instrument observing principles, data characteristics, reprocessing approaches, calibration algorithms, and validation results of the reprocessed sensor data records. The reprocessing generated consistent Level-1 sensor data records using unified and consistent calibration algorithms for each instrument that removed artificial jumps in data owing to operational changes, instrument anomalies, contaminations by anomaly views of the environment or spacecraft, and other causes. The reprocessed sensor data records were compared with and validated against other observations for a consistency check whenever such data were available. The reprocessed data will be archived in the NOAA data center with the same format as the operational data and technical support for data requests. Such a reprocessing is expected to improve the efficiency of the use of the S-NPP and JPSS satellite data and the accuracy of the observed essential environmental variables through either consistent satellite retrievals or use of the reprocessed data in numerical data assimilations.