
Search for: All records

Award ID contains: 2019758


  1. Abstract

    Many studies have aimed to identify novel storm characteristics that are indicative of current or future severe weather potential using a combination of ground-based radar observations and severe reports. However, this is often done on a small scale using limited case studies on the order of tens to hundreds of storms due to how time-intensive this process is. Herein, we introduce the GridRad-Severe dataset, a database including ∼100 severe weather days per year and upward of 1.3 million objectively tracked storms from 2010 to 2019. Composite radar volumes spanning objectively determined, report-centered domains are created for each selected day using the GridRad compositing technique, with dates objectively determined using report thresholds defined to capture the highest-end severe weather days from each year, evenly distributed across all severe report types (tornadoes, severe hail, and severe wind). Spatiotemporal domain bounds for each event are objectively determined to encompass both the majority of reports and the time of convection initiation. Severe weather reports are matched to storms that are objectively tracked using the radar data, so the evolution of the storm cells and their severe weather production can be evaluated. Herein, we apply storm mode (single-cell, multicell, or mesoscale convective system storms) and right-moving supercell classification techniques to the dataset, and revisit various questions about severe storms and their bulk characteristics posed and evaluated in past work. Additional applications of this dataset are reviewed for possible future studies.

     
  2. Abstract

    We present an overview of recent work on using artificial intelligence (AI)/machine learning (ML) techniques for forecasting convective weather and its associated hazards, including tornadoes, hail, wind, and lightning. These high-impact phenomena cause massive property damage and loss of life globally, yet they are very challenging to forecast. Given the recent explosion in developing ML techniques across the weather spectrum and the fact that the skillful prediction of convective weather has immediate societal benefits, we present a thorough review of the current state of the art in AI and ML techniques for convective hazards. Our review includes both traditional approaches, such as support vector machines and decision trees, and deep learning approaches. We highlight the challenges in developing ML approaches to forecast these phenomena across a variety of spatial and temporal scales. We end with a discussion of promising areas of future work for ML for convective weather, including the need to create trustworthy AI forecasts that can be used by forecasters in real time and the need for active cross-sector collaboration on testbeds to validate ML methods in operational situations.

    Significance Statement

    We provide an overview of recent machine learning research in predicting hazards from thunderstorms, specifically lightning, wind, hail, and tornadoes. These hazards kill people worldwide and destroy property and livestock. Improving the prediction of these events, both locally and globally, can save lives and property. By providing this review, we aim to spur additional research into developing machine learning approaches for convective hazard prediction.

     
  3. Abstract

    Over the past decade the use of machine learning in meteorology has grown rapidly. Specifically, neural networks and deep learning have been used at an unprecedented rate. To fill the dearth of resources covering neural networks with a meteorological lens, this paper discusses machine learning methods in a plain-language format targeted to the operational meteorological community. This is the second paper in a pair that aims to serve as a machine learning resource for meteorologists. While the first paper focused on traditional machine learning methods (e.g., random forests), here a broad spectrum of neural networks and deep learning methods is discussed. Specifically, this paper covers perceptrons, artificial neural networks, convolutional neural networks, and U-networks. Like the Part I paper, this manuscript discusses the terms associated with neural networks and their training. The manuscript then provides some intuition behind each method and concludes by showing every method applied to a meteorological example of diagnosing thunderstorms from satellite images (e.g., lightning flashes). This paper is accompanied by an open-source code repository that allows readers to explore neural networks using the dataset provided (which is used in the paper) or to adapt the code as a template for alternate datasets.
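    The simplest method the paper covers, the perceptron, reduces to a weighted sum of inputs passed through a sigmoid. As a minimal sketch (the feature values and weights below are hypothetical stand-ins, not taken from the paper's dataset or repository):

```python
import numpy as np

def perceptron(x, w, b):
    """A single perceptron: a weighted sum of the inputs squashed by a
    sigmoid, giving a probability-like output in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

# Hypothetical inputs: two satellite-derived features for one scene.
features = np.array([0.8, -1.2])
weights = np.array([1.5, -0.7])   # learned during training; fixed here
bias = -0.3
p_thunderstorm = perceptron(features, weights, bias)
```

    Stacking layers of such units (with nonlinear activations between them) yields the artificial neural networks, CNNs, and U-networks the paper builds up to.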

     
  4. Abstract

    The earth system is exceedingly complex and often chaotic in nature, making prediction incredibly challenging: we cannot expect to make perfect predictions all of the time. Instead, we look for specific states of the system that lead to more predictable behavior than others, often termed “forecasts of opportunity.” When these opportunities are not present, scientists need prediction systems that are capable of saying “I don't know.” We introduce a novel loss function, termed the “NotWrong loss,” that allows neural networks to identify forecasts of opportunity for classification problems. The NotWrong loss introduces an abstention class that allows the network to identify the more confident samples and abstain (say “I don't know”) on the less confident samples. The abstention loss is designed to abstain on a user‐defined fraction of the samples via a standard adaptive controller. Unlike many machine learning methods used to reject samples post‐training, the NotWrong loss is applied during training to preferentially learn from the more confident samples. We show that the NotWrong loss outperforms other existing loss functions for multiple climate use cases. The implementation of the proposed loss function is straightforward in most network architectures designed for classification as it only requires the addition of an abstention class to the output layer and modification of the loss function.
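    The mechanics of an abstention class can be sketched in a few lines of numpy. The exact NotWrong formulation is given in the paper; the version below is a generic DAC-style abstention loss (an extra output class, a confidence-weighted cross-entropy term, and an adaptive penalty), intended only to illustrate the idea, and the controller gain `k` and target fraction are made-up values:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def abstention_loss(logits, y_true, alpha, eps=1e-12):
    """Cross entropy over K real classes plus an abstention class
    (last column).  Abstaining shrinks the classification term but
    pays a penalty scaled by alpha."""
    p = softmax(logits)
    p_abs = p[:, -1]
    p_cls = p[np.arange(len(y_true)), y_true]
    class_term = (1.0 - p_abs) * -np.log(p_cls / (1.0 - p_abs + eps) + eps)
    abstain_term = alpha * -np.log(1.0 - p_abs + eps)
    return np.mean(class_term + abstain_term)

def update_alpha(alpha, abstained_frac, target_frac, k=0.5):
    """Adaptive controller: raise the abstention penalty when the
    network abstains more often than the user-defined fraction."""
    return max(0.0, alpha + k * (abstained_frac - target_frac))
```

    With a moderate `alpha`, abstaining on a hard sample costs less than a confident wrong answer, which is what drives the network to learn preferentially from its more confident samples.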

     
  5. Abstract

    The earth system is exceedingly complex and often chaotic in nature, making prediction incredibly challenging: we cannot expect to make perfect predictions all of the time. Instead, we look for specific states of the system that lead to more predictable behavior than others, often termed “forecasts of opportunity.” When these opportunities are not present, scientists need prediction systems that are capable of saying “I don't know.” We introduce a novel loss function, termed “abstention loss,” that allows neural networks to identify forecasts of opportunity for regression problems. The abstention loss works by incorporating uncertainty in the network's prediction to identify the more confident samples and abstain (say “I don't know”) on the less confident samples. The abstention loss is designed to determine the optimal abstention fraction, or abstain on a user‐defined fraction using a standard adaptive controller. Unlike many methods for attaching uncertainty to neural network predictions post‐training, the abstention loss is applied during training to preferentially learn from the more confident samples. The abstention loss is built upon nonlinear heteroscedastic regression, a standard computer science method. While nonlinear heteroscedastic regression is a simple yet powerful tool for incorporating uncertainty in regression problems, we demonstrate that the abstention loss outperforms it for the synthetic climate use cases explored here. The implementation of the proposed abstention loss is straightforward in most network architectures designed for regression, as it only requires modification of the output layer and loss function.
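    The nonlinear heteroscedastic base that the abstention loss builds on is compact enough to sketch. The network predicts a mean and a log-standard-deviation per sample, and the Gaussian negative log-likelihood automatically down-weights samples the network declares uncertain. This is only the standard building block, not the full abstention loss from the paper:

```python
import numpy as np

def heteroscedastic_nll(mu, log_sigma, y):
    """Gaussian negative log-likelihood with per-sample predicted
    spread: a large predicted sigma damps the squared-error term but
    pays a log(sigma) penalty, so the network cannot inflate its
    uncertainty for free."""
    sigma2 = np.exp(2.0 * log_sigma)
    return 0.5 * (np.log(2.0 * np.pi * sigma2) + (y - mu) ** 2 / sigma2)
```

    Samples with low predicted sigma dominate the gradient, which is the "preferentially learn from the more confident samples" behavior the abstention loss extends.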

     
  6. Abstract

    Predicting the timing and location of thunderstorms (“convection”) allows for preventive actions that can save both lives and property. We have applied U-nets, a deep-learning-based type of neural network, to forecast convection on a grid at lead times up to 120 min. The goal is to make skillful forecasts with only present and past satellite data as predictors. Specifically, predictors are multispectral brightness-temperature images from the Himawari-8 satellite, while targets (ground truth) are provided by weather radars in Taiwan. U-nets are becoming popular in atmospheric science due to their advantages for gridded prediction. Furthermore, we use three novel approaches to advance U-nets in atmospheric science. First, we compare three architectures—vanilla, temporal, and U-net++—and find that vanilla U-nets are best for this task. Second, we train U-nets with the fractions skill score, which is spatially aware, as the loss function. Third, because we do not have adequate ground truth over the full Himawari-8 domain, we train the U-nets with small radar-centered patches, then apply trained U-nets to the full domain. Also, we find that the best predictions are given by U-nets trained with satellite data from multiple lag times, not only the present. We evaluate U-nets in detail—by time of day, month, and geographic location—and compare them to persistence models. The U-nets outperform persistence at lead times ≥ 60 min, and at all lead times the U-nets provide a more realistic climatology than persistence. Our code is available publicly.
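    The fractions skill score compares the fraction of "event" pixels in a neighborhood around each grid point between forecast and observation, rewarding near-misses that pixelwise scores penalize. The paper uses a differentiable FSS loss inside U-net training; the sketch below is a minimal plain-numpy reference (zero-padded box neighborhoods, `1 - FSS` so that 0 is perfect), for intuition only:

```python
import numpy as np

def neighborhood_fractions(field, n):
    """Mean of `field` over an n-by-n window centered on each pixel
    (edges handled by zero padding)."""
    pad = n // 2
    f = np.pad(field, pad)
    out = np.zeros_like(field, dtype=float)
    for i in range(field.shape[0]):
        for j in range(field.shape[1]):
            out[i, j] = f[i:i + n, j:j + n].mean()
    return out

def fss_loss(forecast_probs, observed_mask, n=3):
    """1 - FSS: mean squared difference of neighborhood event
    fractions, normalized by a no-skill reference, so a perfect
    forecast scores 0 and a null forecast scores near 1."""
    pf = neighborhood_fractions(forecast_probs, n)
    po = neighborhood_fractions(observed_mask.astype(float), n)
    mse = np.mean((pf - po) ** 2)
    ref = np.mean(pf ** 2) + np.mean(po ** 2)
    return mse / (ref + 1e-12)
```

    Because fractions overlap for nearby events, a forecast displaced by one pixel scores much better under this loss than an all-zeros forecast, which is the "spatially aware" property the abstract refers to.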

     
  7. Abstract

    While convective storm mode is explicitly depicted in convection-allowing model (CAM) output, subjectively diagnosing mode in large volumes of CAM forecasts can be burdensome. In this work, four machine learning (ML) models were trained to probabilistically classify CAM storms into one of three modes: supercells, quasi-linear convective systems, and disorganized convection. The four ML models included a dense neural network (DNN), logistic regression (LR), a convolutional neural network (CNN), and a semi-supervised CNN-Gaussian mixture model (GMM). The DNN, CNN, and LR were trained with a set of hand-labeled CAM storms, while the semi-supervised GMM used updraft helicity and storm size to generate clusters, which were then hand labeled. When evaluated using storms withheld from training, the four classifiers had similar ability to discriminate between modes, but the GMM had worse calibration. The DNN and LR had objective performance similar to the CNN, suggesting that CNN-based methods may not be needed for mode classification tasks. The mode classifications from all four classifiers successfully approximated the known climatology of modes in the U.S., including a maximum in supercell occurrence in the U.S. Central Plains. Further, the modes also occurred in environments recognized to support the three different storm morphologies. Finally, storm mode provided useful information about hazard type (e.g., storm reports were most likely with supercells), further supporting the efficacy of the classifiers. Future applications, including the use of objective CAM mode classifications as a novel predictor in ML systems, could potentially lead to improved forecasts of convective hazards.
    Free, publicly-accessible full text available May 5, 2024
  8. Abstract

    Many of our generation’s most pressing environmental science problems are wicked problems, which means they cannot be cleanly isolated and solved with a single ‘correct’ answer (e.g., Rittel 1973; Wirz 2021). The NSF AI Institute for Research on Trustworthy AI in Weather, Climate, and Coastal Oceanography (AI2ES) seeks to address such problems by developing synergistic approaches with a team of scientists from three disciplines: environmental science (including atmospheric, ocean, and other physical sciences), AI, and social science, including risk communication. As part of our work, we developed a novel approach to summer school, held from June 27-30, 2022. The goal of this summer school was to teach a new generation of environmental scientists how to cross disciplines and develop approaches that integrate all three disciplinary perspectives in order to solve environmental science problems. In addition to a lecture series focused on the synthesis of AI, environmental science, and risk communication, this year’s summer school included a unique Trust-a-thon component, where participants gained hands-on experience applying both risk communication and explainable AI techniques to pre-trained ML models. We had 677 participants from 63 countries register and attend online. Lecture topics included trust and trustworthiness (Day 1), explainability and interpretability (Day 2), data and workflows (Day 3), and uncertainty quantification (Day 4). For the Trust-a-thon, we developed challenge problems for three different application domains: (1) severe storms, (2) tropical cyclones, and (3) space weather. Each domain had an associated user persona to guide user-centered development.
    Free, publicly-accessible full text available April 14, 2024
  9. Mechanisms that generate subseasonal (1-2 months) events of sea level rise along the western Gulf Coast are investigated using data collected by two dense tide gauge networks, the Texas Coastal Ocean Observation Network (TCOON) and the National Water Level Observation Network (NWLON), along with satellite altimetry and a high-resolution (0.08°) ocean reanalysis product. In particular, the role of the Loop Current and eddy shedding in generating extreme sea level rise along the coast is emphasized. Time series of sea level anomalies along the western portion of the Gulf Coast derived from the TCOON and NWLON tide gauge data indicate that a subseasonal sea level rise exceeding 15 cm is observed once every 2-5 years. Based on the analysis of the satellite altimetry data and the high-resolution ocean reanalysis product, it is found that most such extreme subseasonal events originate from an anti-cyclonic (warm-core) eddy that separates from the Loop Current and propagates westward. A prominent sea level rise is generated when the eddy reaches the western Gulf Coast, about 6-8 months after the formation of a strong anti-cyclonic eddy in the central Gulf of Mexico. The results demonstrate that accurate prediction of subseasonal sea level rise events along the Gulf Coast with a lead time of several months requires a full description of large-scale ocean dynamical processes in the entire Gulf of Mexico, including the characteristics of eddies separated from the Loop Current.
    Free, publicly-accessible full text available January 9, 2024
  10. Abstract

    A simple method for adding uncertainty to neural network regression tasks in earth science via estimation of a general probability distribution is described. Specifically, we highlight the sinh-arcsinh-normal distribution as particularly well suited for neural network uncertainty estimation. The methodology supports estimation of heteroscedastic, asymmetric uncertainties through a simple modification of the network output and loss function. Method performance is demonstrated by predicting tropical cyclone intensity forecast uncertainty and by comparison against two other common methods for neural network uncertainty quantification (i.e., Bayesian neural networks and Monte Carlo dropout). The simple approach described here is intuitive and applicable when no prior exists and one simply wishes to parameterize the output and its uncertainty according to some previously defined family of distributions. The authors believe it will become a powerful, go-to method moving forward.
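    The distribution at the heart of the method has a closed-form density, so the loss is a short expression. The sketch below follows the Jones and Pewsey (2009) sinh-arcsinh parameterization; the parameter names are mine, and in the paper's setting `loc`, `scale`, `skew`, and `tail` would each be network outputs rather than fixed values:

```python
import numpy as np

def shash_nll(x, loc, scale, skew, tail):
    """Negative log-likelihood of the sinh-arcsinh-normal
    distribution.  skew = 0 and tail = 1 recover the Gaussian, so
    the loss generalizes ordinary heteroscedastic regression."""
    z = (x - loc) / scale
    s = np.sinh(tail * np.arcsinh(z) - skew)
    log_pdf = (np.log(tail) + 0.5 * np.log1p(s * s)
               - np.log(scale) - 0.5 * np.log(2 * np.pi)
               - 0.5 * np.log1p(z * z) - 0.5 * s * s)
    return -log_pdf
```

    Training a network to minimize this loss over its four output heads yields the heteroscedastic, asymmetric predictive distributions described in the abstract.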
    Free, publicly-accessible full text available January 1, 2024