Search for: All records; Creators/Authors contains: "Campbell, Trevor"


  1. Free, publicly-accessible full text available August 1, 2024
  2. Abstract

    Severe convection occurring in high-shear, low-CAPE (HSLC) environments is a common cool-season threat in the southeastern United States. Previous studies of HSLC convection document the increased operational challenges that these environments present compared to their high-CAPE counterparts, corresponding to higher false-alarm ratios and lower probability of detection for severe watches and warnings. These environments can exhibit rapid destabilization in the hours prior to convection, sometimes associated with the release of potential instability. Here, we use self-organizing maps (SOMs) to objectively identify environmental patterns accompanying HSLC cool-season severe events and associate them with variations in severe weather frequency and distribution. Large-scale patterns exhibit modest variation within the HSLC subclass, featuring strong surface cyclones accompanied by vigorous upper-tropospheric troughs and northward-extending regions of instability, consistent with prior studies. In most patterns, severe weather occurs immediately ahead of a cold front. Other convective ingredients, such as lower-tropospheric vertical wind shear, near-surface equivalent potential temperature (θe) advection, and the release of potential instability, varied more significantly across patterns. No single variable used to train SOMs consistently demonstrated differences in the distribution of severe weather occurrence across patterns. Comparison of SOMs based on upper and lower quartiles of severe occurrence demonstrated that the release of potential instability was most consistently associated with higher-impact events in comparison to other convective ingredients. Overall, we find that previously developed HSLC composite parameters reasonably identify high-impact HSLC events.

    Significance Statement

    Even when atmospheric instability is not optimal for severe convective storms, in some situations they can still occur, presenting increased challenges to forecasters. These marginal environments may occur at night or during the cool season, when people are less attuned to severe weather threats. Here, we use a sorting algorithm to classify different weather patterns accompanying such storms, and we distinguish which specific patterns and weather system features are most strongly associated with severe storms. Our goals are to increase situational awareness for forecasters and to improve understanding of the processes leading to severe convection in marginal environments.

     
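The pattern classification described in the record above relies on self-organizing maps (SOMs). As background, a minimal SOM can be sketched in plain NumPy; the grid size, decay schedules, and two-cluster demo data here are illustrative assumptions, not the study's configuration:

```python
import numpy as np

def train_som(data, grid=(3, 3), epochs=10, lr0=0.5, sigma0=1.5, seed=0):
    """Train a toy self-organizing map on rows of `data` (n_samples, n_features)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.normal(size=(h, w, data.shape[1]))
    # Grid coordinates, used to compute neighborhood distances on the map
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Linearly decay the learning rate and neighborhood width
            frac = step / n_steps
            lr = lr0 * (1.0 - frac)
            sigma = sigma0 * (1.0 - frac) + 1e-3
            # Best-matching unit: the node whose weight vector is closest to x
            d = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(d), (h, w))
            # Gaussian neighborhood pulls the BMU and nearby nodes toward x
            g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1) / (2.0 * sigma**2))
            weights += lr * g[..., None] * (x - weights)
            step += 1
    return weights

def map_sample(weights, x):
    """Return the grid coordinates of the best-matching unit for sample x."""
    d = np.linalg.norm(weights - x, axis=-1)
    return np.unravel_index(np.argmin(d), weights.shape[:2])
```

After training, each input environment is assigned to the node whose weight vector it is closest to, so similar samples cluster on nearby nodes of the map.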
  3. Chiappa, Silvia; Calandra, Roberto (Eds.)
  4. Kernel methods offer the flexibility to learn complex relationships in modern, large data sets while enjoying strong theoretical guarantees on quality. Unfortunately, these methods typically require cubic running time in the data set size, a prohibitive cost in the large-data setting. Random feature maps (RFMs) and the Nyström method both consider low-rank approximations to the kernel matrix as a potential solution. But, in order to achieve desirable theoretical guarantees, the former may require a prohibitively large number of features J₊, and the latter may be prohibitively expensive for high-dimensional problems. We propose to combine the simplicity and generality of RFMs with a data-dependent feature selection scheme to achieve the desirable theoretical approximation properties of Nyström with just O(log J₊) features. Our key insight is to begin with a large set of random features, then reduce them to a small number of weighted features in a data-dependent, computationally efficient way, while preserving the statistical guarantees of using the original large set of features. We demonstrate the efficacy of our method with theory and experiments, including on a data set with over 50 million observations. In particular, we show that our method achieves small kernel matrix approximation error and better test set accuracy with provably fewer random features than state-of-the-art methods.
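The record above builds on random feature maps. As background, the standard random Fourier feature approximation of the RBF kernel (the classic Rahimi–Recht construction, not the data-dependent compression scheme the paper proposes; sizes and hyperparameters below are arbitrary) can be sketched as:

```python
import numpy as np

def rff_features(X, n_features, lengthscale=1.0, seed=0):
    """Random Fourier features z(x) such that z(x) @ z(y) approximates
    the RBF kernel exp(-||x - y||^2 / (2 * lengthscale^2))."""
    rng = np.random.default_rng(seed)
    W = rng.normal(scale=1.0 / lengthscale, size=(X.shape[1], n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Entrywise error of the low-rank approximation Z @ Z.T vs. the exact kernel matrix
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 3))
K_exact = np.exp(-((X[:, None] - X[None]) ** 2).sum(-1) / 2.0)
Z = rff_features(X, n_features=2000)
err = np.abs(Z @ Z.T - K_exact).max()
```

The approximation error shrinks like O(1/sqrt(J)) in the number of features J, which is why a naive RFM may need many features; the paper's contribution is to compress such a large feature set down to far fewer weighted features while keeping the guarantees.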
  5. Gaussian processes (GPs) offer a flexible class of priors for nonparametric Bayesian regression, but popular GP posterior inference methods are typically prohibitively slow or lack desirable finite-data guarantees on quality. We develop a scalable approach to approximate GP regression, with finite-data guarantees on the accuracy of our pointwise posterior mean and variance estimates. Our main contribution is a novel objective for approximate inference in the nonparametric setting: the preconditioned Fisher (pF) divergence. We show that unlike the Kullback–Leibler divergence (used in variational inference), the pF divergence bounds the 2-Wasserstein distance, which in turn provides tight bounds on the pointwise error of mean and variance estimates. We demonstrate that, for sparse GP likelihood approximations, we can minimize the pF divergence efficiently. Our experiments show that optimizing the pF divergence has the same computational requirements as variational sparse GPs while providing comparable empirical performance, in addition to our novel finite-data quality guarantees.
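The pointwise posterior mean and variance that the guarantees above target can be illustrated with textbook exact (non-sparse) GP regression under an RBF kernel. This is a background sketch with assumed hyperparameters, not the paper's pF-divergence method:

```python
import numpy as np

def gp_posterior(X, y, X_star, lengthscale=1.0, noise=0.1):
    """Pointwise posterior mean and variance of a zero-mean GP
    with a unit-variance RBF kernel and Gaussian observation noise."""
    def k(A, B):
        sq = ((A[:, None] - B[None]) ** 2).sum(-1)
        return np.exp(-sq / (2.0 * lengthscale**2))
    K = k(X, X) + noise**2 * np.eye(len(X))   # noisy training covariance
    K_s = k(X_star, X)                        # test/train cross-covariance
    L = np.linalg.cholesky(K)
    # alpha = (K + noise^2 I)^{-1} y via two triangular solves
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_s @ alpha
    v = np.linalg.solve(L, K_s.T)
    var = 1.0 - (v**2).sum(axis=0)            # prior variance 1 minus explained variance
    return mean, var
```

The Cholesky factorization here costs O(n³), which is exactly the scaling bottleneck that sparse approximations, with the finite-data error bounds described above, are designed to avoid.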