Gaussian processes (GPs) provide flexible distributions over functions, with inductive biases controlled by a kernel. However, in many applications GPs can struggle with even moderate input dimensionality. Learning a low-dimensional projection can help alleviate this curse of dimensionality, but introduces many trainable hyperparameters, which can be cumbersome, especially in the small-data regime. We use additive sums of kernels for GP regression, where each kernel operates on a different random projection of its inputs. Surprisingly, we find that as the number of random projections increases, the predictive performance of this approach quickly converges to that of a kernel operating on the original full-dimensional inputs, over a wide range of data sets, even if we are projecting into a single dimension. As a consequence, many problems can, remarkably, be reduced to one-dimensional input spaces without learning a transformation. We prove this convergence and its rate, and additionally propose a deterministic approach that converges more quickly than purely random projections. Moreover, we demonstrate that our approach can achieve faster inference and improved predictive accuracy for high-dimensional inputs compared to kernels in the original input space.
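The additive random-projection construction described above can be sketched in a few lines: sum one-dimensional RBF kernels, each applied to a different random unit-norm projection of the inputs, and use the resulting kernel for GP regression. This is a minimal illustration of the idea under assumed names and an assumed averaging constant, not the paper's implementation.

```python
import numpy as np

def rbf(a, b, lengthscale=1.0):
    """Squared-exponential kernel between two sets of 1-D projected inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def additive_projection_kernel(X, Z, W):
    """Average of RBF kernels, each acting on a different 1-D projection.

    W has shape (J, d): J random unit-norm projection directions.
    Names and normalization are illustrative assumptions.
    """
    K = np.zeros((X.shape[0], Z.shape[0]))
    for w in W:
        K += rbf(X @ w, Z @ w)
    return K / len(W)

rng = np.random.default_rng(0)
d, J, n = 10, 20, 50
W = rng.normal(size=(J, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)   # unit-norm directions

X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# GP posterior mean at the training points (noise variance 0.1)
K = additive_projection_kernel(X, X, W)
alpha = np.linalg.solve(K + 0.1 * np.eye(n), y)
pred = K @ alpha
```

Each summand is a valid kernel on a one-dimensional input space, so the sum is itself a valid (positive semidefinite) kernel on the original space.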
Local Projection Inference Is Simpler and More Robust Than You Think
Applied macroeconomists often compute confidence intervals for impulse responses using local projections, that is, direct linear regressions of future outcomes on current covariates. This paper proves that local projection inference robustly handles two issues that commonly arise in applications: highly persistent data and the estimation of impulse responses at long horizons. We consider local projections that control for lags of the variables in the regression. We show that lag‐augmented local projections with normal critical values are asymptotically valid uniformly over (i) both stationary and non‐stationary data, and also over (ii) a wide range of response horizons. Moreover, lag augmentation obviates the need to correct standard errors for serial correlation in the regression residuals. Hence, local projection inference is arguably both simpler than previously thought and more robust than standard autoregressive inference, whose validity is known to depend sensitively on the persistence of the data and on the length of the horizon.
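A lag-augmented local projection can be illustrated in the simplest univariate case: regress y_{t+h} on y_t plus a lag, and read the horizon-h impulse response off the coefficient on y_t. In an AR(1) with coefficient rho, that response is rho^h. The function name and the univariate setup below are illustrative assumptions; the paper treats general multivariate regressions and the accompanying inference theory.

```python
import numpy as np

def lag_augmented_lp(y, h, p=1):
    """Estimate the horizon-h impulse response of y to its own innovation
    via a lag-augmented local projection: regress y_{t+h} on y_t plus
    p lags of y. A univariate sketch, not the paper's general estimator."""
    T = len(y)
    rows, targets = [], []
    for t in range(p, T - h):
        rows.append([1.0, y[t]] + [y[t - j] for j in range(1, p + 1)])
        targets.append(y[t + h])
    Xmat = np.array(rows)
    yvec = np.array(targets)
    beta, *_ = np.linalg.lstsq(Xmat, yvec, rcond=None)
    return beta[1]  # coefficient on y_t = estimated impulse response

# Simulate an AR(1) with rho = 0.9; the true horizon-h response is 0.9**h
rng = np.random.default_rng(1)
rho, T = 0.9, 5000
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + rng.normal()

irf2 = lag_augmented_lp(y, h=2)  # should be close to 0.81
```

The lag augmentation (controlling for y_{t-1}, ..., y_{t-p}) is what lets standard normal critical values remain valid even for persistent data, per the abstract.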
- Award ID(s): 1851665
- PAR ID: 10281004
- Date Published:
- Journal Name: Econometrica
- Volume: 89
- Issue: 4
- ISSN: 0012-9682
- Page Range / eLocation ID: 1789 to 1823
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
We prove that local projections (LPs) and Vector Autoregressions (VARs) estimate the same impulse responses. This nonparametric result only requires unrestricted lag structures. We discuss several implications: (i) LP and VAR estimators are not conceptually separate procedures; instead, they are simply two dimension reduction techniques with common estimand but different finite‐sample properties. (ii) VAR‐based structural identification—including short‐run, long‐run, or sign restrictions—can equivalently be performed using LPs, and vice versa. (iii) Structural estimation with an instrument (proxy) can be carried out by ordering the instrument first in a recursive VAR, even under noninvertibility. (iv) Linear VARs are as robust to nonlinearities as linear LPs.
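The LP/VAR equivalence can be checked numerically in the simplest case, an AR(1), where a single lag already gives an unrestricted lag structure: the VAR-implied horizon-h response is rho_hat**h, and the direct LP regression of y_{t+h} on y_t estimates the same population quantity, rho**h. This simulation is a sketch under those simplifying assumptions, not the paper's general proof.

```python
import numpy as np

rng = np.random.default_rng(2)
rho, T, h = 0.8, 10000, 3
y = np.zeros(T)
for t in range(1, T):
    y[t] = rho * y[t - 1] + rng.normal()

# "VAR" route: fit an AR(1) by OLS, then iterate it h steps forward.
rho_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
irf_var = rho_hat ** h

# LP route: direct regression of y_{t+h} on y_t.
irf_lp = np.sum(y[h:] * y[:-h]) / np.sum(y[:-h] ** 2)

# Both estimate rho**h = 0.512; they agree up to sampling noise.
```

In finite samples the two estimators differ (different finite-sample properties, same estimand), which is exactly the abstract's point (i).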
Despite widespread use of radio-echo sounding (RES) in glaciology and broad distribution of processed radar products, the glaciological community has no standard software for processing impulse RES data. Dependable, fast and collection-system/platform-independent processing flows could facilitate comparison between datasets and allow full utilization of large impulse RES data archives and new data. Here, we present ImpDAR, an open-source, cross-platform, impulse radar processor and interpreter, written primarily in Python. The utility of this software lies in its collection of established tools into a single, open-source framework. ImpDAR aims to provide a versatile standard that is accessible to radar-processing novices and useful to specialists. It can read data from common commercial ground-penetrating radars (GPRs) and some custom-built RES systems. It performs all the standard processing steps, including bandpass and horizontal filtering, time correction for antenna spacing, geolocation and migration. After processing data, ImpDAR's interpreter includes several plotting functions, digitization of reflecting horizons, calculation of reflector strength and export of interpreted layers. We demonstrate these capabilities on two datasets: deep (~3000 m depth) data collected with a custom (3 MHz) system in northeast Greenland and shallow (<100 m depth, 500 MHz) data collected with a commercial GPR on South Cascade Glacier in Washington.
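ImpDAR's own API is not reproduced here; as a generic illustration of one processing step the abstract lists (bandpass filtering a radar trace), the sketch below applies a crude brick-wall FFT filter in NumPy. Real processors such as ImpDAR use proper filter designs, and the sampling rate and frequencies here are made-up values.

```python
import numpy as np

def bandpass_trace(trace, fs, f_lo, f_hi):
    """Brick-wall FFT bandpass: zero spectral bins outside [f_lo, f_hi].
    A crude illustration of the bandpass step, not ImpDAR's implementation."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=trace.size)

fs = 100e6                      # 100 MHz sampling rate (hypothetical)
t = np.arange(2048) / fs
rng = np.random.default_rng(3)
# A 3 MHz reflection buried in low-frequency drift and broadband noise
trace = (np.sin(2 * np.pi * 3e6 * t)
         + 2.0 * np.sin(2 * np.pi * 0.2e6 * t)
         + 0.5 * rng.normal(size=t.size))
filtered = bandpass_trace(trace, fs, 1e6, 5e6)  # keep the 1-5 MHz band
```

The filter removes both the 0.2 MHz drift and most of the broadband noise while passing the 3 MHz component, so the filtered trace has noticeably smaller amplitude than the raw one.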
When multiple measures are collected repeatedly over time, redundancy typically exists among responses. The envelope method was recently proposed to reduce the dimension of responses without loss of information in regression with multivariate responses. It can gain substantial efficiency over the standard least squares estimator. In this paper, we generalize the envelope method to mixed effects models for longitudinal data with possibly unbalanced design and time‐varying predictors. We show that our model provides more efficient estimators than the standard estimators in mixed effects models. Improved accuracy and efficiency of the proposed method over the standard mixed effects model estimator are observed in both the simulations and the Action to Control Cardiovascular Risk in Diabetes (ACCORD) study.
Projection-free conditional gradient (CG) methods are the algorithms of choice for constrained optimization setups in which projections are often computationally prohibitive but linear optimization over the constraint set remains computationally feasible. Unlike in projection-based methods, globally accelerated convergence rates are in general unattainable for CG. However, a very recent work on Locally accelerated CG (LaCG) has demonstrated that local acceleration for CG is possible for many settings of interest. The main downside of LaCG is that it requires knowledge of the smoothness and strong convexity parameters of the objective function. We remove this limitation by introducing a novel, Parameter-Free Locally accelerated CG (PF-LaCG) algorithm, for which we provide rigorous convergence guarantees. Our theoretical results are complemented by numerical experiments, which demonstrate local acceleration and showcase the practical improvements of PF-LaCG over non-accelerated algorithms, both in terms of iteration count and wall-clock time.
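PF-LaCG itself is not reconstructed here; the sketch below shows the non-accelerated baseline it improves on, plain Frank-Wolfe conditional gradient on the probability simplex, where the linear minimization oracle is a single coordinate argmin rather than a projection. Function names and the test problem are illustrative assumptions.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, n_iters=200):
    """Plain projection-free conditional gradient (Frank-Wolfe) on the
    probability simplex. Linear minimization over the simplex reduces to
    picking the vertex (coordinate) with the smallest gradient entry.
    This is the non-accelerated baseline, not PF-LaCG itself."""
    x = x0.copy()
    for k in range(n_iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0       # LMO: vertex minimizing <g, s>
        gamma = 2.0 / (k + 2.0)     # standard open-loop step size
        x = (1 - gamma) * x + gamma * s
    return x

# Hypothetical test problem: minimize ||x - c||^2 over the simplex,
# with c inside the simplex, so the optimum is x* = c.
c = np.array([0.2, 0.5, 0.3])
x0 = np.array([1.0, 0.0, 0.0])
x_hat = frank_wolfe_simplex(lambda x: 2 * (x - c), x0)
```

Every iterate is a convex combination of simplex vertices, so feasibility is maintained without any projection; that is the property that makes CG attractive when projections are expensive.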