Abstract
We provide a novel characterization of augmented balancing weights, also known as automatic debiased machine learning. These popular doubly robust estimators combine outcome modelling with balancing weights—weights that achieve covariate balance directly instead of estimating and inverting the propensity score. When the outcome and weighting models are both linear in some (possibly infinite) basis, we show that the augmented estimator is equivalent to a single linear model with coefficients that combine those of the original outcome model with those from unpenalized ordinary least squares (OLS). Under certain choices of regularization parameters, the augmented estimator in fact collapses to the OLS estimator alone. We then extend these results to specific outcome and weighting models. We first show that the augmented estimator that uses (kernel) ridge regression for both outcome and weighting models is equivalent to a single, undersmoothed (kernel) ridge regression—implying a novel analysis of undersmoothing. When the weighting model is instead lasso-penalized, we demonstrate a familiar 'double selection' property. Our framework opens the black box on this increasingly popular class of estimators, bridges the gap between existing results on the semiparametric efficiency of undersmoothed and doubly robust estimators, and provides new insights into the performance of augmented balancing weights.
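The collapse-to-OLS claim can be checked numerically. The sketch below (not from the paper; all variable names and the target covariate profile are illustrative assumptions) forms an augmented estimate of a linear functional x_target'β: a ridge outcome-model prediction plus a weighted residual correction, where the weights are the minimum-norm weights that exactly balance the covariates toward x_target. Because exact balance cancels the outcome-model term, the augmented estimate coincides with the unpenalized OLS plug-in, regardless of the ridge penalty.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(size=n)
x_target = np.array([0.3, -0.1, 0.7])  # hypothetical target covariate profile

# Ridge outcome model (penalty chosen arbitrarily for illustration)
lam = 5.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Minimum-norm weights achieving exact balance: sum_i w_i * x_i = x_target
w = X @ np.linalg.solve(X.T @ X, x_target)

# Augmented estimate: outcome-model prediction plus weighted residual correction
theta_aug = x_target @ beta_ridge + w @ (y - X @ beta_ridge)

# Unpenalized OLS plug-in for comparison
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)
theta_ols = x_target @ beta_ols

print(np.isclose(theta_aug, theta_ols))  # exact balance makes the two coincide
```

Algebraically, exact balance gives w'X = x_target', so the ridge terms cancel: theta_aug = x_target'β_ridge + w'y − x_target'β_ridge = w'y = x_target'(X'X)⁻¹X'y, the OLS plug-in. With penalized (approximate-balance) weights, the cancellation is only partial, which is where the coefficient-combination characterization in the abstract applies.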
ALISTA: analytic weights are as good as learned weights in LISTA
- Award ID(s): 1720237
- PAR ID: 10191388
- Date Published:
- Journal Name: International Conference on Learning Representations (ICLR)
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation