Summary: Instrumental variable methods can identify causal effects even when the treatment and outcome are confounded. We study the problem of imperfect measurements of the binary instrumental variable, treatment, and outcome. We first consider nondifferential measurement errors, that is, the mismeasured variable does not depend on other variables given its true value. We show that measurement error of the instrumental variable does not bias the estimate, that measurement error of the treatment biases the estimate away from zero, and that measurement error of the outcome biases the estimate toward zero. Moreover, we derive sharp bounds on the causal effects without additional assumptions. These bounds are informative because they exclude zero. We then consider differential measurement errors, and focus on sensitivity analyses in those settings.
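For intuition, the three nondifferential-error facts above can be reproduced in a small simulation (an illustrative data-generating process of our own, not the paper's model): flipping the instrument leaves the Wald/IV estimate unbiased, flipping the treatment inflates it, and flipping the outcome attenuates it toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative binary DGP (ours, not the paper's): instrument Z,
# unobserved confounder U, treatment D, outcome Y.
Z = rng.binomial(1, 0.5, n)
U = rng.binomial(1, 0.5, n)
D = rng.binomial(1, 0.1 + 0.5 * Z + 0.3 * U)
Y = rng.binomial(1, 0.2 + 0.4 * D + 0.2 * U)

def wald(z, d, y):
    """Wald / IV estimate: Cov(z, y) / Cov(z, d)."""
    return np.cov(z, y)[0, 1] / np.cov(z, d)[0, 1]

def flip(x, p):
    """Nondifferential misclassification: flip x with probability p,
    independently of everything else."""
    return np.where(rng.random(len(x)) < p, 1 - x, x)

est = wald(Z, D, Y)               # approx. 0.4 under this DGP
est_z = wald(flip(Z, 0.1), D, Y)  # instrument error: no bias
est_d = wald(Z, flip(D, 0.1), Y)  # treatment error: biased away from zero
est_y = wald(Z, D, flip(Y, 0.1))  # outcome error: biased toward zero
```

With flip probability p, both covariances involving the mismeasured variable shrink by the factor (1 - 2p), which is why instrument error cancels in the ratio while treatment error shrinks only the denominator and outcome error only the numerator.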
Partial Identifiability in Discrete Data with Measurement Error
When data contains measurement errors, it is necessary to make modeling assumptions relating the error-prone measurements to the unobserved true values. Work on measurement error has largely focused on models that fully identify the parameter of interest. As a result, many practically useful models that result in bounds on the target parameter -- known as partial identification -- have been neglected. In this work, we present a method for partial identification in a class of measurement error models involving discrete variables. We focus on models that impose linear constraints on the target parameter, allowing us to compute partial identification bounds using off-the-shelf LP solvers. We show how several common measurement error assumptions can be composed with an extended class of instrumental variable-type models to create such linear constraint sets. We further show how this approach can be used to bound causal parameters, such as the average treatment effect, when treatment or outcome variables are measured with error. Using data from the Oregon Health Insurance Experiment, we apply this method to estimate bounds on the effect Medicaid enrollment has on depression when depression is measured with error.
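To make the LP approach concrete, here is a minimal sketch (with hypothetical numbers, not the paper's model or the Oregon data) that bounds P(X*=1) for a binary true variable X* observed through an error-prone surrogate X, assuming only that the misclassification probability P(X != X* | X*) is at most some eps:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical inputs (not from the paper):
p_obs = np.array([0.3, 0.7])  # observed P(X=0), P(X=1) for error-prone X
eps = 0.1                     # assumed bound: P(X != X* | X*) <= eps

# Decision variables q[x_star, x] = P(X* = x_star, X = x), flattened as
# [q00, q01, q10, q11]. Target: theta = P(X* = 1) = q10 + q11.
c = np.array([0.0, 0.0, 1.0, 1.0])

# Equality constraints: marginals of X match the observed distribution.
A_eq = np.array([[1, 0, 1, 0],    # q00 + q10 = P(X = 0)
                 [0, 1, 0, 1]])   # q01 + q11 = P(X = 1)
b_eq = p_obs

# Misclassification bound, rewritten as linear inequalities:
# q01 <= eps * (q00 + q01)  and  q10 <= eps * (q10 + q11).
A_ub = np.array([[-eps, 1 - eps, 0.0, 0.0],
                 [0.0, 0.0, 1 - eps, -eps]])
b_ub = np.zeros(2)

bnds = [(0, 1)] * 4
lo = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bnds)
hi = linprog(-c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bnds)
print(f"P(X*=1) is partially identified in [{lo.fun:.4f}, {-hi.fun:.4f}]")
```

Because every assumption here is linear in the joint probabilities, richer assumption sets (instrumental variable-type restrictions, nondifferential error, monotonicity) compose simply by stacking additional rows into the constraint matrices.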
- Award ID(s):
- 1942239
- PAR ID:
- 10329246
- Editor(s):
- Cassio de Campos; Marloes H. Maathuis
- Date Published:
- Journal Name:
- Proceedings of the Thirty Seventh Conference on Uncertainty in Artificial Intelligence
- Volume:
- 161
- Page Range / eLocation ID:
- 1798-1808
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
Differential measurement error, which occurs when the error in the measured outcome is correlated with the treatment, renders the causal effect unidentifiable from observational data. In this work, we study conditional differential measurement error, where a subgroup of the population is known to be prone to differential measurement error. Under an assumption about the direction (but not magnitude) of the measurement error, we derive sharp bounds on the conditional average treatment effect, and present an approach to estimate them. We empirically validate our approach on semi-synthetic data, showing that it gives more credible and informative bounds than other approaches. In addition, we apply our approach to real data, showing its utility in guiding decisions about dietary modification interventions to improve nutritional intake.
In this paper, we study the geometry of quadratic covariance bounds on the estimation error covariance, in a properly defined Hilbert space of random variables. We show that a lower bound on the error covariance may be represented by the Grammian of the error score after projection onto the subspace spanned by the measurement scores. The Grammian is defined with respect to inner products in a Hilbert space of second order random variables. This geometric result holds for a large class of quadratic covariance bounds including the Barankin, Cramér-Rao, and Bhattacharyya bounds, where each bound is characterized by its corresponding measurement scores. When parameters consist of essential parameters and nuisance parameters, the Cramér-Rao covariance bound is the inverse of the Grammian of essential scores after projection onto the subspace orthogonal to the subspace spanned by the nuisance scores.In two examples, we show that for complex multivariate normal measurements with parameterized mean or covariance, there exist well-known Euclidean space geometries for the general Hilbert space geometry derived in this paper.
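In standard estimation-theoretic notation (our paraphrase, not the paper's exact symbols), with score vectors $s_e$ (essential) and $s_n$ (nuisance) and Grammian blocks $J_{ab} = \langle s_a, s_b \rangle = \mathbb{E}[s_a s_b^{\mathsf T}]$, the nuisance-parameter result for the Cramér-Rao case can be written as:

```latex
% Error covariance of any unbiased estimator of the essential parameters:
% project the essential scores onto the orthogonal complement of the span
% of the nuisance scores, then invert the resulting Grammian.
\Sigma_e \;\succeq\; \bigl( J_{ee} - J_{en} J_{nn}^{-1} J_{ne} \bigr)^{-1}
```

The Schur complement $J_{ee} - J_{en} J_{nn}^{-1} J_{ne}$ is precisely the Grammian of the essential scores after projection onto the subspace orthogonal to the nuisance scores.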
Shape restrictions have played a central role in economics as both testable implications of theory and sufficient conditions for obtaining informative counterfactual predictions. In this paper, we provide a general procedure for inference under shape restrictions in identified and partially identified models defined by conditional moment restrictions. Our test statistics and proposed inference methods are based on the minimum of the generalized method of moments (GMM) objective function with and without shape restrictions. Uniformly valid critical values are obtained through a bootstrap procedure that approximates a subset of the true local parameter space. In an empirical analysis of the effect of childbearing on female labor supply, we show that employing shape restrictions in linear instrumental variables (IV) models can lead to shorter confidence regions for both local and average treatment effects. Other applications we discuss include inference for the variability of quantile IV treatment effects and for bounds on average equivalent variation in a demand model with general heterogeneity.
Summary: We study quantile trend filtering, a recently proposed method for nonparametric quantile regression, with the goal of generalizing existing risk bounds for the usual trend-filtering estimators that perform mean regression. We study both the penalized and the constrained versions, of order $r \geqslant 1$, of univariate quantile trend filtering. Our results show that both the constrained and the penalized versions of order $r \geqslant 1$ attain the minimax rate up to logarithmic factors, when the $(r-1)$th discrete derivative of the true vector of quantiles belongs to the class of bounded-variation signals. Moreover, we show that if the true vector of quantiles is a discrete spline with a few polynomial pieces, then both versions attain a near-parametric rate of convergence. Corresponding results for the usual trend-filtering estimators are known to hold only when the errors are sub-Gaussian. In contrast, our risk bounds are shown to hold under minimal assumptions on the error variables. In particular, no moment assumptions are needed and our results hold under heavy-tailed errors. Our proof techniques are general, and thus can potentially be used to study other nonparametric quantile regression methods. To illustrate this generality, we employ our proof techniques to obtain new results for multivariate quantile total-variation denoising and high-dimensional quantile linear regression.
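For concreteness, penalized quantile trend filtering of order $r = 1$ (pinball loss plus an $\ell_1$ penalty on first differences) can be solved exactly as a linear program. The following is a minimal sketch with scipy, not the estimator implementation from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def quantile_trend_filter(y, tau=0.5, lam=1.0):
    """Order r = 1 penalized quantile trend filtering: minimize
    sum_i rho_tau(y_i - theta_i) + lam * sum_i |theta_{i+1} - theta_i|,
    cast as a linear program.
    """
    n, m = len(y), len(y) - 1
    # Variables: [theta (n), u+ (n), u- (n), t (m)], with
    # y - theta = u+ - u-  and  |D theta| <= t componentwise.
    c = np.concatenate([np.zeros(n), tau * np.ones(n),
                        (1 - tau) * np.ones(n), lam * np.ones(m)])
    A_eq = np.hstack([np.eye(n), np.eye(n), -np.eye(n), np.zeros((n, m))])
    b_eq = np.asarray(y, dtype=float)
    D = np.eye(m, n, k=1) - np.eye(m, n)  # first-difference operator
    A_ub = np.vstack([np.hstack([D, np.zeros((m, 2 * n)), -np.eye(m)]),
                      np.hstack([-D, np.zeros((m, 2 * n)), -np.eye(m)])])
    b_ub = np.zeros(2 * m)
    bnds = [(None, None)] * n + [(0, None)] * (2 * n + m)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bnds)
    return res.x[:n]

# Two noisy constant pieces: the fit recovers a piecewise-constant
# median trend with one jump.
y = np.array([0.0, 0.1, -0.1, 0.05, 5.0, 5.1, 4.9, 5.05])
theta = quantile_trend_filter(y, tau=0.5, lam=1.0)
```

Higher orders $r$ only change the penalty matrix to the $r$th discrete difference operator; the problem remains an LP.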