Title: An Empirical Likelihood Approach to Reduce Selection Bias in Voluntary Samples
How to construct pseudo-weights for voluntary samples is an important practical problem in survey sampling. The problem is particularly challenging when the sampling mechanism for the voluntary sample is allowed to be non-ignorable. Under the assumption that the sample participation model is correctly specified, a consistent estimator of the model parameter can be computed and used to construct a propensity score estimator of the population mean. We propose using the empirical likelihood method to construct the final weights for the voluntary sample by incorporating bias-calibration constraints and benchmarking constraints. Linearization variance estimation for the proposed method is developed. A toy example illustrates the idea and the computational details, and a limited simulation study evaluates the performance of the proposed methods.
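A common computational route for empirical likelihood calibration is to express the weights through a Lagrange multiplier and solve the resulting moment equations by Newton's method. The Python sketch below shows this generic machinery only, under assumptions: the constraint matrix u (here, deviations of auxiliary variables from known population means, a benchmarking-type constraint) and all names are illustrative, and the paper's propensity-score-based bias-calibration constraints are not reproduced.

```python
import numpy as np

def el_weights(u, tol=1e-10, max_iter=100):
    """Empirical likelihood weights w_i = 1 / (n (1 + lam' u_i)), maximizing
    sum_i log w_i subject to sum_i w_i = 1 and sum_i w_i u_i = 0.
    u: (n, k) matrix of constraint functions evaluated at each sample unit."""
    n, k = u.shape
    lam = np.zeros(k)
    for _ in range(max_iter):
        denom = 1.0 + u @ lam                     # must stay positive
        score = (u / denom[:, None]).sum(axis=0)  # sum_i u_i / (1 + lam'u_i)
        hess = (u[:, :, None] * u[:, None, :]
                / (denom ** 2)[:, None, None]).sum(axis=0)
        step = np.linalg.solve(hess, score)       # Newton step on lam
        lam += step
        if np.linalg.norm(step) < tol:
            break
    return 1.0 / (n * (1.0 + u @ lam))

# Hypothetical benchmarking constraint: weighted means of auxiliary
# variables x must match assumed-known population means x_bar_pop.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 2))
x_bar_pop = np.zeros(2)                 # assumed known from a frame or census
w = el_weights(x - x_bar_pop)
print(w.sum(), (w[:, None] * x).sum(axis=0))  # ~1 and ~x_bar_pop
```

At the solution the weights automatically sum to one, and the weighted means of x reproduce x_bar_pop; in a real application the weights would be rescaled by the population size and combined with the bias-calibration constraints described above.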
Award ID(s): 1931380
PAR ID: 10491976
Author(s) / Creator(s): ;
Publisher / Repository: Sage Journals
Date Published:
Journal Name: Calcutta Statistical Association Bulletin
Volume: 75
Issue: 1
ISSN: 0008-0683
Page Range / eLocation ID: 8 to 27
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. We propose a new estimator for average causal effects of a binary treatment with panel data in settings with general treatment patterns. Our approach augments the popular two‐way‐fixed‐effects specification with unit‐specific weights that arise from a model for the assignment mechanism. We show how to construct these weights in various settings, including the staggered adoption setting, where units opt into the treatment sequentially but permanently. The resulting estimator converges to an average (over units and time) treatment effect under the correct specification of the assignment model, even if the fixed‐effects model is misspecified. We show that our estimator is more robust than the conventional two‐way estimator: it remains consistent if either the assignment mechanism or the two‐way regression model is correctly specified. In addition, the proposed estimator performs better than the two‐way‐fixed‐effects estimator if the outcome model and assignment mechanism are locally misspecified. This strong robustness property underlines and quantifies the benefits of modeling the assignment process and motivates using our estimator in practice. We also discuss an extension of our estimator to handle dynamic treatment effects.
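To make the weighting step concrete, here is a minimal sketch of a weighted two-way fixed-effects fit, assuming the unit-specific weights have already been constructed. The function name and dummy-variable construction are illustrative; in the paper the weights come from a model of the assignment mechanism rather than being supplied directly.

```python
import numpy as np

def weighted_twfe(y, d, unit, time, w):
    """Treatment coefficient from the regression
    y_it = alpha_i + beta_t + tau * d_it + eps_it, fit by weighted least
    squares. unit/time: 0-based integer label arrays; w: per-observation
    weights (e.g., a unit-specific weight repeated over time periods)."""
    U = np.eye(unit.max() + 1)[unit]         # unit dummies
    T = np.eye(time.max() + 1)[time][:, 1:]  # time dummies (drop one)
    X = np.column_stack([d, U, T])
    sw = np.sqrt(w)                          # WLS via rescaled OLS
    coef, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return coef[0]                           # tau-hat
```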
  2. Propensity score weighting is a tool for causal inference that adjusts for measured confounders in observational studies. In practice, data often present complex structures, such as clustering, which make propensity score modeling and estimation challenging. In addition, for clustered data there may be unmeasured cluster-level covariates that are related to both the treatment assignment and the outcome. When such unmeasured cluster-specific confounders exist and are omitted from the propensity score model, the subsequent propensity score adjustment may be biased. In this article, we propose a calibration technique for propensity score estimation under the latent ignorable treatment assignment mechanism, i.e., the treatment-outcome relationship is unconfounded given the observed covariates and the latent cluster-specific confounders. We impose novel balance constraints which imply exact balance of the observed confounders and the unobserved cluster-level confounders between the treatment groups. We show that the proposed calibrated propensity score weighting estimator is doubly robust: it is consistent for the average treatment effect if either the propensity score model is correctly specified or the outcome follows a linear mixed effects model. Moreover, the proposed weighting method can be combined with sampling weights for an integrated solution that handles both confounding and sampling designs for causal inference with clustered survey data. In simulation studies, we show that the proposed estimator is superior to competing estimators. We estimate the effect of School Body Mass Index Screening on the prevalence of overweight and obesity in elementary schools in Pennsylvania.
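To illustrate the flavor of exact balance constraints (not the paper's full construction, which also balances latent cluster-level confounders), the sketch below computes entropy-balancing-style calibration weights that force weighted treated-group covariate means to match a target; the function and variable names are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def calibration_weights(X_treat, target):
    """Weights w_i on treated units, w_i proportional to exp(x_i' lam),
    chosen so the weighted mean of X_treat equals `target`. Solved via the
    convex dual; its gradient is (weighted mean - target), zero at optimum."""
    def dual(lam):
        z = X_treat @ lam
        m = z.max()                              # stabilize log-sum-exp
        return m + np.log(np.exp(z - m).sum()) - target @ lam
    res = minimize(dual, np.zeros(X_treat.shape[1]), method="BFGS")
    w = np.exp(X_treat @ res.x)
    return w / w.sum()

# Hypothetical usage: balance treated covariates to full-sample means.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))
treated = rng.random(500) < 0.3
w = calibration_weights(X[treated], X.mean(axis=0))
print((w[:, None] * X[treated]).sum(axis=0))     # ~ X.mean(axis=0)
```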
  3. Given a sample of length n from R^d and a neural network with a fixed architecture with W weights, k neurons, linear threshold activation functions, and binary outputs on each neuron, we study the problem of uniformly sampling from all possible labelings of the sample corresponding to different choices of weights. We provide an algorithm that runs in time polynomial in both n and W such that any labeling appears with probability at least (W/(2ekn))^W for W < n. For a single neuron, we also provide a random-walk-based algorithm that samples exactly uniformly.
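For concreteness, the sketch below draws labelings of a sample by a single linear-threshold neuron with Gaussian weights. This naive sampler is not uniform over achievable labelings, because a labeling's probability equals the Gaussian measure of its cone in weight space; it only illustrates the object being sampled, not the paper's algorithm.

```python
import numpy as np

def random_labelings(X, n_draws=1000, rng=None):
    """Distinct labelings sign(X w > 0) of a sample X (n x d) produced by a
    single linear threshold neuron with Gaussian weights. NOT uniform over
    the achievable labelings: each labeling's frequency is the Gaussian
    measure of its cone of weight vectors, which varies across labelings."""
    rng = np.random.default_rng(rng)
    seen = set()
    for _ in range(n_draws):
        w = rng.standard_normal(X.shape[1])
        seen.add(tuple((X @ w > 0).astype(int)))
    return seen

X = np.random.default_rng(2).normal(size=(6, 2))
print(len(random_labelings(X)))   # number of distinct labelings found
```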
  4. Rejuvenation in particle filters is necessary to prevent the collapse of the weights when the number of particles is insufficient to properly sample the high-probability regions of the state space. Rejuvenation is often implemented heuristically by adding random noise that widens the support of the ensemble. This work aims to improve the canonical rejuvenation methodology by introducing additional prior information obtained from climatological samples; the dynamical particles used for importance sampling are augmented with samples obtained from stochastic covariance shrinkage. A localized variant of the proposed method is also developed. Numerical experiments with the Lorenz '63 model show that the modified filters significantly improve the analyses for low dynamical ensemble sizes. Furthermore, localization experiments with the Lorenz '96 model show that the proposed methodology extends to larger systems.
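A minimal sketch of the augmentation step, assuming access to climatological samples: blend the small-ensemble covariance with a climatological covariance and draw rejuvenation noise from the blend. The function name and the fixed shrinkage factor gamma are illustrative; in the paper the shrinkage is stochastic.

```python
import numpy as np

def rejuvenate(particles, clim_samples, gamma=0.5, rng=None):
    """Perturb an ensemble (m x d) with Gaussian noise whose covariance is a
    shrinkage blend of the ensemble covariance (often rank-deficient for
    small ensembles) and a covariance estimated from climatological samples."""
    rng = np.random.default_rng(rng)
    ens_cov = np.cov(particles, rowvar=False)
    clim_cov = np.cov(clim_samples, rowvar=False)
    cov = (1 - gamma) * ens_cov + gamma * clim_cov  # shrink toward climatology
    noise = rng.multivariate_normal(np.zeros(particles.shape[1]),
                                    cov, size=particles.shape[0])
    return particles + noise
```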
  5. We consider the problem of estimating attack signals injected into the actuators or sensors of control systems, assuming the attack is detectable, i.e., it can be seen at the output. We show that there is a trade-off between attack rejection and control, and that the estimator design depends on the controller used. We use dual rate sampling to enhance detectability of the attacks, and we provide three methods to design the estimator. The first solves a model matching problem subject to causality constraints. The second exploits dual rate sampling to accurately reconstruct the unknown input. The third uses a dual rate unknown input observer. We provide conditions for the existence of these estimators and show that dual rate unknown input observers always exist if the multirate system does not have a zero at 1.
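For a feel of the input-reconstruction idea, the sketch below recovers an unknown input of a linear discrete-time system from its outputs by least squares, assuming a known initial state and a single sampling rate. All names are illustrative, and this toy does not implement the paper's dual rate designs.

```python
import numpy as np

def reconstruct_input(A, E, C, y, x0):
    """Least-squares estimate of d_0..d_{T-1} from outputs y_1..y_T of
    x_{k+1} = A x_k + E d_k, y_k = C x_k, with known x0. Stacks
    y_k = C A^k x0 + sum_j C A^{k-1-j} E d_j into one linear system."""
    T = len(y) - 1
    n, m, p = A.shape[0], E.shape[1], C.shape[0]
    Apow = [np.eye(n)]                       # powers of A: A^0 .. A^T
    for _ in range(T):
        Apow.append(A @ Apow[-1])
    H = np.zeros((p * T, m * T))             # block Toeplitz of C A^j E
    free = np.zeros(p * T)                   # free response C A^k x0
    for k in range(1, T + 1):
        free[(k - 1) * p:k * p] = C @ Apow[k] @ x0
        for j in range(k):
            H[(k - 1) * p:k * p, j * m:(j + 1) * m] = C @ Apow[k - 1 - j] @ E
    rhs = np.concatenate([np.atleast_1d(y[k]) for k in range(1, T + 1)]) - free
    d, *_ = np.linalg.lstsq(H, rhs, rcond=None)
    return d.reshape(T, m)
```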