Data-driven Deep Learning (DL) models have revolutionized autonomous systems, but ensuring their safety and reliability requires assessing predictive confidence, or uncertainty. Bayesian DL provides a principled approach to quantifying uncertainty via probability density functions defined over model parameters. However, the exact solution is intractable for most DL models, and the approximation methods, which are often based on heuristics, suffer from scalability issues, impose stringent distributional assumptions, and may lack theoretical guarantees. This work develops a Sequential Importance Sampling framework that approximates the posterior probability density function through weighted samples (or particles), which can be used to compute the mean, variance, or higher-order moments of the posterior distribution. We demonstrate that propagating particles, which capture information about the higher-order moments, through the layers of the DL model results in increased robustness to natural and malicious noise (adversarial attacks). The variance computed from these particles effectively quantifies the model's decision uncertainty, yielding well-calibrated and accurate predictive confidence.
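As a toy illustration of the particle idea (ours, not the paper's implementation: a one-parameter linear model stands in for a DL network, and no resampling step is included), sequential importance sampling reweights prior parameter particles by the likelihood of each observation, then reads posterior and predictive moments off the weighted set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression "model": y = w * x, with unknown weight w.
# Weighted particles approximate the posterior over w.
def predict(w, x):
    return w * x

# Prior particles and uniform initial weights.
n_particles = 500
particles = rng.normal(0.0, 2.0, size=n_particles)   # w ~ N(0, 2^2)
weights = np.full(n_particles, 1.0 / n_particles)

# Observations generated with true w = 1.5 and noise sigma = 0.3.
x_obs = rng.uniform(-1, 1, size=20)
y_obs = 1.5 * x_obs + rng.normal(0, 0.3, size=20)

# Sequentially importance-weight each particle by its Gaussian likelihood.
sigma = 0.3
for x, y in zip(x_obs, y_obs):
    lik = np.exp(-0.5 * ((y - predict(particles, x)) / sigma) ** 2)
    weights *= lik
    weights /= weights.sum()          # renormalize after every update

# Posterior moments from the weighted particles.
post_mean = np.sum(weights * particles)
post_var = np.sum(weights * (particles - post_mean) ** 2)

# Predictive mean and decision uncertainty (variance) at a new input.
x_new = 0.8
preds = predict(particles, x_new)
pred_mean = np.sum(weights * preds)
pred_var = np.sum(weights * (preds - pred_mean) ** 2)
```

A practical implementation would add a resampling step when the effective sample size collapses, and would propagate particles layer by layer through the network rather than through a scalar map.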
SVAR Identification from Higher Moments: Has the Simultaneous Causality Problem Been Solved?
Two recent strands of the structural vector autoregression literature use higher moments for identification, exploiting either non-Gaussianity or heteroskedasticity. These approaches achieve point identification without exclusion or sign restrictions. We review this work critically and contrast its goals with the separate research program that has pushed for macroeconometrics to rely more heavily on credible economic restrictions. Identification from higher moments imposes stronger assumptions on the shock process than second-order methods do. We recommend that these assumptions be tested. Since inference from higher moments places high demands on a finite sample, weak identification issues should be given priority by applied users.
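A minimal numerical sketch of identification from non-Gaussianity (our construction, not a method from the paper): in a static bivariate model, second moments fix the impact matrix only up to rotation of the whitened errors, while higher-order cross-cumulants pin the rotation down when the shocks are independent and non-Gaussian:

```python
import numpy as np

rng = np.random.default_rng(1)

# Static bivariate analogue of the SVAR problem: reduced-form errors
# e = B @ u, with structural shocks u independent and non-Gaussian.
n = 200_000
u = rng.laplace(size=(2, n))          # independent, heavy-tailed shocks
u /= u.std(axis=1, keepdims=True)     # unit variance
B_true = np.array([[1.0, 0.5],
                   [0.3, 1.0]])
e = B_true @ u

# Whitening uses second moments only and leaves a rotation free.
Sigma = e @ e.T / n
L = np.linalg.cholesky(Sigma)
w = np.linalg.solve(L, e)             # sample covariance of w is exactly I

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Fourth-order cross-cumulants vanish only when the candidate shocks
# are independent, so minimizing them selects the rotation.
def cross_cumulant_loss(theta):
    s1, s2 = rotation(theta) @ w
    c13 = np.mean(s1**3 * s2)             # cum(s1, s1, s1, s2)
    c31 = np.mean(s1 * s2**3)             # cum(s1, s2, s2, s2)
    c22 = np.mean(s1**2 * s2**2) - 1.0    # cum(s1, s1, s2, s2)
    return c13**2 + c31**2 + c22**2

thetas = np.linspace(0, np.pi / 2, 181)
losses = [cross_cumulant_loss(t) for t in thetas]
theta_hat = thetas[int(np.argmin(losses))]
B_hat = L @ rotation(theta_hat).T     # impact matrix, up to column sign/order
```

If the shocks were Gaussian, every rotation would fit all moments and the loss would be flat; that is precisely the identification failure that the higher-moment literature exploits, and why the Gaussianity of estimated shocks is worth testing.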
- Award ID(s): 1851665
- PAR ID: 10343525
- Date Published:
- Journal Name: AEA Papers and Proceedings
- Volume: 112
- ISSN: 2574-0768
- Page Range / eLocation ID: 481 to 485
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Quantum reference frames are expected to differ from classical reference frames because they have to implement typical quantum features such as fluctuations and correlations. Here, we show that fluctuations and correlations of reference variables, in particular of time, are restricted by their very nature of being used for reference. Mathematically, this property is implemented by imposing constraints on the system to make sure that reference variables are not physical degrees of freedom. These constraints not only relate physical degrees of freedom to reference variables in order to describe their behavior, they also restrict quantum fluctuations of reference variables and their correlations with system degrees of freedom. We introduce the notion of “almost-positive” states as a suitable mathematical method. An explicit application of their properties to examples of recent interest in quantum reference frames reveals previously unrecognized restrictions on possible frame–system interactions. While currently discussed clock models rely on assumptions that, as shown here, make them consistent as quantum reference frames, relaxing these assumptions will expose the models to new restrictions that appear to be rather strong. Almost-positive states also shed some light on a recent debate about the consistency of relational quantum mechanics.
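In schematic form (standard clock-model notation, not taken from this paper), tying a reference variable such as time to the system through a constraint looks like:

```latex
% t: clock variable with conjugate momentum \hat{p}_t;
% \hat{H}: system Hamiltonian. Physical states are annihilated by the
% constraint, so the clock is not an independent physical degree of freedom.
\hat{C} = \hat{p}_t + \hat{H}, \qquad \hat{C}\,|\psi\rangle = 0 .
```

Restrictions on the fluctuations and correlations of the clock variable then follow from requiring states to satisfy (at least approximately) this constraint.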
The literature on stochastic programming typically restricts attention to problems that fulfill constraint qualifications. The literature on estimation and inference under partial identification frequently restricts the geometry of identified sets with diverse high-level assumptions. These superficially appear to be different approaches to closely related problems. We extensively analyze their relation. Among other things, we show that for partial identification through pure moment inequalities, numerous assumptions from the literature essentially coincide with the Mangasarian–Fromowitz constraint qualification. This clarifies the relation between well-known contributions, including within econometrics, and elucidates stringency, as well as ease of verification, of some high-level assumptions in seminal papers.
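For reference, the Mangasarian–Fromowitz constraint qualification specialized to pure inequality constraints (the case relevant for moment inequalities) can be stated as follows; this is the textbook definition, not notation from the paper itself:

```latex
% Feasible set defined by smooth inequality constraints g_j(x) \le 0.
% MFCQ holds at a feasible point \bar{x} iff some direction d strictly
% decreases every active constraint:
\exists\, d \in \mathbb{R}^n \ \text{such that}\quad
\nabla g_j(\bar{x})^{\top} d < 0
\quad \text{for all } j \text{ with } g_j(\bar{x}) = 0 .
```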
Shape restrictions have played a central role in economics as both testable implications of theory and sufficient conditions for obtaining informative counterfactual predictions. In this paper, we provide a general procedure for inference under shape restrictions in identified and partially identified models defined by conditional moment restrictions. Our test statistics and proposed inference methods are based on the minimum of the generalized method of moments (GMM) objective function with and without shape restrictions. Uniformly valid critical values are obtained through a bootstrap procedure that approximates a subset of the true local parameter space. In an empirical analysis of the effect of childbearing on female labor supply, we show that employing shape restrictions in linear instrumental variables (IV) models can lead to shorter confidence regions for both local and average treatment effects. Other applications we discuss include inference for the variability of quantile IV treatment effects and for bounds on average equivalent variation in a demand model with general heterogeneity.
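Schematically (our notation, not necessarily the paper's), a statistic built from the restricted and unrestricted minima of the GMM objective takes the form:

```latex
% \hat{g}_n(\theta): sample moments; \hat{W}_n: weighting matrix;
% \Theta_R \subseteq \Theta: the shape-restricted parameter set.
\hat{Q}_n(\theta) = \hat{g}_n(\theta)^{\top}\, \hat{W}_n\, \hat{g}_n(\theta),
\qquad
T_n = n\Big[\min_{\theta \in \Theta_R} \hat{Q}_n(\theta)
        \;-\; \min_{\theta \in \Theta} \hat{Q}_n(\theta)\Big] \;\ge\; 0 ,
```

with critical values for such statistics obtained by bootstrap rather than from a fixed asymptotic distribution.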
Many physical tasks such as pulling out a drawer or wiping a table can be modeled with geometric constraints. These geometric constraints are characterized by restrictions on kinematic trajectories and reaction wrenches (forces and moments) of objects under the influence of the constraint. This paper presents a method to infer geometric constraints involving unmodeled objects in human demonstrations using both kinematic and wrench measurements. Our approach takes a recording of a human demonstration and determines what constraints are present, when they occur, and their parameters (e.g. positions). By using both kinematic and wrench information, our methods are able to reliably identify a variety of constraint types, even if the constraints only exist for short durations within the demonstration. We present a systematic approach to fitting arbitrary scleronomic constraint models to kinematic and wrench measurements. Reaction forces are estimated from measurements by removing friction. Position, orientation, force, and moment error metrics are developed to provide systematic comparison between constraint models. By conducting a user study, we show that our methods can reliably identify constraints in realistic situations and confirm the value of including forces and moments in the model regression and selection process.
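As an illustrative sketch of constraint-model selection (our toy example, not the paper's estimator, and using positions only rather than wrenches): fit two candidate scleronomic models, prismatic (straight-line motion) and revolute (circular motion), to a noisy trajectory and keep the one with the smaller residual:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic demonstration: a handle swinging on a hinge traces an arc.
angles = np.linspace(0.2, 1.4, 60)
center, radius = np.array([0.4, -0.1]), 0.5
traj = center + radius * np.stack([np.cos(angles), np.sin(angles)], axis=1)
traj += rng.normal(0, 0.002, traj.shape)         # measurement noise

def line_residual(p):
    # RMS distance of points to their best-fit line (via SVD/PCA).
    q = p - p.mean(axis=0)
    _, s, _ = np.linalg.svd(q, full_matrices=False)
    return s[-1] / np.sqrt(len(p))

def circle_residual(p):
    # Algebraic (Kasa) circle fit: x^2 + y^2 = 2ax + 2by + c.
    A = np.column_stack([2 * p[:, 0], 2 * p[:, 1], np.ones(len(p))])
    b = (p ** 2).sum(axis=1)
    (a, bb, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + a ** 2 + bb ** 2)
    d = np.linalg.norm(p - np.array([a, bb]), axis=1) - r
    return np.sqrt(np.mean(d ** 2)), np.array([a, bb]), r

rms_line = line_residual(traj)
rms_circle, center_hat, radius_hat = circle_residual(traj)

# Model selection: keep the constraint with the smaller position error.
best = "revolute" if rms_circle < rms_line else "prismatic"
```

The paper's method additionally scores candidate models with force and moment error metrics, which is what lets it disambiguate constraints that look kinematically similar over short durations.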