%AGordon, S.L.
%AKumar, V.
%ASchulman, L.
%ASrivastava, P.
%Ede Campos, C.
%EMaathuis, M.
%D2021
%JProceedings of Machine Learning Research
%V161
%P1948-1957
%MOSTI ID: 10377068
%TCondition number bounds for causal inference
%XAn important achievement in the field of causal inference was a complete characterization of when a causal effect, in a system modeled by a causal graph, can be determined uniquely from purely observational data. The identification algorithms resulting from this work produce exact symbolic expressions for causal effects in terms of the observational probabilities. More recent work has examined the numerical properties of these expressions, in particular using the classical notion of the condition number. In its classical interpretation, the condition number quantifies the sensitivity of an expression's output values to small numerical perturbations in the input observational probabilities. In the context of causal identification, the condition number has also been shown to be related to the effect of certain kinds of uncertainties in the structure of the causal graphical model. In this paper, we first give an upper bound on the condition number for the interesting case of causal graphical models with small "confounded components". We then develop a tight characterization of the condition number of any given causal identification problem. Finally, we use our tight characterization to give a specific example where the condition number can be much lower than that obtained via generic bounds, and to show that even "equivalent" expressions for causal identification can behave very differently with respect to their numerical stability properties.
%0Journal Article