Abstract When implementing Markov chain Monte Carlo (MCMC) algorithms, perturbation caused by numerical errors is sometimes inevitable. This paper studies how such perturbation affects the convergence speed and approximation accuracy of MCMC. Our results show that when the original Markov chain converges to stationarity fast enough and the perturbed transition kernel is a good approximation to the original one, the corresponding perturbed sampler also enjoys fast convergence and high approximation accuracy. Our convergence analysis is conducted under either the Wasserstein metric or the $\chi^2$ metric, both of which are widely used in the literature. The results can be extended to obtain non-asymptotic error bounds for MCMC estimators. We demonstrate how to apply our convergence and approximation results to the analysis of specific sampling algorithms, including random walk Metropolis and the Metropolis-adjusted Langevin algorithm with perturbed target densities, and parallel tempering Monte Carlo with perturbed densities. Finally, we present some simple numerical examples to verify our theoretical claims.
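The perturbed-kernel setting the abstract describes can be illustrated with a minimal random walk Metropolis sketch: the sampler targets a slightly perturbed log-density instead of the exact one. Everything below (the toy target, the sinusoidal perturbation, the step size) is an illustrative assumption, not the paper's construction.

```python
import math
import random

def rwm_sample(log_density, x0, n_steps, step_size=0.5, seed=0):
    """Random walk Metropolis: Gaussian proposals, accepted with
    probability min(1, pi(y)/pi(x)) under the supplied log-density."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        y = x + rng.gauss(0.0, step_size)
        # Accept/reject using the (possibly perturbed) log-density.
        if math.log(rng.random()) < log_density(y) - log_density(x):
            x = y
        samples.append(x)
    return samples

# Exact standard-normal log-density and a small perturbation of it,
# standing in for numerical error in evaluating the target.
exact_logpi = lambda x: -0.5 * x * x
perturbed_logpi = lambda x: -0.5 * x * x + 0.01 * math.sin(x)

chain = rwm_sample(perturbed_logpi, x0=0.0, n_steps=20000)
mean_est = sum(chain) / len(chain)
```

Because the perturbation of the log-density is uniformly small, the chain's long-run averages stay close to those of the exact standard normal, matching the qualitative claim of the abstract.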
Malliavin-Based Multilevel Monte Carlo Estimators for Densities of Max-Stable Processes
We introduce a class of unbiased Monte Carlo estimators for multivariate densities of max-stable fields generated by Gaussian processes. Our estimators take advantage of recent results on the exact simulation of max-stable fields combined with identities studied in the Malliavin calculus literature and ideas developed in the multilevel Monte Carlo literature. Our approach allows estimating multivariate densities of max-stable fields with precision $\varepsilon$ at a computational cost of order $O(\varepsilon^{-2} \log\log\log(1/\varepsilon))$.
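The multilevel Monte Carlo idea behind such cost bounds can be sketched generically: write the target expectation as a telescoping sum of level corrections and spend fewer samples on the expensive fine levels. The toy level hierarchy below (truncated Taylor approximations of the exponential) is an illustrative assumption, not the Malliavin-based estimator of the paper.

```python
import math
import random

def mlmc_estimate(sampler, max_level, n_per_level, seed=0):
    """Multilevel Monte Carlo: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}],
    with each telescoping term estimated from an independent batch."""
    rng = random.Random(seed)
    total = 0.0
    for level in range(max_level + 1):
        n = n_per_level[level]
        corr = 0.0
        for _ in range(n):
            u = rng.random()  # common random input couples the two levels
            fine = sampler(level, u)
            coarse = sampler(level - 1, u) if level > 0 else 0.0
            corr += fine - coarse
        total += corr / n
    return total

# Toy hierarchy: P_l(u) approximates exp(u) by its Taylor sum with l+2
# terms, so the level corrections shrink rapidly as l grows.
def taylor_exp(level, u):
    return sum(u**k / math.factorial(k) for k in range(level + 2))

# Sample counts decrease on finer (more accurate, nominally costlier) levels.
est = mlmc_estimate(taylor_exp, max_level=4,
                    n_per_level=[4000, 2000, 1000, 500, 250])
```

Here the estimator targets E[exp(U)] = e - 1 for U uniform on (0, 1); the decreasing sample counts mimic how MLMC balances per-level variance against per-level cost.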
- Award ID(s): 1838576
- PAR ID: 10074255
- Date Published:
- Journal Name: Monte Carlo and Quasi-Monte Carlo Methods 2016
- Page Range / eLocation ID: 75-97
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Abstract Computational models of the cardiovascular system are increasingly used for the diagnosis, treatment, and prevention of cardiovascular disease. Before being used for translational applications, the predictive abilities of these models need to be thoroughly demonstrated through verification, validation, and uncertainty quantification. When results depend on multiple uncertain inputs, sensitivity analysis is typically the first step required to separate relevant from unimportant inputs, and is key to determining an initial reduction of the problem dimensionality that will significantly affect the cost of all downstream analysis tasks. For computationally expensive models with numerous uncertain inputs, sample‐based sensitivity analysis may become impractical due to the substantial number of model evaluations it typically necessitates. To overcome this limitation, we consider recently proposed multifidelity Monte Carlo estimators for Sobol’ sensitivity indices, and demonstrate their applicability to an idealized model of the common carotid artery. Variance reduction is achieved by combining a small number of three‐dimensional fluid–structure interaction simulations with affordable one‐ and zero‐dimensional reduced‐order models. These multifidelity Monte Carlo estimators are compared with traditional Monte Carlo and polynomial chaos expansion estimates. Specifically, we show consistent sensitivity ranks for both bi‐ (1D/0D) and tri‐fidelity (3D/1D/0D) estimators, and superior variance reduction compared to traditional single‐fidelity Monte Carlo estimators for the same computational budget. As the computational burden of Monte Carlo estimators for Sobol’ indices is significantly affected by the problem dimensionality, polynomial chaos expansion is found to have lower computational cost for idealized models with smooth stochastic response.
-
This article develops a Markov chain Monte Carlo (MCMC) method for a class of models that encompasses finite and countable mixtures of densities and mixtures of experts with a variable number of mixture components. The method is shown to maximize the expected probability of acceptance for cross-dimensional moves and to minimize the asymptotic variance of sample average estimators under certain restrictions. The method can be represented as a retrospective sampling algorithm with an optimal choice of auxiliary priors and as a reversible jump algorithm with optimal proposal distributions. The method is primarily motivated by and applied to a Bayesian nonparametric model for conditional densities based on mixtures of a variable number of experts. The mixture of experts model outperforms standard parametric and nonparametric alternatives in out-of-sample performance comparisons in an application to Engel curve estimation. The proposed MCMC algorithm makes estimation of this model practical.
-
We estimate the parameter of a stationary time series process by minimizing the integrated weighted mean squared error between the empirical and simulated characteristic function, when the true characteristic function cannot be explicitly computed. Motivated by indirect inference, we use a Monte Carlo approximation of the characteristic function based on i.i.d. simulated blocks. As a classical variance reduction technique, we propose the use of control variates for reducing the variance of this Monte Carlo approximation. These two approximations yield two new estimators that are applicable to a large class of time series processes. We show consistency and asymptotic normality of the parameter estimators under strong mixing, moment conditions, and smoothness of the simulated blocks with respect to the parameter. In a simulation study we show the good performance of these new simulation-based estimators, and the superiority of the control-variate-based estimator for Poisson-driven time series of counts.
-
We present a novel technique for tailoring Bayesian quadrature (BQ) to model selection. The state of the art for comparing the evidence of multiple models relies on Monte Carlo methods, which converge slowly and are unreliable for computationally expensive models. Although previous research has shown that BQ offers sample efficiency superior to Monte Carlo in computing the evidence of an individual model, applying BQ directly to model comparison may waste computation by producing an overly accurate estimate for the evidence of a clearly poor model. We propose an automated and efficient algorithm for computing the most relevant quantity for model selection: the posterior model probability. Our technique maximizes the mutual information between this quantity and observations of the models’ likelihoods, yielding efficient sample acquisition across disparate model spaces when likelihood observations are limited. Our method produces more accurate posterior estimates using fewer likelihood evaluations than standard Bayesian quadrature and Monte Carlo estimators, as we demonstrate on synthetic and real-world examples.
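The control-variate construction used for variance reduction in the characteristic-function abstract above admits a compact generic sketch. The toy integrand, the sample-estimated coefficient, and all names below are illustrative assumptions, not that paper's estimator.

```python
import math
import random

def mc_with_control_variate(g, h, h_mean, n, seed=0):
    """Estimate E[g(X)] using a control variate h(X) with known mean
    h_mean: averaging g - c*(h - h_mean) stays unbiased while cancelling
    the part of g's noise that is correlated with h."""
    rng = random.Random(seed)
    xs = [rng.gauss(0.0, 1.0) for _ in range(n)]
    gs = [g(x) for x in xs]
    hs = [h(x) for x in xs]
    mg, mh = sum(gs) / n, sum(hs) / n
    # Near-optimal coefficient c* = Cov(g, h) / Var(h), from the sample.
    cov = sum((a - mg) * (b - mh) for a, b in zip(gs, hs)) / (n - 1)
    var = sum((b - mh) ** 2 for b in hs) / (n - 1)
    c = cov / var
    return sum(a - c * (b - h_mean) for a, b in zip(gs, hs)) / n

# Toy example: E[exp(X)] for X ~ N(0,1), true value exp(1/2); h(x) = x
# with known mean 0 is strongly correlated with exp(x) near the origin.
est = mc_with_control_variate(math.exp, lambda x: x, 0.0, n=5000)
```

Estimating the coefficient c from the same sample introduces a small bias that vanishes as n grows; in the time-series setting above, the same idea is applied to simulated blocks rather than i.i.d. Gaussian draws.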