Title: Indirect inference for time series using the empirical characteristic function and control variates
We estimate the parameter of a stationary time series process by minimizing the integrated weighted mean squared error between the empirical and simulated characteristic functions, when the true characteristic function cannot be computed explicitly. Motivated by indirect inference, we use a Monte Carlo approximation of the characteristic function based on i.i.d. simulated blocks. As a classical variance reduction technique, we propose the use of control variates to reduce the variance of this Monte Carlo approximation. These two approximations yield two new estimators that are applicable to a large class of time series processes. We show consistency and asymptotic normality of the parameter estimators under strong mixing, moment conditions, and smoothness of the simulated blocks with respect to the parameter. A simulation study demonstrates the good performance of these new simulation-based estimators and the superiority of the control-variate-based estimator for Poisson-driven time series of counts.
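To make the construction concrete, the following is a minimal sketch, not the authors' implementation, of the weighted L2 distance between the empirical characteristic function (ECF) of the observed blocks and a Monte Carlo ECF of simulated blocks, with an optional control-variate correction. The names simulate_blocks, control_stat, and control_mean are hypothetical user-supplied callables: a block simulator, a control statistic computed on the simulated blocks, and its known expectation under the parameter.

```python
import numpy as np

def ecf(blocks, t_grid):
    """ECF of i.i.d. blocks: (1/n) sum_j exp(i <t, X_j>) for each grid row.

    blocks: (n, p) array of length-p blocks; t_grid: (G, p) array of arguments.
    Returns a length-G complex vector."""
    return np.exp(1j * blocks @ t_grid.T).mean(axis=0)

def cv_ecf(sim_blocks, t_grid, control, control_mean):
    """Control-variate-corrected Monte Carlo ECF.

    control: (M,) samples of a statistic whose expectation control_mean is
    known in closed form under the simulating parameter."""
    H = np.exp(1j * sim_blocks @ t_grid.T)              # (M, G)
    c = control - control_mean                          # centered control
    # near-optimal coefficient per grid point: Cov(H, c) / Var(c)
    beta = (c[:, None] * (H - H.mean(axis=0))).mean(axis=0) / c.var()
    return (H - beta[None, :] * c[:, None]).mean(axis=0)

def ecf_distance(theta, obs_blocks, t_grid, weights, simulate_blocks, rng,
                 n_sim=500, control_stat=None, control_mean=None):
    """Grid approximation of the integrated weighted squared ECF distance."""
    sim = simulate_blocks(theta, n_sim, rng)            # (n_sim, p) i.i.d. blocks
    phi_obs = ecf(obs_blocks, t_grid)
    if control_stat is None:
        phi_sim = ecf(sim, t_grid)
    else:
        phi_sim = cv_ecf(sim, t_grid, control_stat(sim), control_mean(theta))
    return float(np.sum(weights * np.abs(phi_obs - phi_sim) ** 2))
```

Minimizing ecf_distance over theta with common random numbers (a fixed seed per evaluation, so the objective is smooth in theta) yields the first simulation-based estimator; passing control_stat and control_mean switches on the control-variate version.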
Award ID(s):
2015379
PAR ID:
10359854
Author(s) / Creator(s):
Publisher / Repository:
Wiley-Blackwell
Date Published:
Journal Name:
Journal of Time Series Analysis
Volume:
42
Issue:
5-6
ISSN:
0143-9782
Page Range / eLocation ID:
p. 653-684
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. “[A]llain Gersten, Hopfen, und Wasser” (“only barley, hops, and water”) — 1516 Reinheitsgebot. We present Bavarian, a collection of sampling-based algorithms for approximating the Betweenness Centrality (BC) of all vertices in a graph. Our algorithms use Monte-Carlo Empirical Rademacher Averages (MCERAs), a concept from statistical learning theory, to efficiently compute tight bounds on the maximum deviation of the estimates from the exact values. The MCERAs provide a sample-dependent approximation guarantee much stronger than the state of the art, thanks to their use of variance-aware probabilistic tail bounds. The flexibility of the MCERAs allows us to introduce a unifying framework that can be instantiated with existing sampling-based estimators of BC, thus allowing a fair comparison between them, decoupled from the sample-complexity results with which they were originally introduced. Additionally, we prove novel sample-complexity results showing that, for all estimators, the sample size sufficient to achieve a desired approximation guarantee depends on the vertex-diameter of the graph, an easy-to-bound characteristic quantity. We also present progressive-sampling algorithms and extensions to other centrality measures, such as percolation centrality. Our extensive experimental evaluation of Bavarian shows the improvement over the state of the art made possible by the MCERAs (a 2–4× reduction in the error bound), and it allows us to assess the different trade-offs between sample size and accuracy guarantees offered by the different estimators.
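As a point of reference, here is a minimal generic sketch of a k-trial Monte Carlo Empirical Rademacher Average for a finite function family; it is not Bavarian's implementation. In the BC setting, the matrix rows would hold each vertex's estimator evaluated on each sampled shortest path.

```python
import numpy as np

def mc_era(values, k, rng):
    """k-trial Monte Carlo Empirical Rademacher Average.

    values: (F, m) matrix of the F functions evaluated at m sample points.
    Returns the average over k independent Rademacher sign vectors sigma of
    sup_f (1/m) sum_i sigma_i * values[f, i]."""
    _, m = values.shape
    sigma = rng.choice([-1.0, 1.0], size=(k, m))   # k Rademacher vectors
    corr = sigma @ values.T / m                    # (k, F) signed averages
    return corr.max(axis=1).mean()

rng = np.random.default_rng(0)
print(mc_era(rng.random((100, 1000)), k=10, rng=rng))
```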
  2. Abstract Computational models of the cardiovascular system are increasingly used for the diagnosis, treatment, and prevention of cardiovascular disease. Before being used for translational applications, the predictive abilities of these models need to be thoroughly demonstrated through verification, validation, and uncertainty quantification. When results depend on multiple uncertain inputs, sensitivity analysis is typically the first step required to separate relevant from unimportant inputs, and it is key to determining an initial reduction of the problem dimensionality that will significantly affect the cost of all downstream analysis tasks. For computationally expensive models with numerous uncertain inputs, sample-based sensitivity analysis may become impractical because of the substantial number of model evaluations it typically requires. To overcome this limitation, we consider recently proposed multifidelity Monte Carlo estimators for Sobol’ sensitivity indices and demonstrate their applicability to an idealized model of the common carotid artery. Variance reduction is achieved by combining a small number of three-dimensional fluid–structure interaction simulations with affordable one- and zero-dimensional reduced-order models. These multifidelity Monte Carlo estimators are compared with traditional Monte Carlo and polynomial chaos expansion estimates. Specifically, we show consistent sensitivity ranks for both bi- (1D/0D) and tri-fidelity (3D/1D/0D) estimators, and superior variance reduction compared to traditional single-fidelity Monte Carlo estimators for the same computational budget. As the computational burden of Monte Carlo estimators for Sobol’ indices is significantly affected by the problem dimensionality, polynomial chaos expansion is found to have lower computational cost for idealized models with smooth stochastic response.
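The variance-reduction principle behind these estimators can be illustrated with a bi-fidelity control-variate estimate of a plain mean. This is a sketch only: the paper's estimators target Sobol' indices rather than a mean, and f_hi and f_lo are hypothetical stand-ins for the 3D model and a cheap reduced-order surrogate.

```python
import numpy as np

def bifidelity_mean(f_hi, f_lo, x_shared, x_extra):
    """Bi-fidelity control-variate estimate of E[f_hi].

    x_shared: small sample run through both models;
    x_extra: large additional sample for the cheap model only."""
    hi = np.array([f_hi(x) for x in x_shared])
    lo_s = np.array([f_lo(x) for x in x_shared])
    lo_all = np.concatenate([lo_s, np.array([f_lo(x) for x in x_extra])])
    # near-optimal control-variate coefficient from the shared sample
    alpha = np.cov(hi, lo_s)[0, 1] / lo_s.var(ddof=1)
    return hi.mean() + alpha * (lo_all.mean() - lo_s.mean())
```

The achievable variance reduction grows with the correlation between fidelities, which is why even crude 0D/1D surrogates help when they track the 3D response.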
  3. Jasra, Ajay (Ed.)
    Variational Bayesian (VB) methods produce posterior inference in a time frame considerably smaller than traditional Markov chain Monte Carlo approaches. Although the VB posterior is an approximation, it has been shown to produce good parameter estimates and predicted values when a rich class of approximating distributions is considered. In this paper, we propose the use of recursive algorithms to update a sequence of VB posterior approximations in an online, time series setting, with the computation of each posterior update requiring only the data observed since the previous update. We show how importance sampling can be incorporated into online variational inference, allowing the user to trade accuracy for a substantial increase in computational speed. The proposed methods and their properties are detailed in two separate simulation studies. Additionally, two empirical illustrations are provided, including one where a Dirichlet process mixture model with a novel posterior dependence structure is repeatedly updated in the context of predicting the future behaviour of vehicles on a stretch of US Highway 101.
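A minimal sketch of one such recursive update, assuming draws from the previous variational posterior are reweighted by the likelihood of only the newly arrived data and then refit. The names loglik_new and fit_variational are hypothetical placeholders; the paper's algorithms are more elaborate.

```python
import numpy as np

def online_is_update(prev_draws, loglik_new, fit_variational):
    """One online update step with importance sampling.

    prev_draws: (S, d) draws from the current variational posterior q;
    loglik_new: log-likelihood of the new data batch at a parameter value;
    fit_variational: weighted refit of q from draws and normalized weights."""
    logw = np.array([loglik_new(theta) for theta in prev_draws])
    w = np.exp(logw - logw.max())          # stabilized importance weights
    w /= w.sum()
    return fit_variational(prev_draws, w)  # refitted q(theta)
```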
  4. We consider the numerical analysis of the inchworm Monte Carlo method, which was recently proposed to tackle the numerical sign problem for open quantum systems. We focus on the growth of the numerical error with respect to the simulation time, for which the inchworm Monte Carlo method shows a flatter curve than the direct application of the Monte Carlo method to the classical Dyson series. To better understand the underlying mechanism of the inchworm Monte Carlo method, we distinguish two types of exponential error growth, known as the numerical sign problem and error amplification. The former is due to the fast growth of variance in the stochastic method, which can be observed in the Dyson series, and the latter comes from the evolution of the numerical solution. Our analysis demonstrates that the technique of partial resummation can be considered a tool to balance these two types of error, and the inchworm Monte Carlo method is a successful case in which the numerical sign problem is effectively suppressed by such means. We first demonstrate our idea in the context of ordinary differential equations, and then provide a complete analysis of the inchworm Monte Carlo method. Several numerical experiments are carried out to verify our theoretical results.
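A toy scalar caricature of the sign problem discussed above, not the inchworm method itself: Monte Carlo evaluation of the Dyson series u(T) = sum_n (aT)^n / n! for u' = a·u, u(0) = 1, sampling the term index from a Poisson proposal. For a < 0 the alternating terms make the variance grow exponentially in T while the target decays, so the relative error explodes.

```python
import numpy as np

def dyson_mc(a, T, n_samples, lam=1.0, seed=0):
    """Unbiased Monte Carlo estimate of u(T) = e^{a*T} via its Dyson series."""
    rng = np.random.default_rng(seed)
    n = rng.poisson(lam * T, size=n_samples)
    # importance weight of term n: [(a*T)^n / n!] / Poisson pmf = e^{lam*T} (a/lam)^n
    est = np.exp(lam * T) * (a / lam) ** n
    return est.mean(), est.std(ddof=1) / np.sqrt(n_samples)

# With a = -1 and lam = 1 the variance behaves like e^{2T} while the target
# e^{-T} decays -- the scalar analogue of the variance growth in the series.
print(dyson_mc(-1.0, 5.0, 10_000))
```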
  5. Normalizing flows (NFs) provide uncorrelated samples from complex distributions, making them an appealing tool for parameter estimation. However, the practical utility of NFs remains limited by their tendency to collapse to a single mode of a multimodal distribution. In this study, we show that annealing with an adaptive schedule based on the effective sample size (ESS) can mitigate mode collapse. We demonstrate that our approach can converge the marginal likelihood for a biochemical oscillator model fit to time-series data in ten-fold less computation time than a widely used ensemble Markov chain Monte Carlo (MCMC) method. We show that the ESS can also be used to reduce variance by pruning the samples. We expect these developments to be of general use for sampling with NFs and discuss potential opportunities for further improvements. 
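A minimal sketch of the ESS computation and an adaptive tempering step of the kind described, choosing the next inverse temperature by bisection so that the importance weights retain a target effective sample size. This is generic adaptive annealing under stated assumptions, not necessarily the authors' exact schedule.

```python
import numpy as np

def ess(logw):
    """Effective sample size of (unnormalized) log importance weights."""
    w = np.exp(logw - logw.max())
    w /= w.sum()
    return 1.0 / np.sum(w ** 2)

def next_beta(loglik, beta, target_frac=0.5, tol=1e-6):
    """loglik: per-sample log-likelihoods at the current NF samples.

    Returns beta' in (beta, 1] such that the weights w_i ∝
    exp((beta' - beta) * loglik_i) keep ESS >= target_frac * n."""
    n = len(loglik)
    if ess((1.0 - beta) * loglik) >= target_frac * n:
        return 1.0                          # can jump straight to the target
    lo, hi = beta, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if ess((mid - beta) * loglik) >= target_frac * n:
            lo = mid
        else:
            hi = mid
    return lo
```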