

Search for: All records

Award ID contains: 1918692


  1. Approximating probability distributions can be a challenging task, particularly when they are supported over regions of high geometrical complexity or exhibit multiple modes. Annealing can facilitate this task and is often combined with constant, a priori selected increments in inverse temperature. However, constant increments limit computational efficiency because they cannot adapt to situations where smooth changes in the annealed density could be handled equally well with larger increments. We introduce AdaAnn, an adaptive annealing scheduler that automatically adjusts the temperature increments based on the expected change in the Kullback-Leibler divergence between two distributions with sufficiently close annealing temperatures. AdaAnn is easy to implement and can be integrated into existing sampling approaches such as normalizing flows for variational inference and Markov chain Monte Carlo. We demonstrate the computational efficiency of the AdaAnn scheduler for variational inference with normalizing flows on a number of examples, including posterior estimation of parameters for dynamical systems and probability density approximation in multimodal and high-dimensional settings. (An illustrative sketch of this kind of adaptive temperature update appears after this list.)
  2. Fast inference of numerical model parameters from data is an important prerequisite for generating predictive models in a wide range of applications. Sampling-based approaches such as Markov chain Monte Carlo may become intractable when each likelihood evaluation is computationally expensive. New approaches that combine variational inference with normalizing flows are characterized by a computational cost that grows only linearly with the dimensionality of the latent variable space, and they rely on gradient-based optimization instead of sampling, providing a more efficient approach for Bayesian inference on the model parameters. Moreover, the cost of frequently evaluating an expensive likelihood can be mitigated by replacing the true model with an offline-trained surrogate, such as a neural network. However, this approach can introduce significant bias when the surrogate is insufficiently accurate around the posterior modes. To reduce the computational cost without sacrificing inferential accuracy, we propose Normalizing Flow with Adaptive Surrogate (NoFAS), an optimization strategy that alternately updates the normalizing flow parameters and the surrogate model parameters. We also propose an efficient sample weighting scheme for surrogate model training that preserves global accuracy while effectively capturing high posterior density regions. We demonstrate the inferential and computational superiority of NoFAS against various benchmarks, including cases where the underlying model lacks identifiability. The source code and numerical experiments used for this study are available at https://github.com/cedricwangyu/NoFAS. (A sketch of the alternating update schedule appears after this list.)
  3. Diastolic dysfunction is a common pathology occurring in about one third of patients affected by heart failure. This condition may not be associated with a marked decrease in cardiac output or systemic pressure and is therefore more difficult to diagnose than its systolic counterpart. Compromised relaxation or increased stiffness of the left ventricle induces an increase in the upstream pulmonary pressures, a condition classified as secondary or group II pulmonary hypertension (2018 Nice classification). This may in turn increase the right ventricular afterload, leading to right ventricular failure. Elevated pulmonary pressures are therefore an important clinical indicator of diastolic heart failure (sometimes referred to as heart failure with preserved ejection fraction, HFpEF), showing significant correlation with associated mortality. However, accurate measurements of this quantity are typically obtained through invasive catheterization and after the onset of symptoms. In this study, we use the hemodynamic consistency of a differential-algebraic circulation model to predict pulmonary pressures in adult patients from other, possibly non-invasive, clinical data. We investigate several aspects of the problem, including the ability of model outputs to represent a sufficiently wide pathologic spectrum, the identifiability of the model's parameters, and the accuracy of the predicted pulmonary pressures. We also find that a classifier using the assimilated model parameters as features is free from the problem of missing data and is able to detect pulmonary hypertension with sufficiently high accuracy (see the classifier sketch after this list). For a cohort of 82 patients suffering from various degrees of heart failure severity, we show that systolic, diastolic, and wedge pulmonary pressures can be estimated on average within 8, 6, and 6 mmHg, respectively. We also show that, in general, increased data availability leads to improved predictions.
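Illustrative sketch for item 1 above: the abstract describes choosing the inverse-temperature increment from the expected change in KL divergence between nearby tempered densities. The snippet below is a minimal reading of that idea, assuming a tempered target p_t(x) ∝ p(x)^t, for which the KL divergence between p_t and p_{t+Δt} is approximately (Δt²/2)·Var_{p_t}[log p(x)] to second order; fixing a per-step KL budget then suggests Δt ≈ sqrt(2·tol)/std. The function name, the tolerance knob, and the toy target are illustrative choices, not the AdaAnn implementation.

```python
import numpy as np

def adaptive_temperature_increment(log_p_samples, kl_tol=0.01, max_step=0.5):
    """Propose the next inverse-temperature increment.

    Assumes the tempered density p_t(x) ~ p(x)**t, for which the KL
    divergence between p_t and p_{t+dt} is roughly
    0.5 * dt**2 * Var_{p_t}[log p(x)] to second order.  Fixing a KL
    budget `kl_tol` per step then suggests dt = sqrt(2 * kl_tol) / std.
    """
    std = np.std(log_p_samples)
    if std < 1e-12:            # nearly flat log-density: take the largest allowed step
        return max_step
    return min(np.sqrt(2.0 * kl_tol) / std, max_step)

# Toy annealing loop on a 1-D bimodal target, just to exercise the scheduler.
rng = np.random.default_rng(0)
log_p = lambda x: np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)

t = 0.05
while t < 1.0:
    x = rng.normal(0.0, 3.0, size=2000)   # stand-in for samples from p_t (flow or MCMC in practice)
    dt = adaptive_temperature_increment(log_p(x))
    t = min(t + dt, 1.0)
    print(f"t = {t:.3f}  (dt = {dt:.3f})")
```

In an actual run the samples would come from the current flow or MCMC chain at temperature t, so the increment adapts as the variance of the log-density changes along the annealing path.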
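Illustrative sketch for item 2 above: the snippet below sketches only the alternating schedule the abstract describes, i.e. update the approximating distribution against a cheap surrogate posterior, then periodically refit the surrogate near the current high-posterior region using weighted samples. A Gaussian approximation updated by self-normalized importance weighting stands in for the normalizing flow, and a weighted polynomial fit stands in for the neural-network surrogate; every name and model below is an illustrative assumption, not the NoFAS code.

```python
import numpy as np

rng = np.random.default_rng(1)

def true_model(z):
    """'Expensive' forward model; here a cheap monotone stand-in."""
    return np.tanh(z) + 0.3 * z

def fit_surrogate(z_train, weights, degree=5):
    """Refit a weighted polynomial surrogate of the forward model."""
    return np.poly1d(np.polyfit(z_train, true_model(z_train), degree, w=weights))

def log_post(z, y_obs, model, sigma=0.5):
    """Unnormalized Gaussian log-posterior of z given one observation."""
    return -0.5 * ((y_obs - model(z)) / sigma) ** 2

y_obs = true_model(1.3)                 # synthetic observation, true parameter z = 1.3
mu, sig = 0.0, 2.0                      # Gaussian stand-in for the normalizing flow
z_design = np.linspace(-6.0, 6.0, 25)   # coarse initial design for the surrogate
surrogate = fit_surrogate(z_design, np.ones_like(z_design))

for it in range(20):
    # (a) Update the approximating distribution against the cheap surrogate
    #     posterior (self-normalized importance weighting stands in for the
    #     gradient-based flow update used in practice).
    z = rng.normal(mu, sig, size=1000)
    logw = log_post(z, y_obs, surrogate)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    mu = float(np.sum(w * z))
    sig = float(max(np.sqrt(np.sum(w * (z - mu) ** 2)), 1e-3))

    # (b) Every few iterations, spend a few true-model evaluations near the
    #     current approximation and refit the surrogate, upweighting points
    #     in the high-posterior region so local accuracy is preserved.
    if it % 5 == 4:
        z_design = np.concatenate([z_design, rng.normal(mu, sig, size=5)])
        w_fit = 0.1 + np.exp(log_post(z_design, y_obs, true_model))
        surrogate = fit_surrogate(z_design, w_fit)

print(f"approximate posterior: mean {mu:.2f}, sd {sig:.2f} (true parameter 1.3)")
```

The point of the alternation is that the expensive model is only evaluated on the small, growing design set, while the variational update always runs against the cheap surrogate.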
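Illustrative sketch for item 3 above: once the circulation-model parameters have been assimilated for each patient, they form a complete feature vector regardless of which raw measurements were available, and a standard classifier can be trained on them to flag pulmonary hypertension. The sketch below uses synthetic stand-in features and labels and an off-the-shelf logistic regression; the feature count, the label rule, and the model choice are assumptions for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Stand-in for assimilated circulation-model parameters: one row per patient,
# one column per estimated parameter (e.g. resistances, compliances).  In the
# study these come from fitting the model to each patient's clinical data, so
# every patient has a complete feature vector even if raw measurements are missing.
n_patients, n_params = 82, 6
theta = rng.normal(size=(n_patients, n_params))

# Synthetic labels (1 = pulmonary hypertension), generated from a noisy linear
# rule purely so the example runs end to end.
labels = (theta @ rng.normal(size=n_params) + 0.5 * rng.normal(size=n_patients) > 0).astype(int)

# Classifier on the assimilated parameters, evaluated by cross-validation.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, theta, labels, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```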