Title: An inner‐outer subcycling algorithm for parallel cardiac electrophysiology simulations
Abstract

This paper explores cardiac electrophysiology simulations based on the monodomain equations and introduces a novel subcycling time-integration algorithm that exploits the structure of the ionic model. The aim of this work is to improve the efficiency of parallel cardiac monodomain simulations by using the subcycling algorithm in the computation of the ionic model to handle local sharp changes in the solution. This reduces the turnaround time for simulating basic cardiac electrical function on both idealized and patient-specific geometries. Numerical experiments show that the proposed approach is accurate and exhibits close to linear parallel scalability on a computer with more than 1000 processor cores. Ultimately, the reduction in simulation time can be beneficial in clinical applications, where multiple simulations are often required to tune a model to match clinical measurements.
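As a simplified illustration of the inner-outer idea, the sketch below pairs one large outer step for the diffusion term with several small inner steps for the stiff ionic ODEs on a 1D cable. The FitzHugh-Nagumo kinetics, explicit finite-difference discretization, periodic boundaries, fixed subcycle count, and all parameter values are assumptions for illustration only; the paper's algorithm adapts to local sharp changes in the solution and runs in parallel on realistic geometries.

```python
# Minimal inner-outer subcycling sketch for a 1D monodomain-style problem.
# Assumptions (not from the paper): FitzHugh-Nagumo kinetics, explicit
# finite differences with periodic boundaries, and a fixed subcycle count.
import numpy as np

def fhn_rhs(v, w, a=0.1, eps=0.01, beta=0.5, gamma=1.0):
    """FitzHugh-Nagumo reaction terms (hypothetical parameter values)."""
    dv = v * (v - a) * (1.0 - v) - w      # ionic contribution to dV/dt
    dw = eps * (beta * v - gamma * w)     # recovery (gating) variable
    return dv, dw

def monodomain_step(v, w, dx, dt_outer, n_sub, diff=1e-3):
    """One outer diffusion step followed by n_sub inner ionic substeps."""
    # Outer step: explicit diffusion of the transmembrane potential.
    lap = (np.roll(v, -1) - 2.0 * v + np.roll(v, 1)) / dx**2
    v = v + dt_outer * diff * lap
    # Inner subcycle: the stiff ionic ODEs take a smaller time step,
    # resolving the sharp upstroke without shrinking the outer step.
    dt_inner = dt_outer / n_sub
    for _ in range(n_sub):
        dv, dw = fhn_rhs(v, w)
        v, w = v + dt_inner * dv, w + dt_inner * dw
    return v, w

# Drive a planar wave along a 1D cable.
nx, dx = 400, 0.01
v, w = np.zeros(nx), np.zeros(nx)
v[:20] = 1.0                              # initial stimulus at one end
for _ in range(2000):
    v, w = monodomain_step(v, w, dx, dt_outer=0.04, n_sub=8)
```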

 
PAR ID: 10401706
Author(s) / Creator(s):
Publisher / Repository: Wiley Blackwell (John Wiley & Sons)
Date Published:
Journal Name: International Journal for Numerical Methods in Biomedical Engineering
Volume: 39
Issue: 3
ISSN: 2040-7939
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Computer simulations are widely used to design and evaluate air traffic systems. A fast-time simulation capability is essential to effectively explore the consequences of decisions in airspace design, air traffic management, and operations. A parallel simulation approach is proposed to accelerate fast-time simulation of air traffic networks by exploiting both temporal and spatial parallelism. A time-parallel algorithm is first described that simulates different time intervals concurrently and uses a fix-up computation, exploiting the scheduled nature of commercial air traffic, to resolve dependencies between time segments. The time-parallel algorithm is then extended with a space-parallel approach that uses Time Warp to simulate each time segment in parallel, increasing the amount of parallelism that can be exploited. The time- and space-parallel algorithms are evaluated using a simulation of the U.S. National Airspace System (NAS). Experimental data are presented demonstrating that this approach achieves greater acceleration than exploiting time-parallel or space-parallel simulation techniques alone.
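    The sketch below illustrates only the time-parallel pattern: the horizon is split into segments simulated concurrently from guessed initial states, and a fix-up pass re-runs segments whose starting state disagrees with the predecessor's ending state. The dynamics function and scalar state are hypothetical stand-ins, not the paper's NAS model, and the space-parallel Time Warp layer is omitted.

```python
# Minimal time-parallel simulation sketch with a fix-up phase.
# Assumptions (not from the paper): a deterministic toy update rule and a
# scalar state; the paper seeds segment states from flight schedules and
# runs Time Warp within each segment.
from concurrent.futures import ProcessPoolExecutor

def advance(state, n_steps):
    """Stand-in dynamics: advance the state by n_steps ticks."""
    for _ in range(n_steps):
        state = 0.9 * state + 1.0          # hypothetical update rule
    return state

def time_parallel(initial, n_segments, steps_per_segment):
    # Initial guesses for each segment's starting state (here: all equal).
    guesses = [initial] * n_segments
    with ProcessPoolExecutor() as pool:
        while True:
            # Simulate every time segment concurrently from its guess.
            ends = list(pool.map(advance, guesses,
                                 [steps_per_segment] * n_segments))
            # Fix-up: each segment must start where its predecessor ended.
            fixed = [initial] + ends[:-1]
            if fixed == guesses:           # boundaries consistent: done
                return ends[-1]
            guesses = fixed                # otherwise re-simulate

if __name__ == "__main__":
    print(time_parallel(0.0, n_segments=8, steps_per_segment=100))
```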

  2. Patient-specific cardiac models are now being used to guide therapies. Their increased use in clinical care will give rise to virtual cohorts of cardiac models, which will allow simulations to capture and quantify inter-patient variability. However, developing such cohorts requires transforming cardiac modelling from small numbers of bespoke models into robust and rapid workflows that can create large numbers of models. In this review, we describe the state of the art in virtual cohorts of cardiac models, the process of creating them and generating the individual cohort member models, and then discuss their potential and future applications. This article is part of the theme issue ‘Uncertainty quantification in cardiac and cardiovascular modelling and simulation’.
  3. Abstract

    This study presents a particle-filter-based framework to track the cardiac surface from a time sequence of single magnetic resonance imaging (MRI) slices, with the future goal of using the framework in interventional cardiovascular magnetic resonance procedures, which rely on accurate online tracking of the cardiac surface from MRI data. The framework exploits a low-order parametric deformable model of the cardiac surface, and a stochastic dynamic system represents the surface motion. Deformable models introduce a shape prior that controls the degree of deformation, adaptive filters model complex cardiac motion in the dynamic model of the system, and particle filters recursively estimate the current state of the system over time. The proposed method is applied to recover biventricular deformations and is validated with a numerical phantom and multiple real cardiac MRI datasets. The algorithm is evaluated in multiple experiments using fixed and varying image slice planes at each time step. For the real cardiac MRI datasets, average root-mean-square tracking errors of 2.61 mm and 3.42 mm are reported for the fixed and varying image slice planes, respectively, and the method tracked points of interest on different sections of the cardiac surface to within 3 pixels. This work serves as a proof of concept for modeling and tracking cardiac surface deformations via a low-order probabilistic model, with the future goal of applying the method to targeted interventional cardiac procedures under MR image guidance; the analyses show that deformable cardiac surface tracking can pave the way for precise targeted intracardiac ablation procedures under MRI guidance. The main contributions of this work are twofold: first, a framework for tracking the whole cardiac surface from a time sequence of single image slices; second, the use of adaptive filters to incorporate motion information and provide temporal coherence when tracking nonrigid cardiac surface motion.
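    As a minimal illustration of the estimation loop at the heart of such a framework, the sketch below implements a generic bootstrap particle filter on a scalar random-walk state with Gaussian measurement noise. The state, dynamics, and likelihood are hypothetical stand-ins for the paper's parametric surface model and MRI-derived measurements.

```python
# Generic bootstrap particle filter sketch: predict, weight, resample.
# Assumptions (not from the paper): scalar random-walk dynamics and a
# Gaussian measurement model with hand-picked noise levels.
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=500,
                    proc_std=0.1, meas_std=0.5):
    particles = rng.normal(0.0, 1.0, n_particles)   # initial belief
    estimates = []
    for z in observations:
        # Predict: propagate each particle through the dynamic model.
        particles = particles + rng.normal(0.0, proc_std, n_particles)
        # Update: weight particles by the measurement likelihood.
        weights = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
        weights /= weights.sum()
        # Estimate: the weighted mean is the current state estimate.
        estimates.append(np.sum(weights * particles))
        # Resample: draw particles in proportion to their weights.
        idx = rng.choice(n_particles, n_particles, p=weights)
        particles = particles[idx]
    return estimates

# Track a slowly drifting signal from noisy measurements.
truth = np.cumsum(rng.normal(0.0, 0.1, 200))
obs = truth + rng.normal(0.0, 0.5, 200)
est = particle_filter(obs)
```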

     
  4. Summary

    We have entered the era of big data. In high performance computing, large-scale simulations can generate huge amounts of data with potentially critical information. However, these data are usually saved in intermediate files and are not visible until advanced data analytics techniques are applied after reading all simulation data from persistent storage (e.g., local disks or a parallel file system). This approach forces users to wait a long time for a running simulation without knowing the status of the job. In this paper, we build a new computational framework to couple scientific simulations with multi-step machine learning processes and in-situ data visualization. We also design a new scalable simulation-time clustering algorithm to automatically detect fluid flow anomalies. This framework is built from separate software components and provides plug-in data analysis and visualization functions over complex scientific workflows. With this advanced framework, users can monitor ongoing extreme-scale turbulent flow simulations and receive real-time notifications of special patterns or anomalies.
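    The sketch below gives the flavor of simulation-time anomaly detection via online clustering: per-step feature vectors are assigned to the nearest centroid of a sequentially updated k-means model, and points far from every centroid are flagged. The feature source, cluster count, and distance threshold are hypothetical; the paper's clustering algorithm is a scalable in-situ design coupled to the running solver.

```python
# Minimal online-clustering anomaly-flagging sketch. Assumptions (not
# from the paper): sequential k-means over per-step feature vectors and
# a fixed distance threshold for flagging outliers.
import numpy as np

class OnlineKMeans:
    def __init__(self, k, dim, threshold, seed=0):
        rng = np.random.default_rng(seed)
        self.centroids = rng.normal(size=(k, dim))  # random initial centers
        self.counts = np.ones(k)
        self.threshold = threshold

    def observe(self, x):
        """Assign x to its nearest centroid; return True if anomalous."""
        d = np.linalg.norm(self.centroids - x, axis=1)
        j = int(np.argmin(d))
        # Sequential k-means update: pull the winning centroid toward x.
        self.counts[j] += 1.0
        self.centroids[j] += (x - self.centroids[j]) / self.counts[j]
        return d[j] > self.threshold          # far from every cluster

# Feed feature vectors extracted from each simulation step as they arrive.
detector = OnlineKMeans(k=4, dim=3, threshold=2.5)
rng = np.random.default_rng(42)
for step in range(1000):
    feature = rng.normal(size=3)              # stand-in for real features
    if detector.observe(feature):
        print(f"step {step}: possible flow anomaly")
```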

     
  5. Abstract

    Recent advances in random walk particle tracking have enabled direct simulation of mixing and reactions by allowing the particles to interact with each other using a multipoint mass transfer scheme. The mass transfer scheme allows separation of mixing and spreading processes, among other advantages, but it is computationally expensive because its speed depends on the number of interacting particle pairs. This note explores methods for relieving the computational bottleneck caused by the mass transfer step, and we use these algorithms to develop a new parallel, interacting particle model. The new model is a combination of a sparse search algorithm and a novel domain decomposition scheme, both of which offer significant speedup relative to the reference case, even when they are executed serially. We combine the strengths of these methods to create a parallel particle scheme that is highly accurate and efficient with run times that scale as 1/P for a fixed number of particles, where P is the number of computational cores (equivalently, subdomains, in this work) being used. The new parallel model is a significant advance because it enables efficient simulation of large particle ensembles that are needed for environmental simulations and also because it can naturally pair with parallel geochemical solvers to create a practical Lagrangian tool for simulating mixing and reactions in complex chemical systems.
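    A minimal sketch of the sparse-search idea follows: a k-d tree finds only the particle pairs within a kernel cutoff, and a symmetric Gaussian-weighted exchange moves paired masses toward their mean. The kernel form, normalization, and parameter values are simplified stand-ins, and the note's domain-decomposition and parallel layers are omitted.

```python
# Sparse-search mass-transfer sketch. Assumptions (not from the note):
# a truncated Gaussian kernel, scipy's cKDTree for the pair search, and
# a simplified per-particle weight normalization.
import numpy as np
from scipy.spatial import cKDTree

def mass_transfer_step(pos, mass, kappa, dt, cutoff):
    """One sparse-search, kernel-weighted mass-transfer update."""
    tree = cKDTree(pos)
    # Sparse search: only pairs closer than the cutoff interact.
    pairs = tree.query_pairs(cutoff, output_type="ndarray")
    i, j = pairs[:, 0], pairs[:, 1]
    r2 = np.sum((pos[i] - pos[j]) ** 2, axis=1)
    w = np.exp(-r2 / (4.0 * kappa * dt))       # Gaussian kernel weight
    # Normalize so each particle's total exchanged weight stays bounded.
    wsum = np.ones(len(mass))                   # include a self-weight of 1
    np.add.at(wsum, i, w)
    np.add.at(wsum, j, w)
    w_norm = w / np.maximum(wsum[i], wsum[j])
    # Symmetric exchange drives paired masses toward their mean while
    # conserving total mass exactly.
    dm = 0.5 * w_norm * (mass[j] - mass[i])
    new_mass = mass.copy()
    np.add.at(new_mass, i, dm)
    np.subtract.at(new_mass, j, dm)
    return new_mass

rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 1.0, size=(5000, 2))
mass = np.where(pos[:, 0] < 0.5, 1.0, 0.0)     # step initial condition
for _ in range(50):
    mass = mass_transfer_step(pos, mass, kappa=1e-4, dt=1.0, cutoff=0.05)
```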
