

Title: Revealing Continuous Brain Dynamical Organization with Multimodal Graph Transformer
Large-scale brain dynamics are constrained by the heterogeneity of the intrinsic anatomical substrate, yet little is known about how spatiotemporal dynamics adapt to heterogeneous structural connectivity (SC). Modern neuroimaging modalities make it possible to study intrinsic brain activity at the scale of seconds to minutes. Diffusion magnetic resonance imaging (dMRI) and functional MRI reveal the large-scale SC across different brain regions. Electrophysiological methods (i.e., MEG/EEG) provide direct measures of neural activity and exhibit complex neurobiological temporal dynamics that cannot be resolved by fMRI. However, most existing multimodal analytical methods collapse brain measurements in either the spatial or the temporal domain and fail to capture spatio-temporal circuit dynamics. In this paper, we propose a novel spatio-temporal graph Transformer model that integrates structural and functional connectivity in both the spatial and temporal domains. The proposed method learns heterogeneous node and graph representations via contrastive learning and a multi-head attention based graph Transformer using multimodal brain data (i.e., fMRI, MRI, MEG, and behavioral performance). The proposed contrastive graph Transformer representation model incorporates a heterogeneity map constrained by the T1-weighted/T2-weighted (T1w/T2w) ratio to improve the model fit to structure-function interactions. Experimental results with multimodal resting-state brain measurements demonstrate that the proposed method can highlight the local properties of large-scale brain spatio-temporal dynamics and capture the dependence strength between functional connectivity and behavior. In summary, the proposed method enables explanation of complex brain dynamics across different modalities.
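The abstract describes a graph Transformer whose multi-head attention is shaped by structural connectivity and trained with a contrastive objective across modalities. Below is a minimal, hypothetical PyTorch sketch of those two ingredients; the shapes, the masking rule, and names such as `sc_adj` are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch of two ingredients named above: an attention layer
# masked by structural connectivity (SC) and an InfoNCE-style contrastive loss
# between two views of the same brain regions. Shapes, the masking rule, and all
# names (e.g., sc_adj) are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCMaskedAttention(nn.Module):
    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x, sc_adj):
        # x: (batch, regions, dim) node features; sc_adj: (regions, regions) SC weights.
        # Attention is only allowed along existing SC edges (assumes self-loops are
        # present so no row is fully masked).
        mask = sc_adj <= 0
        out, _ = self.attn(x, x, x, attn_mask=mask)
        return self.norm(x + out)  # residual connection + layer norm

def info_nce(z1, z2, temperature=0.1):
    # z1, z2: (regions, dim) embeddings of the same regions under two views
    # (e.g., structural vs. functional); matching region indices are positives.
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)
```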
Award ID(s):
2045848 1837956
NSF-PAR ID:
10359049
Author(s) / Creator(s):
; ; ;
Date Published:
Journal Name:
Medical Image Computing and Computer Assisted Intervention – MICCAI 2022. Lecture Notes in Computer Science.
Volume:
13431
Page Range / eLocation ID:
346–355
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Understanding the intrinsic patterns of the human brain is important for making inferences about the mind and brain-behavior associations. Electrophysiological methods (i.e., MEG/EEG) provide direct measures of neural activity without the effect of vascular confounds. The blood-oxygen-level-dependent (BOLD) signal of functional MRI (fMRI) reveals spatial and temporal brain activity across different brain regions. However, it is unclear how to associate high-temporal-resolution electrophysiological measures with high-spatial-resolution fMRI signals. Here, we present a novel interpretable model for coupling the structural and functional activity of the brain based on heterogeneous contrastive graph representation. The proposed method is able to link manifest variables of the brain (i.e., MEG, MRI, fMRI, and behavioral performance) and quantify the intrinsic coupling strength of the different modal signals. The proposed method learns heterogeneous node and graph representations by contrasting the structural and temporal views of multimodal brain data. The first experiment, with 1200 subjects from the Human Connectome Project (HCP), shows that the proposed method outperforms existing approaches in predicting individual gender and in localizing the brain regions most important for sex differences. The second experiment associates the structural and temporal views between low-level sensory regions and high-level cognitive regions. The experimental results demonstrate that the dependence between structural and temporal views varies spatially across different modalities. The proposed method enables the explanation of heterogeneous biomarkers for different brain measurements.
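As a concrete illustration of the "structural view" and "temporal view" being contrasted, the sketch below builds two graph adjacencies from hypothetical parcellated inputs (an SC matrix and a region time-series matrix); the thresholding scheme and the 20% edge density are assumptions for illustration, not the method described in the abstract.

```python
# Hypothetical construction of the two views being contrasted: a "structural view"
# from an SC matrix and a "temporal view" from region time series. The thresholding
# scheme and the 20% edge density are assumptions for illustration only.
import numpy as np

def structural_view(sc_matrix, density=0.2):
    # Keep the strongest `density` fraction of SC edges as a binary adjacency.
    thresh = np.quantile(sc_matrix[sc_matrix > 0], 1 - density)
    return (sc_matrix >= thresh).astype(float)

def temporal_view(timeseries, density=0.2):
    # timeseries: (timepoints, regions) fMRI/MEG signals. Functional connectivity is
    # the region-by-region correlation, thresholded to the same edge density.
    fc = np.corrcoef(timeseries.T)
    np.fill_diagonal(fc, 0.0)
    thresh = np.quantile(np.abs(fc), 1 - density)
    return (np.abs(fc) >= thresh).astype(float)
```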
  2. Multimodal evidence suggests that brain regions accumulate information over timescales that vary according to anatomical hierarchy. Thus, these experimentally defined “temporal receptive windows” are longest in cortical regions that are distant from sensory input. Interestingly, spontaneous activity in these regions also plays out over relatively slow timescales (i.e., exhibits slower temporal autocorrelation decay). These findings raise the possibility that hierarchical timescales represent an intrinsic organizing principle of brain function. Here, using resting-state functional MRI, we show that the timescale of ongoing dynamics follows hierarchical spatial gradients throughout human cerebral cortex. These intrinsic timescale gradients give rise to systematic frequency differences among large-scale cortical networks and predict individual-specific features of functional connectivity. Whole-brain coverage permitted us to further investigate the large-scale organization of subcortical dynamics. We show that cortical timescale gradients are topographically mirrored in striatum, thalamus, and cerebellum. Finally, timescales in the hippocampus followed a posterior-to-anterior gradient, corresponding to the longitudinal axis of increasing representational scale. Thus, hierarchical dynamics emerge as a global organizing principle of mammalian brains.
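For a rough sense of how an intrinsic timescale of the kind described above can be quantified, the sketch below estimates a region's timescale as the lag at which its temporal autocorrelation first drops below 1/e; the sampling interval and lag range are placeholder values, and the paper's actual estimation procedure may differ.

```python
# Illustrative estimator, not the paper's pipeline: a region's intrinsic timescale
# taken as the lag at which its autocorrelation first falls below 1/e. The sampling
# interval (tr) and maximum lag are placeholder values.
import numpy as np

def intrinsic_timescale(signal, tr=0.72, max_lag=50):
    # signal: 1D resting-state time series for one region; tr: sampling interval (s).
    x = signal - signal.mean()
    acf = np.array([np.corrcoef(x[:-k], x[k:])[0, 1] for k in range(1, max_lag + 1)])
    below = np.where(acf < 1.0 / np.e)[0]
    return (below[0] + 1) * tr if below.size else max_lag * tr
```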

     
  3. Abstract

    There is growing evidence that, rather than using a single brain imaging modality to study its association with physiological or symptomatic features, the field is paying more attention to the fusion of multimodal information. However, most current multimodal fusion approaches that incorporate functional magnetic resonance imaging (fMRI) are restricted to second-level 3D features, rather than the original 4D fMRI data. The trade-off is that valuable temporal information is not utilized during the fusion step. Here we are motivated to propose a novel approach called "parallel group ICA+ICA" that incorporates temporal fMRI information from group independent component analysis (GICA) into a parallel independent component analysis (ICA) framework, aiming to enable direct fusion of first-level fMRI features with other modalities (e.g., structural MRI), which can thus detect linked functional network variability and structural covariations. Simulation results show that the proposed method yields accurate intermodality linkage detection regardless of whether the linkage is strong or weak. When applied to real data, we identified one pair of significantly associated fMRI-sMRI components that show group differences between schizophrenia patients and controls in both modalities, and this linkage can be replicated in an independent cohort. Finally, multiple cognitive domain scores can be predicted by the features identified in the linked component pair by our proposed method. We also show that these multimodal brain features can predict multiple cognitive scores in an independent cohort. Overall, the results demonstrate the ability of parallel GICA+ICA to estimate joint information from 4D and 3D data without discarding much of the available information up front, and the potential of using this approach to identify imaging biomarkers for studying brain disorders.
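The parallel group ICA+ICA pipeline itself jointly optimizes component independence and intermodality correlation; as a simplified stand-in, the sketch below decomposes each modality separately with FastICA and then correlates subject loadings to flag candidate linked component pairs. The variable names and the post-hoc correlation step are assumptions, not the published algorithm.

```python
# Simplified stand-in, not the published parallel GICA+ICA algorithm: decompose each
# modality separately with FastICA, then correlate subject loadings across modalities
# to flag candidate linked component pairs. All names and sizes are illustrative.
import numpy as np
from sklearn.decomposition import FastICA

def candidate_linked_components(fmri_feats, smri_feats, n_comp=10):
    # fmri_feats, smri_feats: (subjects, features) matrices, one per modality.
    load_f = FastICA(n_components=n_comp, random_state=0).fit_transform(fmri_feats)
    load_s = FastICA(n_components=n_comp, random_state=0).fit_transform(smri_feats)
    # Cross-modality correlation of subject loadings; a strong |r| entry suggests a
    # linked fMRI-sMRI component pair worth testing formally.
    link = np.corrcoef(load_f.T, load_s.T)[:n_comp, n_comp:]
    return link
```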

     
  4. Abstract

    In this article, we focus on estimating the joint relationship between structural magnetic resonance imaging (sMRI) gray matter (GM) and multiple functional MRI (fMRI) intrinsic connectivity networks (ICNs). To achieve this, we propose a multilink joint independent component analysis (ml-jICA) method using the same core algorithm as jICA. To relax the jICA assumption, we propose another extension, called parallel multilink jICA (pml-jICA), that allows for a more balanced weight distribution than ml-jICA/jICA. We assume a shared mixing matrix for both the sMRI and fMRI modalities, while allowing for different mixing matrices linking the sMRI data to the different ICNs. We introduce the model and then apply this approach to study differences in resting-state fMRI and sMRI data from patients with Alzheimer's disease (AD) versus controls. The results of the pml-jICA yield significant differences with large effect sizes that include regions in overlapping portions of the default mode network, as well as the hippocampus and thalamus. Importantly, we identify two joint components with partially overlapping regions that show opposite effects for AD versus controls but could be separated because they were linked to distinct functional and structural patterns. This highlights the unique strength of our approach, and of multimodal fusion approaches generally, in revealing potential biomarkers of brain disorders that would likely be missed by a unimodal approach. These results represent the first work linking multiple fMRI ICNs to GM components within a multimodal data fusion model, and they challenge the typical view that brain structure is more sensitive to AD than fMRI.
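For orientation, the sketch below shows plain joint ICA on features concatenated across gray matter and several ICNs, which yields the shared subject-loading (mixing) matrix that ml-jICA and pml-jICA build on; the multilink and parallel extensions described in the abstract are not implemented here, and all names are illustrative.

```python
# Rough illustration of the joint-ICA idea that ml-jICA builds on: concatenate gray
# matter and ICN features along the feature axis and estimate one shared subject
# loading (mixing) matrix. The multilink/parallel extensions are not implemented here.
import numpy as np
from sklearn.decomposition import FastICA

def joint_ica(gm_feats, icn_feats_list, n_comp=8):
    # gm_feats: (subjects, voxels) gray-matter features;
    # icn_feats_list: list of (subjects, features) matrices, one per ICN.
    joint = np.hstack([gm_feats] + list(icn_feats_list))  # (subjects, total features)
    ica = FastICA(n_components=n_comp, random_state=0)
    # Treat features as samples so the independent sources are the joint spatial maps
    # and the mixing matrix is shared across modalities over subjects.
    maps = ica.fit_transform(joint.T).T  # (n_comp, total features): joint maps
    loadings = ica.mixing_               # (subjects, n_comp): shared mixing matrix
    return loadings, maps
```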

     
  5. The link between mind, brain, and behavior has mystified philosophers and scientists for millennia. Recent progress has been made by forming statistical associations between manifest variables of the brain (e.g., electroencephalogram [EEG], functional MRI [fMRI]) and manifest variables of behavior (e.g., response times, accuracy) through hierarchical latent variable models. Within this framework, one can make inferences about the mind in a statistically principled way, such that complex patterns of brain–behavior associations drive the inference procedure. However, previous approaches were limited in the flexibility of the linking function, which has proved prohibitive for understanding the complex dynamics exhibited by the brain. In this article, we propose a data-driven, nonparametric approach that allows complex linking functions to emerge from fitting a hierarchical latent representation of the mind to multivariate, multimodal data. Furthermore, to enforce biological plausibility, we impose both spatial and temporal structure so that the types of realizable system dynamics are constrained. To illustrate the benefits of our approach, we investigate the model’s performance in a simulation study and apply it to experimental data. In the simulation study, we verify that the model can be accurately fitted to simulated data, and latent dynamics can be well recovered. In an experimental application, we simultaneously fit the model to fMRI and behavioral data from a continuous motion tracking task. We show that the model accurately recovers both neural and behavioral data and reveals interesting latent cognitive dynamics, the topology of which can be contrasted with several aspects of the experiment.
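To make the generative structure concrete, here is a toy simulation (not the authors' model) in which a single smooth latent state drives both multivariate neural observations and a behavioral time series; all parameter values are invented for illustration.

```python
# Toy generative sketch, not the authors' model: one smooth (AR(1)) latent cognitive
# state drives both multivariate neural observations and a behavioral time series.
# Every parameter value here is invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
T, n_regions = 200, 10

phi = 0.9                                   # AR(1) coefficient: temporal smoothness
x = np.zeros(T)                             # latent cognitive state
for t in range(1, T):
    x[t] = phi * x[t - 1] + rng.normal(scale=0.3)

w_neural = rng.normal(size=n_regions)       # spatial loadings of the latent state
neural = np.outer(x, w_neural) + rng.normal(scale=0.5, size=(T, n_regions))
behavior = 1.5 * x + rng.normal(scale=0.4, size=T)   # e.g., continuous tracking error
```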

     