

Title: Adaptive smoothing based on Gaussian processes regression increases the sensitivity and specificity of fMRI data
Abstract

Temporal and spatial filtering of fMRI data is often used to improve statistical power. However, conventional methods, such as smoothing with fixed-width Gaussian filters, remove fine-scale structure in the data, necessitating a tradeoff between sensitivity and specificity. Specifically, smoothing may increase sensitivity (reduce noise and increase statistical power) but at the cost of specificity, in that fine-scale structure in neural activity patterns is lost. Here, we propose an alternative smoothing method based on Gaussian processes (GP) regression for single-subject fMRI experiments. This method adapts the level of smoothing on a voxel-by-voxel basis according to the characteristics of the local neural activity patterns. GP-based fMRI analysis has heretofore been impractical owing to its computational demands. Here, we demonstrate a new implementation of GP that makes it possible to handle the massive data dimensionality of the typical fMRI experiment. We demonstrate how GP can be used as a drop-in replacement for the conventional temporal and spatial smoothing steps in a standard fMRI preprocessing pipeline. We present simulated and experimental results that show increased sensitivity and specificity compared with conventional smoothing strategies. Hum Brain Mapp 38:1438–1459, 2017. © 2016 Wiley Periodicals, Inc.
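As a rough illustration of the idea only (not the authors' implementation, which relies on a custom, computationally efficient GP solver), the sketch below smooths a single voxel's time series with off-the-shelf GP regression from scikit-learn. The function name gp_smooth_voxel and all hyperparameter values are hypothetical; the point is that the length-scale and noise level are learned per voxel, so the effective amount of smoothing adapts to the local signal.

```python
# Conceptual sketch of per-voxel adaptive temporal smoothing via GP regression.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

def gp_smooth_voxel(y, tr=2.0):
    """Return the GP posterior mean as the adaptively smoothed series for one voxel."""
    t = (np.arange(len(y)) * tr).reshape(-1, 1)            # acquisition times in seconds
    kernel = (ConstantKernel(1.0) * RBF(length_scale=5.0)  # smooth signal component
              + WhiteKernel(noise_level=1.0))              # i.i.d. scanner noise
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(t, y)               # length-scale and noise level are learned for this voxel
    return gp.predict(t)       # posterior mean = smoothed time series

# Toy usage on a noisy synthetic BOLD-like signal
rng = np.random.default_rng(0)
t = np.arange(200) * 2.0
y = np.sin(2 * np.pi * t / 60.0) + 0.5 * rng.standard_normal(t.size)
y_smooth = gp_smooth_voxel(y)
```

Because the kernel hyperparameters are refit for each voxel by marginal-likelihood maximization, strongly autocorrelated signals are smoothed less aggressively than noise-dominated ones, which is the kind of adaptivity the abstract describes.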

 
NSF-PAR ID:
10037072
Author(s) / Creator(s):
Publisher / Repository:
Wiley Blackwell (John Wiley & Sons)
Date Published:
Journal Name:
Human Brain Mapping
Volume:
38
Issue:
3
ISSN:
1065-9471
Format(s):
Medium: X
Size(s):
p. 1438-1459
Sponsoring Org:
National Science Foundation
More Like this
  1. Latent manifolds provide a compact characterization of neural population activity and of shared co-variability across brain areas. Nonetheless, existing statistical tools for extracting neural manifolds face limitations in the interpretability of latents with respect to task variables and can be hard to apply to datasets with no trial repeats. Here we propose a novel probabilistic framework that allows for interpretable partitioning of population variability within and across areas in the context of naturalistic behavior. Our approach for task-aligned manifold estimation (TAME-GP) extends a probabilistic variant of demixed PCA by (1) explicitly partitioning variability into private and shared sources, (2) using a Poisson noise model, and (3) introducing temporal smoothing of latent trajectories in the form of a Gaussian Process prior. This TAME-GP graphical model allows for robust estimation of task-relevant variability in local population responses and of shared co-variability between brain areas. We demonstrate the efficiency of our estimator on within-model and biologically motivated simulated data. We also apply it to neural recordings in a closed-loop virtual navigation task in monkeys, demonstrating the capacity of TAME-GP to capture meaningful intra- and inter-area neural variability with single-trial resolution.
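    To make the third ingredient concrete, here is a minimal, hypothetical sketch (not the TAME-GP code) of a Gaussian Process prior that temporally smooths latent trajectories: latents are drawn from a multivariate normal whose covariance is a squared-exponential kernel over within-trial time; the function name and all parameter values are illustrative.

```python
# Illustrative GP temporal prior on latent trajectories (not the TAME-GP implementation).
import numpy as np

def gp_prior_samples(n_time=300, dt=0.006, length_scale=0.1, variance=1.0,
                     n_latents=3, seed=0):
    """Draw smooth latent trajectories x_k(t) ~ GP(0, K) with an RBF covariance over time."""
    t = np.arange(n_time) * dt                               # time bins within a trial
    d = t[:, None] - t[None, :]
    K = variance * np.exp(-0.5 * (d / length_scale) ** 2)    # squared-exponential kernel
    K += 1e-6 * np.eye(n_time)                               # jitter for numerical stability
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(np.zeros(n_time), K, size=n_latents)

latents = gp_prior_samples()   # shape (3, 300): temporally smooth single-trial latents
```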
  2. Understanding the intrinsic patterns of the human brain is important for making inferences about the mind and brain–behavior associations. Electrophysiological methods (i.e., MEG/EEG) provide direct measures of neural activity without the effect of vascular confounds. The blood oxygenation level-dependent (BOLD) signal of functional MRI (fMRI) reveals spatial and temporal brain activity across different brain regions. However, it is unclear how to associate the high-temporal-resolution electrophysiological measures with the high-spatial-resolution fMRI signals. Here, we present a novel interpretable model for coupling the structural and functional activity of the brain based on heterogeneous contrastive graph representation. The proposed method is able to link manifest variables of the brain (i.e., MEG, MRI, fMRI, and behavioral performance) and quantify the intrinsic coupling strength of the different modal signals. The proposed method learns heterogeneous node and graph representations by contrasting the structural and temporal views of multimodal brain data. The first experiment, with 1200 subjects from the Human Connectome Project (HCP), shows that the proposed method outperforms existing approaches in predicting individual gender and localizing the brain regions most important for sex differences. The second experiment associates the structural and temporal views between low-level sensory regions and high-level cognitive ones. The experimental results demonstrate that the dependence between structural and temporal views varies spatially across modalities. The proposed method enables heterogeneous biomarker explanations for different brain measurements.
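    As a generic illustration of the contrastive ingredient only (the paper's heterogeneous graph model is not reproduced here), the snippet below computes an InfoNCE-style loss that pulls together structural-view and temporal-view embeddings of the same subject and pushes apart embeddings of different subjects; the function info_nce and the array shapes are assumptions for illustration.

```python
# Generic two-view contrastive (InfoNCE-style) objective; not the paper's model.
import numpy as np

def info_nce(z_struct, z_temp, temperature=0.1):
    """z_struct, z_temp: (n_subjects, d) view embeddings; matched rows are positive pairs."""
    z_s = z_struct / np.linalg.norm(z_struct, axis=1, keepdims=True)
    z_t = z_temp / np.linalg.norm(z_temp, axis=1, keepdims=True)
    logits = z_s @ z_t.T / temperature                   # cosine similarity of every (i, j) pair
    logits -= logits.max(axis=1, keepdims=True)          # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))                   # matched view pairs sit on the diagonal

rng = np.random.default_rng(1)
loss = info_nce(rng.standard_normal((8, 16)), rng.standard_normal((8, 16)))
```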
  3. Background: Researchers have recently started to validate decades-old program-comprehension models using functional magnetic resonance imaging (fMRI). While fMRI helps us to understand neural correlates of cognitive processes during program comprehension, its comparatively low temporal resolution (i.e., seconds) cannot capture fast cognitive subprocesses (i.e., milliseconds). Aims: To increase the explanatory power of fMRI measurement of programmers, in this methodological paper we explore the feasibility of adding simultaneous eye tracking to fMRI measurement. By developing a method to observe programmers with two complementary measures, we aim to obtain a more comprehensive understanding of program comprehension. Method: We conducted a controlled fMRI experiment with 22 student participants and simultaneous eye tracking. Results: We were able to successfully capture fMRI and eye-tracking data, although with limitations regarding partial data loss and spatial imprecision. The biggest issue we experienced was the partial loss of data: for only 10 participants could we collect a complete set of high-precision eye-tracking data. Since some participants of fMRI studies show excessive head motion, the proportion of full and high-quality data on fMRI and eye tracking is rather low. Still, the remaining data allowed us to confirm our prior hypothesis of semantic recall during program comprehension, which was not possible with fMRI alone. Conclusions: Simultaneous measurement of program comprehension with fMRI and eye tracking is promising, but has limitations. By adding simultaneous eye tracking to our fMRI study framework, we can conduct more fine-grained fMRI analyses, which in turn helps us understand programmer behavior better.
  4.
    Abstract. Satellite remote sensing provides a global view of processes on Earth, with unique benefits compared to making measurements on the ground, such as global coverage and enormous data volumes. The typical downsides are spatial and temporal gaps and potentially low data quality. Meaningful statistical inference from such data requires overcoming these problems and developing efficient and robust computational tools. We design and implement a computationally efficient multi-scale Gaussian process (GP) software package, satGP, geared towards remote sensing applications. The software is able to handle problems of enormous size and to compute marginals and sample from the random field conditioned on at least hundreds of millions of observations. This is achieved by optimizing the computation through, e.g., randomization and by splitting the problem into parallel local subproblems that aggressively discard uninformative data. We describe the mean function of the Gaussian process by approximating marginals of a Markov random field (MRF). Variability around the mean is modeled with a multi-scale covariance kernel, which consists of Matérn, exponential, and periodic components. We also demonstrate how winds can be used to inform covariances locally. The covariance kernel parameters are learned by calculating an approximate marginal maximum likelihood estimate, and the validity of both the multi-scale approach and the method used to learn the kernel parameters is verified in synthetic experiments. We apply these techniques to a moderate-size ozone data set produced by an atmospheric chemistry model and to the very large number of observations retrieved from the Orbiting Carbon Observatory 2 (OCO-2) satellite. The satGP software is released under an open-source license.
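    For intuition only, here is a small sketch (not the satGP package, which is built for hundreds of millions of observations) of the kind of multi-scale covariance the abstract describes: a sum of Matérn, exponential (Matérn with nu = 1/2), and periodic components, with hyperparameters learned by marginal-likelihood maximization via scikit-learn on a tiny synthetic series; all names and values are illustrative.

```python
# Multi-scale covariance sketch: Matérn + exponential + periodic + noise (not satGP).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, ExpSineSquared, WhiteKernel, ConstantKernel

kernel = (ConstantKernel(1.0) * Matern(length_scale=200.0, nu=1.5)                  # smooth large-scale term
          + ConstantKernel(0.5) * Matern(length_scale=10.0, nu=0.5)                 # exponential short-range term
          + ConstantKernel(0.5) * ExpSineSquared(length_scale=1.0, periodicity=365.0)  # seasonal term
          + WhiteKernel(noise_level=0.1))                                           # observation noise

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1000.0, 300).reshape(-1, 1)             # e.g. time in days at one location
y = np.sin(2 * np.pi * x.ravel() / 365.0) + 0.1 * rng.standard_normal(x.shape[0])

gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(x, y)                        # hyperparameters by marginal-likelihood maximization
mean, std = gp.predict(x, return_std=True)
```

    An exact GP like this scales cubically with the number of observations, which is precisely why satGP resorts to parallel local subproblems and randomization.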
  5. Abstract

    A critical consideration when using molecular ecological methods to detect trends and parameterize models at very fine spatial and temporal scales has always been the technical limits of resolution. Key landscape features, including most anthropogenic modifications, can cause biologically important, but very recent changes in gene flow that require substantial statistical power to detect. The problem is one of temporal scale: Human change is rapid and recent, while genetic changes accumulate slowly. We generated SNPs from thousands of nuclear loci to characterize the population structure of New York‐endangered eastern tiger salamanders (Ambystoma tigrinum) on Long Island and quantify the impacts of roads on population fragmentation. In stark contrast to a recent microsatellite study, we uncovered highly structured populations over an extremely small spatial scale (approximately 40 km²) in an increasingly human‐modified landscape. Geographic distance and the presence of roads between ponds were both strong predictors of genetic divergence, suggesting that both natural and anthropogenic factors contribute to the observed patterns of genetic variation. All ponds supported small to modest effective breeding populations, and pond surface area showed a strong positive correlation with population size. None of these patterns emerged in an earlier study of the same system using microsatellite loci, and we determined that at least 300–400 SNPs were needed to recover the fine‐scale population structure present in this system. Conservation assessments using earlier genetic techniques in other species may similarly lack the statistical power for small‐scale inferences and benefit from reassessments using genomic tools.

     