

Title: Supramodal Mechanisms of the Cognitive Control Network in Uncertainty Processing
Abstract
Information processing under conditions of uncertainty requires the involvement of cognitive control. Despite behavioral evidence of the supramodal function (i.e., independent of sensory modality) of cognitive control, the underlying neural mechanism has yet to be directly tested. This study used functional magnetic resonance imaging together with visual and auditory perceptual decision-making tasks to examine brain activation as a function of uncertainty in the two stimulus modalities. The results revealed a monotonic increase in activation in the cortical regions of the cognitive control network (CCN) as a function of uncertainty in the visual and auditory modalities. The intrinsic connectivity between the CCN and sensory regions was similar for the visual and auditory modalities. Furthermore, multivariate patterns of activation in the CCN predicted the level of uncertainty within and across stimulus modalities. These findings suggest that the CCN implements cognitive control by processing uncertainty as abstract information, independent of stimulus modality.
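As an illustration of the cross-modal decoding logic described above, here is a minimal Python sketch of training a classifier on activation patterns from one modality and testing it on the other. The arrays, shapes, and classifier choice are illustrative assumptions, not the authors' pipeline.

```python
# Minimal sketch of cross-modal decoding of uncertainty level, assuming
# per-trial activation patterns have already been extracted from CCN ROIs.
# Array names and shapes are illustrative, not the authors' pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 300          # trials per modality, CCN voxels
X_visual = rng.standard_normal((n_trials, n_voxels))    # placeholder patterns
X_auditory = rng.standard_normal((n_trials, n_voxels))
y = rng.integers(0, 3, n_trials)       # uncertainty level: low/medium/high

clf = make_pipeline(StandardScaler(), LinearSVC(dual=False))

# Within-modality decoding would cross-validate inside one modality;
# cross-modal generalization trains on one modality and tests on the other.
clf.fit(X_visual, y)
cross_modal_acc = clf.score(X_auditory, y)
print(f"visual -> auditory decoding accuracy: {cross_modal_acc:.2f}")
```

With real data, above-chance cross-modal accuracy is what would indicate a modality-independent uncertainty representation.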
Award ID(s):
1855759
NSF-PAR ID:
10252933
Journal Name:
Cerebral Cortex
Volume:
30
Issue:
12
ISSN:
1047-3211
Page Range / eLocation ID:
6336 to 6349
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

Rhythm perception depends on the ability to predict the onset of rhythmic events. Previous studies indicate that beta band modulation is involved in predicting the onset of auditory rhythmic events (Fujioka et al., 2009, 2012; Snyder & Large, 2005). We sought to determine whether similar processes are recruited for the prediction of visual rhythms by investigating whether beta band activity contributes to rhythm perception in a modality‐dependent manner. We examined electroencephalography time–frequency neural correlates of prediction using an omission paradigm with auditory and visual rhythms. By using omissions, we can separate predictive timing activity from stimulus‐driven activity. We hypothesized that there would be modality‐independent markers of rhythm prediction in induced beta band oscillatory activity, and our results support this hypothesis. We find induced and evoked predictive timing in both auditory and visual modalities. Additionally, we performed an exploratory independent-components-based spatial clustering analysis and describe all resulting clusters. This analysis reveals that there may be overlapping networks of predictive beta activity, based on common activation in the parietal and right frontal regions; auditory‐specific predictive beta in bilateral sensorimotor regions; and visual‐specific predictive beta in midline central and bilateral temporal/parietal regions. The analysis also shows evoked predictive beta activity in the left sensorimotor region specific to auditory rhythms and implicates modality‐dependent networks for auditory and visual rhythm perception.
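The induced/evoked distinction drawn here hinges on removing the phase-locked average from single trials before time-frequency analysis. Below is a minimal Python sketch of that step using MNE; the shapes, sampling rate, and band limits are assumptions for illustration, not the study's parameters.

```python
# Minimal sketch of separating induced from evoked beta-band activity,
# assuming epochs around omitted events are already in a NumPy array.
import numpy as np
from mne.time_frequency import tfr_array_morlet

sfreq = 250.0
n_epochs, n_channels, n_times = 60, 32, 500
epochs = np.random.randn(n_epochs, n_channels, n_times)  # placeholder EEG

freqs = np.arange(15.0, 31.0, 1.0)           # beta band, 15-30 Hz
evoked = epochs.mean(axis=0, keepdims=True)  # phase-locked (evoked) part

# Induced power: subtract the evoked response from every trial first, so
# only non-phase-locked oscillatory activity survives averaging.
induced = tfr_array_morlet(epochs - evoked, sfreq=sfreq, freqs=freqs,
                           n_cycles=freqs / 2.0, output='avg_power')
evoked_tfr = tfr_array_morlet(evoked, sfreq=sfreq, freqs=freqs,
                              n_cycles=freqs / 2.0, output='avg_power')
print(induced.shape)  # (n_channels, n_freqs, n_times)
```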

     
  2. Abstract

    Speech processing often occurs amid competing inputs from other modalities, for example, listening to the radio while driving. We examined the extent to which dividing attention between auditory and visual modalities (bimodal divided attention) impacts neural processing of natural continuous speech from acoustic to linguistic levels of representation. We recorded electroencephalographic (EEG) responses when human participants performed a challenging primary visual task, imposing low or high cognitive load while listening to audiobook stories as a secondary task. The two dual-task conditions were contrasted with an auditory single-task condition in which participants attended to stories while ignoring visual stimuli. Behaviorally, the high load dual-task condition was associated with lower speech comprehension accuracy relative to the other two conditions. We fitted multivariate temporal response function encoding models to predict EEG responses from acoustic and linguistic speech features at different representation levels, including auditory spectrograms and information-theoretic models of sublexical-, word-form-, and sentence-level representations. Neural tracking of most acoustic and linguistic features remained unchanged with increasing dual-task load, despite unambiguous behavioral and neural evidence of the high load dual-task condition being more demanding. Compared to the auditory single-task condition, dual-task conditions selectively reduced neural tracking of only some acoustic and linguistic features, mainly at latencies >200 ms, while earlier latencies were surprisingly unaffected. These findings indicate that behavioral effects of bimodal divided attention on continuous speech processing occur not because of impaired early sensory representations but likely at later cognitive processing stages. Crossmodal attention-related mechanisms may not be uniform across different speech processing levels.
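The encoding-model approach described above regresses time-lagged stimulus features onto the EEG. A minimal Python sketch of such a temporal response function fit follows; the placeholder features, lag range, and ridge penalty are illustrative, not the study's exact model.

```python
# Minimal sketch of a temporal response function (TRF) encoding model:
# ridge regression from time-lagged stimulus features to EEG channels.
import numpy as np
from sklearn.linear_model import Ridge

sfreq = 64.0
n_times, n_features, n_channels = 5000, 8, 32
stim = np.random.randn(n_times, n_features)   # e.g., spectrogram bands
eeg = np.random.randn(n_times, n_channels)    # placeholder EEG

# Build a lagged design matrix covering 0-400 ms of stimulus history.
lags = np.arange(0, int(0.4 * sfreq))
X = np.zeros((n_times, n_features * lags.size))
for i, lag in enumerate(lags):
    X[lag:, i * n_features:(i + 1) * n_features] = stim[:n_times - lag]

model = Ridge(alpha=1.0).fit(X, eeg)
pred = model.predict(X)
# Neural tracking per channel: correlation between predicted and measured
# EEG; in practice this is cross-validated and compared across the
# single-task and dual-task conditions.
r = [np.corrcoef(pred[:, c], eeg[:, c])[0, 1] for c in range(n_channels)]
print(np.round(r[:4], 3))
```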

     
  3. It has been postulated that the brain is organized by “metamodal,” sensory-independent cortical modules capable of performing tasks (e.g., word recognition) in both “standard” and novel sensory modalities. Still, this theory has primarily been tested in sensory-deprived individuals, with mixed evidence in neurotypical subjects, thereby limiting its support as a general principle of brain organization. Critically, current theories of metamodal processing do not specify requirements for successful metamodal processing at the level of neural representations. Specification at this level may be particularly important in neurotypical individuals, where novel sensory modalities must interface with existing representations for the standard sense. Here we hypothesized that effective metamodal engagement of a cortical area requires congruence between stimulus representations in the standard and novel sensory modalities in that region. To test this, we first used fMRI to identify bilateral auditory speech representations. We then trained 20 human participants (12 female) to recognize vibrotactile versions of auditory words using one of two auditory-to-vibrotactile algorithms. The vocoded algorithm attempted to match the encoding scheme of auditory speech while the token-based algorithm did not. Crucially, using fMRI, we found that only in the vocoded group did trained-vibrotactile stimuli recruit speech representations in the superior temporal gyrus and lead to increased coupling between them and somatosensory areas. Our results advance our understanding of brain organization by providing new insight into unlocking the metamodal potential of the brain, thereby benefitting the design of novel sensory substitution devices that aim to tap into existing processing streams in the brain.
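As a rough illustration of what a vocoder-style auditory-to-vibrotactile transform might look like, the Python sketch below band-passes the audio, extracts each band's envelope, and uses it to modulate a low-frequency vibration carrier. The band edges and carrier frequencies are assumptions for illustration, not the paper's algorithm.

```python
# Minimal sketch of a vocoder-style auditory-to-vibrotactile transform.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 16000
t = np.arange(fs) / fs
audio = np.random.randn(fs)                      # placeholder waveform

bands = [(100, 400), (400, 1000), (1000, 3000)]  # analysis bands (Hz)
carriers = [80, 150, 250]                        # vibrotactile carriers (Hz)

vibro = np.zeros_like(audio)
for (lo, hi), fc in zip(bands, carriers):
    sos = butter(4, [lo, hi], btype='bandpass', fs=fs, output='sos')
    env = np.abs(hilbert(sosfiltfilt(sos, audio)))  # band envelope
    vibro += env * np.sin(2 * np.pi * fc * t)       # modulate carrier
```

The design intuition is that preserving the envelope structure of auditory speech is what lets the tactile signal interface with existing speech representations, in line with the paper's congruence hypothesis.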

SIGNIFICANCE STATEMENT: It has been proposed that the brain is organized by “metamodal,” sensory-independent modules specialized for performing certain tasks. This idea has inspired therapeutic applications, such as sensory substitution devices, for example, enabling blind individuals “to see” by transforming visual input into soundscapes. Yet, other studies have failed to demonstrate metamodal engagement. Here, we tested the hypothesis that metamodal engagement in neurotypical individuals requires matching the encoding schemes between stimuli from the novel and standard sensory modalities. We trained two groups of subjects to recognize words generated by one of two auditory-to-vibrotactile transformations. Critically, only vibrotactile stimuli that were matched to the neural encoding of auditory speech engaged auditory speech areas after training. This suggests that matching encoding schemes is critical to unlocking the brain's metamodal potential.

     
  4. Abstract

    The development of the ability to anticipate—as manifested by preparatory actions and neural activation related to the expectation of an upcoming stimulus—may play a key role in the ontogeny of cognitive skills more broadly. This preregistered study examined anticipatory brain potentials and behavioral responses (reaction time; RT) to anticipated target stimuli in relation to individual differences in the ability to use goals to direct action (as indexed by measures of executive function; EF). A cross‐sectional investigation was conducted in 40 adults (aged 18–25 years) and 40 children (aged 6–8 years) to examine the association of changes in the amplitude of modality‐specific alpha‐range rhythms in the electroencephalogram (EEG) during anticipation of lateralized visual, tactile, or auditory stimuli with inter‐ and intraindividual variation in RT and EF. Children and adults exhibited contralateral anticipatory reductions in the mu rhythm and the visual alpha rhythm for tactile and visual anticipation, respectively, indicating modality and spatially specific attention allocation. Variability in within‐subject anticipatory alpha lateralization (the difference between contralateral and ipsilateral alpha power) was related to single‐trial RT. This relation was more prominent in adults than in children, and was not apparent for auditory stimuli. Multilevel models indicated that interindividual differences in anticipatory mu rhythm lateralization contributed to the significant association with variability in EF, but this was not the case for visual or auditory alpha rhythms. Exploratory microstate analyses were undertaken to cluster global field power (GFP) into a distribution‐free temporal analysis examining developmental differences across samples and in relation to RT and EF. Anticipation is suggested as a developmental bridge construct connecting neuroscience, behavior, and cognition, with anticipatory EEG oscillations being discussed as quantifiable and potentially malleable indicators of stimulus prediction.
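The lateralization measure described here is the difference between contralateral and ipsilateral alpha power during the anticipation window, computed per trial. A minimal Python sketch, with placeholder single-trial power values standing in for sensor-cluster alpha estimates:

```python
# Minimal sketch of the anticipatory alpha lateralization measure:
# contralateral minus ipsilateral alpha power, per trial.
import numpy as np

n_trials = 200
# Single-trial alpha power over left- and right-hemisphere sensor clusters
alpha_left = np.random.rand(n_trials)
alpha_right = np.random.rand(n_trials)
cue_side = np.random.choice(['left', 'right'], n_trials)  # attended side

contra = np.where(cue_side == 'left', alpha_right, alpha_left)
ipsi = np.where(cue_side == 'left', alpha_left, alpha_right)
lateralization = contra - ipsi  # more negative = stronger contralateral
                                # anticipatory desynchronization
print(lateralization.mean())
```

In the study, this within-subject trial-level index is what was related to single-trial RT and, via multilevel models, to individual differences in EF.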

     
N100, the negative peak of the electrical response occurring around 100 ms, is present in diverse functional paradigms, including auditory, visual, somatic, behavioral, and cognitive tasks. We hypothesized that the presence of the N100 across different paradigms may be indicative of a more general property of the cerebral cortex, regardless of functional or anatomic specificity. To test this hypothesis, we combined transcranial magnetic stimulation (TMS) and electroencephalography (EEG) to measure cortical excitability across cortical regions without relying on specific sensory, cognitive, or behavioral modalities. The five stimulated regions included the left prefrontal, left motor, and left primary auditory cortices, the vertex, and the posterior cerebellum, with stimulation performed at supra- and subthreshold intensities. EEG responses produced by TMS at all five locations generated N100s that peaked at the vertex. The amplitudes of the N100s elicited from these five diverse cortical origins did not differ significantly (all uncorrected p > 0.05). No other EEG response component was found to have this global property of the N100. Our findings suggest that anatomy- and modality-specific interpretations of the N100 should be carefully evaluated, and that the N100 elicited by TMS may be used as a biomarker for evaluating local versus general cortical properties across the brain.
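For illustration, here is a minimal Python sketch of how an N100 might be extracted from a TMS-evoked potential at the vertex electrode; the time base, search window, and placeholder data are assumptions, not the study's processing pipeline.

```python
# Minimal sketch of extracting the N100 from a TMS-evoked potential:
# find the most negative deflection near 100 ms post-pulse at Cz.
import numpy as np

sfreq = 1000.0
times = np.arange(-0.1, 0.4, 1 / sfreq)         # seconds around TMS pulse
evoked_cz = np.random.randn(times.size) * 1e-6  # placeholder Cz average (V)

window = (times >= 0.08) & (times <= 0.14)      # search 80-140 ms
idx = np.where(window)[0][np.argmin(evoked_cz[window])]
n100_amp, n100_lat = evoked_cz[idx], times[idx]
print(f"N100: {n100_amp * 1e6:.1f} uV at {n100_lat * 1000:.0f} ms")
```

Comparing this amplitude across stimulation sites (and against sham or subthreshold conditions) is the kind of analysis the abstract describes for testing whether the N100 reflects a general cortical property.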