Abstract Rhythm perception depends on the ability to predict the onset of rhythmic events. Previous studies indicate beta band modulation is involved in predicting the onset of auditory rhythmic events (Fujioka et al., 2009, 2012; Snyder & Large, 2005). We sought to determine whether similar processes are recruited for the prediction of visual rhythms by investigating whether beta band activity contributes to rhythm perception in a modality-dependent manner. We examined electroencephalography time–frequency neural correlates of prediction using an omission paradigm with auditory and visual rhythms. By using omissions, we can separate predictive timing activity from stimulus-driven activity. We hypothesized that there would be modality-independent markers of rhythm prediction in induced beta band oscillatory activity, and our results support this hypothesis. We find induced and evoked predictive timing in both auditory and visual modalities. Additionally, we performed an exploratory independent-components-based spatial clustering analysis and describe all resulting clusters. This analysis reveals that there may be overlapping networks of predictive beta activity based on common activation in the parietal and right frontal regions, auditory-specific predictive beta in bilateral sensorimotor regions, and visual-specific predictive beta in midline central and bilateral temporal/parietal regions. It also shows evoked predictive beta activity in the left sensorimotor region specific to auditory rhythms and implicates modality-dependent networks for auditory and visual rhythm perception.
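The distinction drawn above between induced (non-phase-locked) and evoked (phase-locked) beta activity can be illustrated with a minimal sketch. This is not the authors' pipeline; the band limits, filter order, and the simple subtract-the-evoked-response approach are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def induced_beta_power(epochs, fs, band=(15.0, 25.0)):
    """Estimate induced beta-band power per epoch.

    epochs : array (n_epochs, n_samples), single-channel EEG epochs.
    fs     : sampling rate in Hz.

    Subtracting the trial-averaged (evoked) response before filtering
    leaves the induced, non-phase-locked component.
    """
    induced = epochs - epochs.mean(axis=0, keepdims=True)
    # Band-pass to the beta range, then take the analytic amplitude.
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, induced, axis=1)
    envelope = np.abs(hilbert(filtered, axis=1))
    return envelope ** 2  # power over time, per epoch
```

On synthetic epochs containing a 20 Hz burst with random phase per trial, the returned power rises during the burst relative to baseline, which is the signature the abstract's induced-activity analysis relies on.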
Supramodal Mechanisms of the Cognitive Control Network in Uncertainty Processing
Abstract Information processing under conditions of uncertainty requires the involvement of cognitive control. Despite behavioral evidence of the supramodal function (i.e., independent of sensory modality) of cognitive control, the underlying neural mechanism needs to be directly tested. This study used functional magnetic resonance imaging together with visual and auditory perceptual decision-making tasks to examine brain activation as a function of uncertainty in the two stimulus modalities. The results revealed a monotonic increase in activation in the cortical regions of the cognitive control network (CCN) as a function of uncertainty in the visual and auditory modalities. The intrinsic connectivity between the CCN and sensory regions was similar for the visual and auditory modalities. Furthermore, multivariate patterns of activation in the CCN predicted the level of uncertainty within and across stimulus modalities. These findings suggest that the CCN implements cognitive control by processing uncertainty as abstract information independent of stimulus modality.
- Award ID(s):
- 1855759
- PAR ID:
- 10252933
- Date Published:
- Journal Name:
- Cerebral Cortex
- Volume:
- 30
- Issue:
- 12
- ISSN:
- 1047-3211
- Page Range / eLocation ID:
- 6336 to 6349
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Abstract Speech processing often occurs amid competing inputs from other modalities, for example, listening to the radio while driving. We examined the extent to which dividing attention between auditory and visual modalities (bimodal divided attention) impacts neural processing of natural continuous speech from acoustic to linguistic levels of representation. We recorded electroencephalographic (EEG) responses when human participants performed a challenging primary visual task, imposing low or high cognitive load while listening to audiobook stories as a secondary task. The two dual-task conditions were contrasted with an auditory single-task condition in which participants attended to stories while ignoring visual stimuli. Behaviorally, the high load dual-task condition was associated with lower speech comprehension accuracy relative to the other two conditions. We fitted multivariate temporal response function encoding models to predict EEG responses from acoustic and linguistic speech features at different representation levels, including auditory spectrograms and information-theoretic models of sublexical-, word-form-, and sentence-level representations. Neural tracking of most acoustic and linguistic features remained unchanged with increasing dual-task load, despite unambiguous behavioral and neural evidence of the high load dual-task condition being more demanding. Compared to the auditory single-task condition, dual-task conditions selectively reduced neural tracking of only some acoustic and linguistic features, mainly at latencies >200 ms, while earlier latencies were surprisingly unaffected. These findings indicate that behavioral effects of bimodal divided attention on continuous speech processing occur not because of impaired early sensory representations but likely at later cognitive processing stages. Crossmodal attention-related mechanisms may not be uniform across different speech processing levels.
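The temporal response function (TRF) encoding models mentioned in this abstract are commonly fit as time-lagged ridge regression. The sketch below is a minimal single-feature, single-channel illustration under assumed defaults (lag window, regularization strength), not the study's actual pipeline.

```python
import numpy as np

def fit_trf(stimulus, eeg, fs, tmin=-0.1, tmax=0.4, alpha=1.0):
    """Fit a temporal response function by ridge regression.

    stimulus : (n_samples,) stimulus feature (e.g., acoustic envelope).
    eeg      : (n_samples,) EEG response at one channel.
    Returns the lag axis in seconds and one weight per lag.
    """
    lags = np.arange(int(tmin * fs), int(tmax * fs) + 1)
    # Design matrix: column j holds the stimulus shifted by lags[j] samples.
    X = np.zeros((len(stimulus), len(lags)))
    for j, lag in enumerate(lags):
        if lag < 0:
            X[:lag, j] = stimulus[-lag:]
        elif lag > 0:
            X[lag:, j] = stimulus[:-lag]
        else:
            X[:, j] = stimulus
    # Ridge solution: w = (X'X + alpha*I)^-1 X'y
    w = np.linalg.solve(X.T @ X + alpha * np.eye(len(lags)), X.T @ eeg)
    return lags / fs, w
```

If the response is the stimulus delayed by 100 ms plus noise, the recovered TRF peaks at the 0.1 s lag, which is how such models expose "neural tracking" at specific latencies (e.g., the >200 ms effects reported above).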
-
Abstract Studies have shown that bats are capable of using visual information for a variety of purposes, including navigation and foraging, but the relative contributions of the visual and auditory modalities to obstacle avoidance have yet to be fully investigated, particularly in laryngeal echolocating bats. A first step requires the characterization of behavioral responses to different combinations of sensory cues. Here, we quantified the behavioral responses of the insectivorous big brown bat, Eptesicus fuscus, in an obstacle avoidance task offering different combinations of auditory and visual cues. To do so, we utilized a new method that eliminates the confounds typically associated with testing bat vision and precludes auditory cues. We found that the presence of visual and auditory cues together enhances bats' avoidance response to obstacles compared with cues requiring either vision or audition alone. Analyses of flight and echolocation behaviors, such as speed and call rate, did not vary significantly under different obstacle conditions, and thus are not informative indicators of a bat's response to obstacle stimulus type. These findings advance the understanding of the relative importance of the visual and auditory sensory modalities in guiding obstacle avoidance behaviors.
-
It has been postulated that the brain is organized by “metamodal,” sensory-independent cortical modules capable of performing tasks (e.g., word recognition) in both “standard” and novel sensory modalities. Still, this theory has primarily been tested in sensory-deprived individuals, with mixed evidence in neurotypical subjects, thereby limiting its support as a general principle of brain organization. Critically, current theories of metamodal processing do not specify requirements for successful metamodal processing at the level of neural representations. Specification at this level may be particularly important in neurotypical individuals, where novel sensory modalities must interface with existing representations for the standard sense. Here we hypothesized that effective metamodal engagement of a cortical area requires congruence between stimulus representations in the standard and novel sensory modalities in that region. To test this, we first used fMRI to identify bilateral auditory speech representations. We then trained 20 human participants (12 female) to recognize vibrotactile versions of auditory words using one of two auditory-to-vibrotactile algorithms. The vocoded algorithm attempted to match the encoding scheme of auditory speech while the token-based algorithm did not. Crucially, using fMRI, we found that only in the vocoded group did trained-vibrotactile stimuli recruit speech representations in the superior temporal gyrus and lead to increased coupling between them and somatosensory areas. Our results advance our understanding of brain organization by providing new insight into unlocking the metamodal potential of the brain, thereby benefitting the design of novel sensory substitution devices that aim to tap into existing processing streams in the brain. SIGNIFICANCE STATEMENT It has been proposed that the brain is organized by “metamodal,” sensory-independent modules specialized for performing certain tasks.
This idea has inspired therapeutic applications, such as sensory substitution devices, for example, enabling blind individuals “to see” by transforming visual input into soundscapes. Yet, other studies have failed to demonstrate metamodal engagement. Here, we tested the hypothesis that metamodal engagement in neurotypical individuals requires matching the encoding schemes between stimuli from the novel and standard sensory modalities. We trained two groups of subjects to recognize words generated by one of two auditory-to-vibrotactile transformations. Critically, only vibrotactile stimuli that were matched to the neural encoding of auditory speech engaged auditory speech areas after training. This suggests that matching encoding schemes is critical to unlocking the brain's metamodal potential.
-
Abstract Working memory (WM) supports the persistent representation of transient sensory information. Visual and auditory stimuli place different demands on WM and recruit different brain networks. Separate auditory- and visual-biased WM networks extend into the frontal lobes, but several challenges confront attempts to parcellate human frontal cortex, including fine-grained organization and between-subject variability. Here, we use differential intrinsic functional connectivity from 2 visual-biased and 2 auditory-biased frontal structures to identify additional candidate sensory-biased regions in frontal cortex. We then examine direct contrasts of task functional magnetic resonance imaging during visual versus auditory 2-back WM to validate those candidate regions. Three visual-biased and 5 auditory-biased regions are robustly activated bilaterally in the frontal lobes of individual subjects (N = 14, 7 women). These regions exhibit a sensory preference during passive exposure to task stimuli, and that preference is stronger during WM. Hierarchical clustering analysis of intrinsic connectivity among novel and previously identified bilateral sensory-biased regions confirms that they functionally segregate into visual and auditory networks, even though the networks are anatomically interdigitated. We also observe that the frontotemporal auditory WM network is highly selective and exhibits strong functional connectivity to structures serving non-WM functions, while the frontoparietal visual WM network hierarchically merges into the multiple-demand cognitive system.
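The hierarchical clustering of intrinsic connectivity described in this abstract can be sketched in a few lines: correlate regional time series, treat each region's connectivity profile as a feature vector, and cluster the profiles. The linkage method, distance metric, and two-cluster cut below are illustrative assumptions, not the study's actual choices.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_regions(timeseries, n_clusters=2):
    """Group regions by similarity of their intrinsic connectivity.

    timeseries : (n_regions, n_timepoints) resting-state signals.
    Regions with similar connectivity profiles merge first, analogous
    to segregating visual- vs auditory-biased frontal networks.
    """
    conn = np.corrcoef(timeseries)  # region-by-region connectivity
    # Each region's row of the connectivity matrix is its profile;
    # cluster profiles with average linkage on correlation distance.
    Z = linkage(conn, method="average", metric="correlation")
    return fcluster(Z, t=n_clusters, criterion="maxclust")
```

On simulated data with two groups of regions driven by two independent latent signals, the returned labels cleanly separate the groups, mirroring the visual/auditory segregation reported above.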