Title: Naturalistic Audio-Movies reveal common spatial organization across “visual” cortices of different blind individuals
Abstract

Occipital cortices of different sighted people contain analogous maps of visual information (e.g. foveal vs. peripheral). In congenital blindness, “visual” cortices respond to nonvisual stimuli. Do visual cortices of different blind people represent common informational maps? We leverage naturalistic stimuli and inter-subject pattern similarity analysis to address this question. Blindfolded sighted (n = 22) and congenitally blind (n = 22) participants listened to 6 sound clips (5–7 min each): 3 auditory excerpts from movies; a naturalistic spoken narrative; and matched degraded auditory stimuli (Backwards Speech, scrambled sentences), during functional magnetic resonance imaging (fMRI) scanning. We compared the spatial activity patterns evoked by each unique 10-s segment of the different auditory excerpts across blind and across sighted people. Segments of meaningful naturalistic stimuli produced distinctive activity patterns in frontotemporal networks that were shared across blind and across sighted individuals. In the blind group only, segment-specific, cross-subject patterns also emerged in visual cortex, for meaningful naturalistic stimuli but not Backwards Speech. Thus, spatial patterns of activity within visual cortices are sensitive to time-varying information in meaningful naturalistic auditory stimuli in a broadly similar manner across blind individuals.
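
As a rough illustration of the inter-subject pattern similarity logic, here is a minimal sketch, not the study's exact pipeline: the array layout, the simple group-averaging scheme, and the function names below are assumptions for exposition.

    import numpy as np

    def segment_pattern_similarity(patterns_a, patterns_b):
        # patterns_a, patterns_b: (n_subjects, n_segments, n_voxels) arrays of
        # mean activity evoked by each 10-s stimulus segment in one region.
        # Returns an (n_segments x n_segments) matrix of correlations between
        # the two groups' average spatial patterns.
        mean_a = patterns_a.mean(axis=0)   # average pattern across subjects
        mean_b = patterns_b.mean(axis=0)
        # z-score each segment's pattern across voxels, then correlate
        za = (mean_a - mean_a.mean(1, keepdims=True)) / mean_a.std(1, keepdims=True)
        zb = (mean_b - mean_b.mean(1, keepdims=True)) / mean_b.std(1, keepdims=True)
        return (za @ zb.T) / za.shape[1]

A diagonal (same-segment) that exceeds the off-diagonal (different-segment) correlations indicates segment-specific spatial patterns shared across individuals.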

 
Award ID(s): 1911650
NSF-PAR ID: 10363148
Author(s) / Creator(s):
Publisher / Repository: Oxford University Press
Date Published:
Journal Name: Cerebral Cortex
Volume: 33
Issue: 1
ISSN: 1047-3211
Page Range / eLocation ID: p. 1-10
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. How do life experiences impact cortical function? In people who are born blind, the “visual” cortices are recruited for nonvisual tasks such as Braille reading and sound localization (e.g., Collignon et al., 2011; Sadato et al., 1996). The mechanisms of this recruitment are not known. Do visual cortices have a latent capacity to respond to nonvisual information that is equal throughout the lifespan? Alternatively, is there a sensitive period of heightened plasticity that makes visual cortex repurposing possible during childhood? To gain insight into these questions, we leveraged naturalistic auditory stimuli to quantify and compare cross-modal responses in congenitally blind (CB, n = 22), adult-onset blind (vision loss after 18 years of age; AB, n = 14), and sighted (n = 22) individuals. Participants listened to auditory excerpts from movies, a spoken narrative, and matched meaningless auditory stimuli (i.e., shuffled sentences, backwards speech) during fMRI scanning. These rich naturalistic stimuli made it possible to simultaneously engage a broad range of cognitive domains. We correlated the voxel-wise timecourses of different participants within each group. In all groups, every stimulus condition induced synchrony in auditory cortex, whereas only the narrative stimuli synchronized responses in higher-cognitive fronto-parietal and temporal regions. Inter-subject synchrony in visual cortices was high in the CB group for the movie and narrative stimuli but not for the meaningless auditory controls. In contrast, visual cortex synchrony was equally low among AB and blindfolded sighted participants. Even many years of blindness in adulthood fail to enable responses to naturalistic auditory information in the visual cortices of people who had sight as children. These findings suggest that cross-modal responses in the visual cortex of people born blind reflect the plasticity of developing visual cortex during a sensitive period.
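     A minimal sketch of the voxel-wise timecourse synchrony measure, as a common leave-one-out inter-subject correlation (ISC); the array layout and names are illustrative assumptions, not the study's exact code.

        import numpy as np

        def isc_leave_one_out(bold):
            # bold: (n_subjects, n_timepoints, n_voxels) preprocessed BOLD data
            # for one stimulus condition, anatomically aligned across subjects.
            # Returns (n_subjects, n_voxels): each subject's correlation with
            # the average timecourse of all remaining subjects, per voxel.
            n_subj = bold.shape[0]
            total = bold.sum(axis=0)
            isc = np.empty((n_subj, bold.shape[2]))
            for s in range(n_subj):
                others = (total - bold[s]) / (n_subj - 1)   # leave-one-out mean
                x = bold[s] - bold[s].mean(axis=0)
                y = others - others.mean(axis=0)
                isc[s] = (x * y).sum(0) / np.sqrt((x ** 2).sum(0) * (y ** 2).sum(0))
            return isc

     High ISC in a region for one group and condition (e.g., visual cortex in CB listeners of meaningful stimuli) indicates stimulus-driven responses shared across that group.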
  2. Texting relies on screen-centric prompts designed for sighted users, which still pose significant barriers to people who are blind or visually impaired (BVI). Can we re-imagine texting untethered from a visual display? In an interview study, 20 BVI adults shared the situations surrounding their texting practices, recurrent topics of conversation, and challenges. Informed by these insights, we introduce TextFlow: a mixed-initiative, context-aware system that generates entirely auditory message options relevant to the user's location, activity, and time of day. Users can browse and select suggested aural messages using finger-taps supported by an off-the-shelf finger-worn device, without having to hold or attend to a mobile screen. In an evaluative study, 10 BVI participants successfully interacted with TextFlow to browse and send messages in screen-free mode. The users' experiential responses shed light on the importance of bypassing the phone and accessing rapidly controllable messages at their fingertips, while preserving privacy and accuracy with respect to speech- or screen-based input. We discuss how non-visual access to proactive, contextual messaging can support blind users in a variety of daily scenarios.
  3. Abstract (Lay summary)

    Individuals with autism spectrum disorder (ASD) and schizophrenia are more likely to perceive asynchronous auditory and visual events as occurring simultaneously, even if they are well separated in time. We investigated whether similar difficulties in audiovisual temporal processing are present in subclinical populations with high autistic and schizotypal traits. We found that the ability to detect audiovisual asynchrony was not affected by different levels of autistic and schizotypal traits. We also found that the connectivity of some brain regions engaged in multisensory and timing tasks might explain an individual's tendency to bind multisensory information within a wide or narrow time window. Autism Res 2021, 14: 668–680. © 2020 International Society for Autism Research and Wiley Periodicals LLC.

     
  4. Abstract

    Meta‐analytic techniques for mining the neuroimaging literature continue to exert an impact on our conceptualization of functional brain networks contributing to human emotion and cognition. Traditional theories regarding the neurobiological substrates contributing to affective processing are shifting from regional‐ towards more network‐based heuristic frameworks. To elucidate differential brain network involvement linked to distinct aspects of emotion processing, we applied an emergent meta‐analytic clustering approach to the extensive body of affective neuroimaging results archived in the BrainMap database. Specifically, we performed hierarchical clustering on the modeled activation maps from 1,747 experiments in the affective processing domain, resulting in five meta‐analytic groupings of experiments demonstrating whole‐brain recruitment. Behavioral inference analyses conducted for each of these groupings suggested dissociable networks supporting: (1) visual perception within primary and associative visual cortices, (2) auditory perception within primary auditory cortices, (3) attention to emotionally salient information within insular, anterior cingulate, and subcortical regions, (4) appraisal and prediction of emotional events within medial prefrontal and posterior cingulate cortices, and (5) induction of emotional responses within amygdala and fusiform gyri. These meta‐analytic outcomes are consistent with a contemporary psychological model of affective processing in which emotionally salient information from perceived stimuli is integrated with previous experiences to engender a subjective affective response. This study highlights the utility of using emergent meta‐analytic methods to inform and extend psychological theories and suggests that emotions are manifest as the eventual consequence of interactions between large‐scale brain networks.
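     A compact sketch of the hierarchical clustering step, assuming the modeled activation (MA) maps are available as one vector per experiment; Ward linkage on Euclidean distances is an illustrative choice, not necessarily the paper's exact metric, and the names below are assumptions.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def cluster_experiments(ma_maps: np.ndarray, n_clusters: int = 5):
            # ma_maps: (n_experiments, n_voxels) array, one whole-brain modeled
            # activation map per experiment (e.g., 1,747 x n_voxels).
            z = linkage(ma_maps, method="ward")            # agglomerative tree
            return fcluster(z, t=n_clusters, criterion="maxclust")  # labels 1..n

     Each resulting grouping of experiments can then be submitted to behavioral inference analyses to characterize the cognitive functions it recruits.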

     