Title: Dynamic neural reconstructions of attended object location and features using EEG
Attention allows us to select relevant and ignore irrelevant information from our complex environments. What happens when attention shifts from one item to another? To answer this question, it is critical to have tools that accurately recover neural representations of both feature and location information with high temporal resolution. In the present study, we used human electroencephalography (EEG) and machine learning to explore how neural representations of object features and locations update across dynamic shifts of attention. We demonstrate that EEG can be used to create simultaneous time courses of neural representations of attended features (time point-by-time point inverted encoding model reconstructions) and attended location (time point-by-time point decoding) during both stable periods and across dynamic shifts of attention. Each trial presented two oriented gratings that flickered at the same frequency but had different orientations; participants were cued to attend one of them and on half of trials received a shift cue midtrial. We trained models on a stable period from Hold attention trials and then reconstructed/decoded the attended orientation/location at each time point on Shift attention trials. Our results showed that both feature reconstruction and location decoding dynamically track the shift of attention and that there may be time points during the shifting of attention when 1) feature and location representations become uncoupled and 2) both the previously attended and currently attended orientations are represented with roughly equal strength. The results offer insight into our understanding of attentional shifts, and the noninvasive techniques developed in the present study lend themselves well to a wide variety of future applications.

NEW & NOTEWORTHY We used human EEG and machine learning to reconstruct neural response profiles during dynamic shifts of attention. Specifically, we demonstrated that we could simultaneously read out both location and feature information from an attended item in a multistimulus display. Moreover, we examined how that readout evolves over time during the dynamic process of attentional shifts. These results provide insight into our understanding of attention, and this technique carries substantial potential for versatile extensions and applications.
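As a rough illustration of the approach described in the abstract (train an encoding model and a classifier on a stable period, then reconstruct/decode at every time point of held-out trials), below is a minimal Python sketch of a time point-by-time point inverted encoding model for attended orientation plus a simple location decoder. The array names, basis-set parameters, and classifier choice are assumptions for illustration, not the authors' exact pipeline.

```python
# Minimal sketch: time point-by-time point inverted encoding model (IEM)
# reconstruction of an attended orientation from EEG, plus location decoding.
# Assumed (hypothetical) inputs:
#   train_eeg : (n_train_trials, n_electrodes, n_times) array from Hold trials
#   train_ori : (n_train_trials,) attended orientation in degrees [0, 180)
#   train_loc : (n_train_trials,) attended-location labels (e.g., 0 = left, 1 = right)
#   test_eeg  : (n_test_trials,  n_electrodes, n_times) array from Shift trials
import numpy as np
from sklearn.linear_model import LogisticRegression

def make_basis(orientations, n_channels=9):
    """Idealized orientation tuning curves (rectified, raised cosines)."""
    centers = np.arange(n_channels) * (180.0 / n_channels)
    # Angular distance in orientation space (period of 180 degrees)
    d = np.deg2rad(2 * (orientations[:, None] - centers[None, :]))
    return np.maximum(np.cos(d), 0) ** 7                  # (n_trials, n_channels)

def iem_reconstruct(train_eeg, train_ori, test_eeg, t_train):
    """Fit channel weights at a stable training time point, invert at every test time point."""
    C_train = make_basis(train_ori)                       # (n_train, n_chan)
    B_train = train_eeg[:, :, t_train]                    # (n_train, n_elec)
    # Encoding model: B = C @ W  ->  W = pinv(C) @ B
    W = np.linalg.pinv(C_train) @ B_train                 # (n_chan, n_elec)
    n_test, _, n_times = test_eeg.shape
    recon = np.empty((n_test, W.shape[0], n_times))
    for t in range(n_times):
        # Inverted model: estimated channel responses at each time point
        recon[:, :, t] = test_eeg[:, :, t] @ np.linalg.pinv(W)
    return recon                                          # (n_test, n_chan, n_times)

def decode_location(train_eeg, train_loc, test_eeg, t_train):
    """Train a classifier at one time point, decode attended location at every test time point."""
    clf = LogisticRegression(max_iter=1000)
    clf.fit(train_eeg[:, :, t_train], train_loc)
    n_times = test_eeg.shape[2]
    return np.stack([clf.predict_proba(test_eeg[:, :, t])[:, 1]
                     for t in range(n_times)], axis=1)    # (n_test, n_times)
```

Averaging the reconstructed channel-response profiles (recentered on the cued orientation) and the decoder output across trials would then give the kind of simultaneous feature and location time courses the abstract describes.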
Award ID(s): 1848939
NSF-PAR ID: 10450642
Author(s) / Creator(s):
Date Published:
Journal Name: Journal of Neurophysiology
Volume: 130
Issue: 1
ISSN: 0022-3077
Page Range / eLocation ID: 139 to 154
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Key points

    Visual attention involves discrete multispectral oscillatory responses in visual and ‘higher‐order’ prefrontal cortices.

    Prefrontal cortex laterality effects during visual selective attention are poorly characterized.

    High‐definition transcranial direct current stimulation dynamically modulated right‐lateralized fronto‐visual theta oscillations compared to those observed in left fronto‐visual pathways.

    Increased connectivity in right fronto‐visual networks after stimulation of the left dorsolateral prefrontal cortex resulted in faster task performance in the context of distractors.

    Our findings show clear laterality effects in theta oscillatory activity along prefrontal–visual cortical pathways during visual selective attention.

    Abstract

    Studies of visual attention have implicated oscillatory activity in the recognition, protection and temporal organization of attended representations in visual cortices. These studies have also shown that higher‐order regions such as the prefrontal cortex are critical to attentional processing, but far less is understood regarding prefrontal laterality differences in attention processing. To examine this, we selectively applied high‐definition transcranial direct current stimulation (HD‐tDCS) to the left or right dorsolateral prefrontal cortex (DLPFC). We predicted that HD‐tDCS of the left versus right prefrontal cortex would differentially modulate performance on a visual selective attention task, and alter the underlying oscillatory network dynamics. Our randomized crossover design included 27 healthy adults who underwent three separate sessions of HD‐tDCS (sham, left DLPFC and right DLPFC) for 20 min. Following stimulation, participants completed an attention protocol during magnetoencephalography. The resulting oscillatory dynamics were imaged using beamforming, and peak task‐related neural activity was subjected to dynamic functional connectivity analyses to evaluate the impact of stimulation site (i.e. left and right DLPFC) on neural interactions. Our results indicated that HD‐tDCS over the left DLPFC differentially modulated right fronto‐visual functional connectivity within the theta band compared to HD‐tDCS of the right DLPFC and, further, specifically modulated the oscillatory response for detecting targets among an array of distractors. Importantly, these findings provide network‐specific insight into the complex oscillatory mechanisms serving visual selective attention.
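    The abstract above describes beamformer-based source imaging followed by dynamic functional connectivity analysis of fronto-visual theta interactions. As a loose proxy for one ingredient of such an analysis, the sketch below computes a theta-band phase-locking value (PLV) across trials between a hypothetical prefrontal signal and a visual signal; it is not the authors' beamforming or connectivity pipeline, and all inputs are assumed.

```python
# Minimal sketch: theta-band (4-8 Hz) phase-locking value (PLV) across trials
# between a "prefrontal" and a "visual" signal, as a simple proxy for
# fronto-visual theta connectivity (not the beamformer-based pipeline above).
# Assumed (hypothetical) inputs:
#   pfc, vis : (n_trials, n_times) arrays sampled at fs Hz
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def theta_plv(pfc, vis, fs, band=(4.0, 8.0)):
    # Band-pass both signals in the theta range
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    pfc_f = filtfilt(b, a, pfc, axis=-1)
    vis_f = filtfilt(b, a, vis, axis=-1)
    # Instantaneous phase via the analytic (Hilbert) signal
    dphi = np.angle(hilbert(pfc_f, axis=-1)) - np.angle(hilbert(vis_f, axis=-1))
    # PLV: consistency of the phase difference across trials, per time point
    return np.abs(np.mean(np.exp(1j * dphi), axis=0))    # (n_times,)
```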

     
  2. Feature-based attention is known to enhance visual processing globally across the visual field, even at task-irrelevant locations. Here, we asked whether attention to object categories, in particular faces, shows similar location-independent tuning. Using EEG, we measured the face-selective N170 component of the EEG signal to examine neural responses to faces at task-irrelevant locations while participants attended to faces at another task-relevant location. Across two experiments, we found that visual processing of faces was amplified at task-irrelevant locations when participants attended to faces relative to when participants attended to either buildings or scrambled face parts. The fact that we see this enhancement with the N170 suggests that these attentional effects occur at the earliest stage of face processing. Two additional behavioral experiments showed that it is easier to attend to the same object category across the visual field relative to two distinct categories, consistent with object-based attention spreading globally. Together, these results suggest that attention to high-level object categories shows similar spatially global effects on visual processing as attention to simple, individual, low-level features. 
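    Since the study above quantifies face-selective processing via the N170 component, below is a minimal sketch of extracting a mean N170 amplitude from epoched EEG at occipito-temporal electrodes. The array names, electrode selection, and time window are illustrative assumptions, not the authors' exact measurement procedure.

```python
# Minimal sketch: mean N170 amplitude from epoched, baseline-corrected EEG.
# Assumed (hypothetical) inputs:
#   epochs : (n_trials, n_electrodes, n_times) array in microvolts
#   times  : (n_times,) array of sample times in seconds (0 = face onset)
#   ot_idx : indices of occipito-temporal electrodes (e.g., P7/P8, PO7/PO8)
import numpy as np

def n170_amplitude(epochs, times, ot_idx, window=(0.13, 0.20)):
    """Average voltage in the N170 window over selected electrodes and trials."""
    tmask = (times >= window[0]) & (times <= window[1])
    erp = epochs[:, ot_idx, :].mean(axis=0)   # trial-average ERP, (n_ot_electrodes, n_times)
    return erp[:, tmask].mean()               # scalar amplitude (more negative = larger N170)
```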
  3. In the last few years, a large number of experiments have been focused on exploring the possibility of using non-invasive techniques, such as electroencephalography (EEG) and magnetoencephalography (MEG), to identify auditory-related neuromarkers which are modulated by attention. Results from several studies where participants listen to a story narrated by one speaker, while trying to ignore a different story narrated by a competing speaker, suggest the feasibility of extracting neuromarkers that demonstrate enhanced phase locking to the attended speech stream. These promising findings have the potential to be used in clinical applications, such as EEG-driven hearing aids. One major challenge in achieving this goal is the need to devise an algorithm capable of tracking these neuromarkers in real time when individuals are given the freedom to repeatedly switch attention among speakers at will. Here we present an algorithm pipeline that is designed to efficiently recognize changes of neural speech tracking during a dynamic-attention switching task and to use them as input for a near real-time state-space model that translates these neuromarkers into attentional state estimates with minimal delay. This algorithm pipeline was tested with MEG data collected from participants who had the freedom to change the focus of their attention between two speakers at will. Results suggest the feasibility of using our algorithm pipeline to track changes of attention in near real time in a dynamic auditory scene.
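    The pipeline described above feeds neural speech-tracking neuromarkers into a near real-time state-space model. The toy sketch below shows one way per-window evidence (e.g., envelope-reconstruction correlations for each speaker) could be turned into a running attentional-state estimate with a simple recursive update; it is an illustrative stand-in, not the authors' state-space algorithm, and all names and constants are assumptions.

```python
# Minimal sketch: recursive estimator that turns per-window neural speech-tracking
# evidence into a running attentional-state estimate with a small delay.
# Assumed (hypothetical) input per analysis window:
#   r1, r2 : correlations between the reconstructed stimulus envelope and
#            speaker 1's / speaker 2's actual envelopes for that window.
import numpy as np

class AttentionTracker:
    def __init__(self, forgetting=0.85, gain=4.0):
        self.forgetting = forgetting   # how quickly old evidence is discounted
        self.gain = gain               # scales the correlation difference to log-odds
        self.logodds = 0.0             # > 0 favors speaker 1, < 0 favors speaker 2

    def update(self, r1, r2):
        # Discount the previous belief, then add new evidence from this window
        self.logodds = self.forgetting * self.logodds + self.gain * (r1 - r2)
        # Probability that attention is currently on speaker 1
        return 1.0 / (1.0 + np.exp(-self.logodds))

# Usage: feed one (r1, r2) pair per analysis window as it arrives
tracker = AttentionTracker()
for r1, r2 in [(0.12, 0.03), (0.10, 0.02), (0.01, 0.11)]:   # made-up values
    p_speaker1 = tracker.update(r1, r2)
```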
  4. In everyday social environments, demands on attentional resources dynamically shift to balance our attention to targets of interest while alerting us to important objects in our surrounds. The current study uses electroencephalography to explore how the push-pull interaction between top-down and bottom-up attention manifests itself in dynamic auditory scenes. Using natural soundscapes as distractors while subjects attend to a controlled rhythmic sound sequence, we find that salient events in background scenes significantly suppress phase-locking and gamma responses to the attended sequence, countering enhancement effects observed for attended targets. In line with a hypothesis of limited attentional resources, the modulation of neural activity by bottom-up attention is graded by the degree of salience of ambient events. The study also provides insights into the interplay between endogenous and exogenous attention during natural soundscapes, with both forms of attention engaging a common fronto-parietal network at different time lags.
  5. Abstract

    Selective attention improves sensory processing of relevant information but can also impact the quality of perception. For example, attention increases visual discrimination performance and at the same time boosts apparent stimulus contrast of attended relative to unattended stimuli. Can attention also lead to perceptual distortions of visual representations? Optimal tuning accounts of attention suggest that processing is biased towards “off-tuned” features to maximize the signal-to-noise ratio in favor of the target, especially when targets and distractors are confusable. Here, we tested whether such tuning gives rise to phenomenological changes of visual features. We instructed participants to select a color among other colors in a visual search display and subsequently asked them to judge the appearance of the target color in a 2-alternative forced choice task. Participants consistently judged the target color to appear more dissimilar from the distractor color in feature space. Critically, the magnitude of these perceptual biases varied systematically with the similarity between target and distractor colors during search, indicating that attentional tuning quickly adapts to current task demands. In control experiments we rule out possible non-attentional explanations such as color contrast or memory effects. Overall, our results demonstrate that selective attention warps the representational geometry of color space, resulting in profound perceptual changes across large swaths of feature space. Broadly, these results indicate that efficient attentional selection can come at a perceptual cost by distorting our sensory experience.

     