Title: Visual Distraction Disrupts Category-tuned Attentional Filters in Ventral Visual Cortex
Abstract: Our behavioral goals shape how we process information via attentional filters that prioritize goal-relevant information, dictating both where we attend and what we attend to. When something unexpected or salient appears in the environment, it captures our spatial attention. Extensive research has focused on the spatiotemporal aspects of attentional capture, but what happens to concurrent nonspatial filters during visual distraction? Here, we demonstrate a novel, broader consequence of distraction: widespread disruption to filters that regulate category-specific object processing. We recorded fMRI while participants viewed arrays of face/house hybrid images. On distractor-absent trials, we found robust evidence for the standard signature of category-tuned attentional filtering: greater BOLD activation in fusiform face area during attend-faces blocks and in parahippocampal place area during attend-houses blocks. However, on trials where a salient distractor (white rectangle) flashed abruptly around a nontarget location, not only was spatial attention captured, but the concurrent category-tuned attentional filter was disrupted, revealing a boost in activation for the to-be-ignored category. This disruption was robust, resulting in errant processing—and early on, prioritization—of goal-inconsistent information. These findings provide a direct test of the filter disruption theory: that in addition to disrupting spatial attention, distraction also disrupts nonspatial attentional filters tuned to goal-relevant information. Moreover, these results reveal that, under certain circumstances, the filter disruption may be so profound as to induce a full reversal of the attentional control settings, which carries novel implications for both theory and real-world perception.
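The contrast at the heart of this design can be made concrete with a small sketch. The Python snippet below is illustrative only (synthetic numbers, not the authors' fMRI pipeline): it simulates ROI-averaged per-trial responses for hypothetical FFA and PPA regions and computes an attended-minus-ignored selectivity index separately for distractor-absent and distractor-present trials, with the reversal described above built into the simulated distractor trials.

```python
# Illustrative sketch only: synthetic ROI-averaged responses, not the
# authors' fMRI pipeline. "ffa" and "ppa" are simulated per-trial arrays.
import numpy as np

rng = np.random.default_rng(0)

def selectivity(ffa, ppa, attend_faces):
    """Attended-category minus ignored-category activation, per ROI.

    Positive values indicate filtering in favor of the attended category;
    negative values indicate a boost for the to-be-ignored category.
    """
    ffa_effect = ffa[attend_faces].mean() - ffa[~attend_faces].mean()
    ppa_effect = ppa[~attend_faces].mean() - ppa[attend_faces].mean()
    return round(float(ffa_effect), 2), round(float(ppa_effect), 2)

n = 400
attend_faces = rng.random(n) < 0.5    # attend-faces vs. attend-houses blocks
distractor = rng.random(n) < 0.25     # salient distractor flashed?

# Build in intact filtering on distractor-absent trials and a reversal
# (ignored-category boost) on distractor-present trials, plus noise.
ffa = rng.normal(0, 1, n) + np.where(distractor, ~attend_faces, attend_faces)
ppa = rng.normal(0, 1, n) + np.where(distractor, attend_faces, ~attend_faces)

print("distractor absent :",
      selectivity(ffa[~distractor], ppa[~distractor], attend_faces[~distractor]))
print("distractor present:",
      selectivity(ffa[distractor], ppa[distractor], attend_faces[distractor]))
```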
Award ID(s): 1848939
PAR ID: 10368421
Author(s) / Creator(s): ; ;
Publisher / Repository: DOI PREFIX: 10.1162
Date Published:
Journal Name: Journal of Cognitive Neuroscience
Volume: 34
Issue: 8
ISSN: 0898-929X
Page Range / eLocation ID: p. 1521-1533
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Abstract: Selective attention improves sensory processing of relevant information but can also impact the quality of perception. For example, attention increases visual discrimination performance and at the same time boosts apparent stimulus contrast of attended relative to unattended stimuli. Can attention also lead to perceptual distortions of visual representations? Optimal tuning accounts of attention suggest that processing is biased towards “off-tuned” features to maximize the signal-to-noise ratio in favor of the target, especially when targets and distractors are confusable. Here, we tested whether such tuning gives rise to phenomenological changes of visual features. We instructed participants to select a color among other colors in a visual search display and subsequently asked them to judge the appearance of the target color in a 2-alternative forced choice task. Participants consistently judged the target color to appear more dissimilar from the distractor color in feature space. Critically, the magnitude of these perceptual biases varied systematically with the similarity between target and distractor colors during search, indicating that attentional tuning quickly adapts to current task demands. In control experiments, we ruled out possible non-attentional explanations such as color contrast or memory effects. Overall, our results demonstrate that selective attention warps the representational geometry of color space, resulting in profound perceptual changes across large swaths of feature space. Broadly, these results indicate that efficient attentional selection can come at a perceptual cost by distorting our sensory experience.
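As a rough illustration of how such an appearance bias might be quantified, the sketch below (hypothetical data and condition values, not the study's analysis code) tallies 2-alternative forced choice responses in which the probe shifted away from the distractor color is chosen as the target's appearance, separately for several assumed target-distractor similarity levels; deviation from 0.5 indexes the repulsive bias.

```python
# Hypothetical data and condition values; not the study's analysis code.
import numpy as np

rng = np.random.default_rng(1)

# Assumed target-distractor hue differences (degrees) used during search.
for sim in (15, 30, 60):
    # Each 2AFC trial: 1 if the probe shifted AWAY from the distractor color
    # was judged to match the target, 0 otherwise. 0.5 = no bias.
    p_away = 0.5 + 0.3 * np.exp(-sim / 30)   # simulated: bias shrinks as
    choices = rng.random(400) < p_away       # target and distractor differ
    bias = choices.mean() - 0.5              # repulsion away from distractor
    print(f"target-distractor difference {sim:>2d} deg: bias = {bias:+.3f}")
```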
  2. Abstract: To accurately categorize items, humans learn to selectively attend to the stimulus dimensions that are most relevant to the task. Models of category learning describe how attention changes across trials as labeled stimuli are progressively observed. The Adaptive Attention Representation Model (AARM), for example, provides an account in which categorization decisions are based on the perceptual similarity of a new stimulus to stored exemplars, and dimension-wise attention is updated on every trial in the direction of a feedback-based error gradient. As such, attention modulation as described by AARM requires interactions among processes of orienting, visual perception, memory retrieval, prediction error, and goal maintenance to facilitate learning. The current study explored the neural bases of attention mechanisms using quantitative predictions from AARM to analyze behavioral and fMRI data collected while participants learned novel categories. Generalized linear model analyses revealed patterns of BOLD activation in the parietal cortex (orienting), visual cortex (perception), medial temporal lobe (memory retrieval), basal ganglia (prediction error), and pFC (goal maintenance) that covaried with the magnitude of model-predicted attentional tuning. Results are consistent with AARM's specification of attention modulation as a dynamic property of distributed cognitive systems.
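The trial-by-trial attention update that AARM describes can be sketched as follows. This is a minimal paraphrase in the spirit of the model, not the published implementation: category evidence comes from attention-weighted exemplar similarity, and dimension-wise attention weights are nudged along a numerically estimated feedback-based error gradient after each labeled trial.

```python
# A minimal paraphrase in the spirit of AARM, not the published model code.
import numpy as np

def similarity(stimulus, exemplars, attention, c=2.0):
    # Attention-weighted city-block distance mapped to exponential similarity.
    return np.exp(-c * (np.abs(exemplars - stimulus) @ attention))

def predict(stimulus, exemplars, labels, attention):
    sim = similarity(stimulus, exemplars, attention)
    evidence = np.array([sim[labels == k].sum() for k in (0, 1)])
    return evidence / evidence.sum()            # choice probabilities

def update_attention(stimulus, exemplars, labels, attention, feedback,
                     lr=0.05, eps=1e-4):
    # Error = 1 - P(correct category); finite-difference gradient per dimension.
    base = 1.0 - predict(stimulus, exemplars, labels, attention)[feedback]
    grad = np.zeros_like(attention)
    for d in range(attention.size):
        bumped = attention.copy()
        bumped[d] += eps
        grad[d] = (1.0 - predict(stimulus, exemplars, labels, bumped)[feedback]
                   - base) / eps
    attention = np.clip(attention - lr * grad, 0.0, None)
    return attention / attention.sum()          # keep weights normalized

rng = np.random.default_rng(2)
exemplars = rng.random((20, 3))                 # stored exemplars, 3 dimensions
labels = (exemplars[:, 0] > 0.5).astype(int)    # only dimension 0 is relevant
attention = np.ones(3) / 3                      # start with uniform attention
for stim, lab in zip(exemplars, labels):        # one labeled stimulus per trial
    attention = update_attention(stim, exemplars, labels, attention, lab)
print("learned attention weights:", attention.round(2))
```

Under these assumptions, attention to the category-irrelevant dimensions shrinks while attention to the diagnostic dimension grows, which is the behavioral signature the model-based fMRI analysis tracks.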
  3. Feature-based attention is known to enhance visual processing globally across the visual field, even at task-irrelevant locations. Here, we asked whether attention to object categories, in particular faces, shows similar location-independent tuning. Using EEG, we measured the face-selective N170 component of the EEG signal to examine neural responses to faces at task-irrelevant locations while participants attended to faces at another task-relevant location. Across two experiments, we found that visual processing of faces was amplified at task-irrelevant locations when participants attended to faces relative to when participants attended to either buildings or scrambled face parts. The fact that we see this enhancement with the N170 suggests that these attentional effects occur at the earliest stage of face processing. Two additional behavioral experiments showed that it is easier to attend to the same object category across the visual field relative to two distinct categories, consistent with object-based attention spreading globally. Together, these results suggest that attention to high-level object categories shows similar spatially global effects on visual processing as attention to simple, individual, low-level features. 
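As a hedged illustration of the N170 measure (synthetic epochs, not the study's EEG data or pipeline; sampling rate, window, and amplitudes are assumptions), the sketch below averages simulated single trials at a posterior channel and compares mean amplitude in a typical 150-200 ms window between attend-faces and attend-buildings conditions.

```python
# Synthetic epochs, not the study's EEG data or preprocessing pipeline.
import numpy as np

rng = np.random.default_rng(3)
fs = 500                                    # assumed sampling rate (Hz)
t = np.arange(-0.1, 0.4, 1 / fs)            # epoch time axis (s)
n170_win = (t >= 0.15) & (t <= 0.20)        # typical N170 measurement window

def simulate_epochs(n_trials, n170_amp):
    # Negative-going deflection peaking near 170 ms, plus trial noise.
    erp = n170_amp * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))
    return erp + rng.normal(0, 2, size=(n_trials, t.size))

conditions = {
    "attend faces elsewhere": simulate_epochs(100, n170_amp=-6.0),
    "attend buildings elsewhere": simulate_epochs(100, n170_amp=-4.0),
}
for name, epochs in conditions.items():
    amp = epochs.mean(axis=0)[n170_win].mean()   # mean amplitude in window
    print(f"{name:>27s}: N170 at task-irrelevant face = {amp:+.2f} µV")
```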
  4. Visual working memory (VWM) representations interact with attentional guidance, but there is controversy over whether multiple VWM items can influence guidance simultaneously. Extant studies relied on continuous variables such as response times, which can obscure capture, especially if VWM representations cycle through interactive and non-interactive states. Previous conflicting findings regarding guidance under high working memory (WM) load may therefore stem from noisier response time measures that mix capture and non-capture trials. Thus, we employed an oculomotor paradigm to characterize discrete attentional capture events under both high and low VWM load. Participants held one or two colors in memory, then executed a saccade to a target disk. On some trials, a distractor (sometimes VWM-matching) appeared simultaneously with the target. Eye movements were more frequently directed to a VWM-matching than to a non-matching distractor under both load conditions. However, oculomotor capture by a VWM-matching distractor occurred less frequently under high than under low load. These results suggest that attention is automatically guided toward items matching only one of the two colors held in memory at a time: when more than one item is held in VWM and the task does not require maintaining multiple items in an active, attention-guiding state, items may cycle between attention-guiding and non-guiding states.
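The methodological point about continuous versus discrete measures can be illustrated with a toy simulation (hypothetical numbers, not the paper's data): when capture occurs on only a subset of trials, a mean response time blends capture and non-capture trials, whereas a discrete first-saccade measure exposes the capture rate directly.

```python
# Simulated trials with hypothetical numbers, not the paper's data.
import numpy as np

rng = np.random.default_rng(4)
n = 1000
captured = rng.random(n) < 0.30       # first saccade goes to the distractor

# Hypothetical saccadic RTs: capture trials are slower, but a grand mean
# blends them with the ~70% of trials on which no capture occurred.
rt = np.where(captured, rng.normal(480, 60, n), rng.normal(420, 60, n))

print(f"mean RT, all trials mixed : {rt.mean():.0f} ms")
print(f"first-saccade capture rate: {captured.mean():.2f}")
print(f"RT cost on captured trials: "
      f"{rt[captured].mean() - rt[~captured].mean():.0f} ms")
```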
  5. Abstract: There has been a long-lasting debate about whether salient stimuli, such as uniquely colored objects, have the ability to automatically distract us. To resolve this debate, it has been suggested that salient stimuli do attract attention but that they can be suppressed to prevent distraction. Some research supporting this viewpoint has focused on a newly discovered ERP component called the distractor positivity (PD), which is thought to measure an inhibitory attentional process. This collaborative review summarizes previous research relying on this component with a specific emphasis on how the PD has been used to understand the ability to ignore distracting stimuli. In particular, we outline how the PD component has been used to gain theoretical insights about how search strategy and learning can influence distraction. We also review alternative accounts of the cognitive processes indexed by the PD component. Ultimately, we conclude that the PD component is a useful tool for understanding inhibitory processes related to distraction and may prove to be useful in other areas of study related to cognitive control.
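For readers unfamiliar with how the PD is isolated, the sketch below (synthetic data, simplified relative to real ERP pipelines; channel, timing, and amplitudes are assumptions) computes the usual contralateral-minus-ipsilateral difference wave at a posterior channel relative to the side of a lateral distractor and reports mean amplitude in an assumed 200-300 ms window.

```python
# Synthetic data, simplified relative to real ERP pipelines.
import numpy as np

rng = np.random.default_rng(5)
fs = 500
t = np.arange(-0.1, 0.5, 1 / fs)            # epoch time axis (s)

def posterior_channel(positivity, n_trials=200):
    # Simulated posterior channel (e.g., PO7/PO8) with a positive deflection
    # around 250 ms of the given size, plus trial noise.
    wave = positivity * np.exp(-((t - 0.25) ** 2) / (2 * 0.03 ** 2))
    return wave + rng.normal(0, 1.5, size=(n_trials, t.size))

contra = posterior_channel(2.0)   # channel contralateral to the distractor
ipsi = posterior_channel(0.0)     # channel ipsilateral to the distractor

pd_wave = contra.mean(axis=0) - ipsi.mean(axis=0)   # contra-minus-ipsi wave
window = (t >= 0.20) & (t <= 0.30)                  # assumed PD window
print(f"mean PD amplitude, 200-300 ms: {pd_wave[window].mean():+.2f} µV")
```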