Selective attention improves sensory processing of relevant information but can also impact the quality of perception. For example, attention increases visual discrimination performance and at the same time boosts apparent stimulus contrast of attended relative to unattended stimuli. Can attention also lead to perceptual distortions of visual representations? Optimal tuning accounts of attention suggest that processing is biased towards “off-tuned” features to maximize the signal-to-noise ratio in favor of the target, especially when targets and distractors are confusable. Here, we tested whether such tuning gives rise to phenomenological changes of visual features. We instructed participants to select a color among other colors in a visual search display and subsequently asked them to judge the appearance of the target color in a 2-alternative forced choice task. Participants consistently judged the target color to appear more dissimilar from the distractor color in feature space. Critically, the magnitude of these perceptual biases varied systematically with the similarity between target and distractor colors during search, indicating that attentional tuning quickly adapts to current task demands. In control experiments we rule out possible non-attentional explanations such as color contrast or memory effects. Overall, our results demonstrate that selective attention warps the representational geometry of color space, resulting in profound perceptual changes across large swaths of feature space. Broadly, these results indicate that efficient attentional selection can come at a perceptual cost by distorting our sensory experience.
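The repulsion effect described above can be quantified as a signed shift of the reported target hue away from the distractor in circular color space. The sketch below is purely illustrative (the hue values and helper names are hypothetical, not the study's stimuli or analysis code):

```python
import numpy as np

def circ_diff(a, b):
    """Signed angular difference a - b in degrees, wrapped to (-180, 180]."""
    return (a - b + 180) % 360 - 180

def repulsion_bias(reported, target, distractor):
    """Positive values = reported hue shifted away from the distractor."""
    err = circ_diff(reported, target)                 # signed report error
    toward = np.sign(circ_diff(distractor, target))   # direction of distractor
    return -err * toward                              # flip so 'away' is positive

# Hypothetical hues (deg): target at 100, distractor at 130;
# reports shifted away from the distractor yield positive bias.
bias = repulsion_bias(np.array([92.0, 95.0, 88.0]), 100.0, 130.0)
```

Averaging such bias values per target-distractor similarity condition would expose how the magnitude of repulsion scales with feature-space distance.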
A prominent theoretical framework spanning philosophy, psychology, and neuroscience holds that selective attention penetrates early stages of perceptual processing to alter the subjective visual experience of behaviorally relevant stimuli. For example, searching for a red apple at the grocery store might make the relevant color appear brighter and more saturated compared with seeing the exact same red apple while searching for a yellow banana. In contrast, recent proposals argue that data supporting attention-related changes in appearance reflect decision- and motor-level response biases without concurrent changes in perceptual experience. Here, we tested these accounts by evaluating attentional modulations of EEG responses recorded from male and female human subjects while they compared the perceived contrast of attended and unattended visual stimuli rendered at different levels of physical contrast. We found that attention enhanced the amplitude of the P1 component, an early evoked potential measured over visual cortex. A linking model based on signal detection theory suggests that response gain modulations of the P1 component track attention-induced changes in perceived contrast as measured with behavior. In contrast, attentional cues induced changes in the baseline amplitude of posterior alpha band oscillations (∼9-12 Hz), an effect that best accounts for cue-induced response biases, particularly when no stimuli are presented or when competing stimuli are similar and decisional uncertainty is high. The observation of dissociable neural markers that are linked to changes in subjective appearance and response bias supports a more unified theoretical account and demonstrates an approach to isolate subjective aspects of selective information processing.
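The signal-detection distinction drawn above, between a genuine sensitivity change and a pure response bias, can be made concrete with the standard equal-variance Gaussian model. This is a minimal sketch of the general framework, not the authors' linking model; the numeric parameters are hypothetical:

```python
from scipy.stats import norm

def sdt_params(hit_rate, fa_rate):
    """Sensitivity d' and criterion c from hit and false-alarm rates."""
    zh, zf = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return zh - zf, -0.5 * (zh + zf)

def rates(d_prime, criterion):
    """Predicted hit / false-alarm rates under equal-variance Gaussian SDT."""
    hit = norm.cdf(d_prime / 2 - criterion)
    fa = norm.cdf(-d_prime / 2 - criterion)
    return hit, fa

# A sensitivity change (cf. P1 response gain) alters d' at a fixed criterion;
# a pure response bias (cf. a baseline alpha shift) moves c at a fixed d'.
h1, f1 = rates(2.0, 0.0)    # enhanced sensitivity, neutral criterion
h2, f2 = rates(1.0, -0.5)   # lower sensitivity, liberal criterion
```

Because the two parameters dissociate in the model, behavioral hit and false-alarm rates can in principle separate appearance changes from response biases, which is the logic the dissociable P1 and alpha markers speak to.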
- Award ID(s): 2147064
- NSF-PAR ID: 10512575
- Publisher / Repository: Society for Neuroscience
- Date Published:
- Journal Name: The Journal of Neuroscience
- Volume: 43
- Issue: 39
- ISSN: 0270-6474
- Page Range / eLocation ID: 6628 to 6652
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract: Holistic processing of face and non-face stimuli has been framed as a perceptual strategy, with classic hallmarks of holistic processing, such as the composite effect, reflecting a failure of selective attention that follows from this strategy. Further, evidence that holistic processing is impacted by training different patterns of attentional prioritization suggests that it may result from learned attention to the whole, which renders it difficult to attend to only part of a stimulus. If so, holistic processing should be modulated by the same factors that shape attentional selection, such as the probability that distracting or task-relevant information will be present. In contrast, other accounts suggest that it is the match to an internal face template that triggers specialized holistic processing mechanisms. Here we probed these accounts by manipulating the probability, across different testing sessions, that the task-irrelevant face part in the composite face task would contain task-congruent or -incongruent information. Attentional accounts of holistic processing predict that when the probability that the task-irrelevant part contains congruent information is low (25%), holistic processing should be attenuated compared to when this probability is high (75%). In contrast, template-based accounts of holistic face processing predict that it will be unaffected by this manipulation, given that the integrity of the faces remains intact. Experiment 1 found evidence consistent with attentional accounts of holistic face processing, and Experiment 2 extends these findings to holistic processing of non-face stimuli. These findings are broadly consistent with learned-attention accounts of holistic processing.
Abstract: Much of our world changes smoothly in time, yet the allocation of attention is typically studied with sudden changes, or transients. A sizeable lag in selecting feature information is seen when stimuli change smoothly, yet this lag is not seen with temporally uncorrelated rapid serial visual presentation (RSVP) stimuli. This suggests that temporal autocorrelation of a feature paradoxically increases the latency at which information is sampled. To test this, participants were asked to report the color of a disk at the moment a cue was presented. Selection latency increased when the disk's color changed smoothly compared to randomly. This increase was driven by the smooth color change presented after the cue rather than by extrapolated predictions based on the color changes presented before it. These results support an attentional drag theory, whereby attentional engagement is prolonged when features change smoothly. A computational model provides insights into the potential underlying neural mechanisms.
Goal-directed visual attention is a fundamental cognitive process that enables animals to selectively focus on specific regions of the visual field while filtering out irrelevant information. However, given the domain specificity of social behaviors, it remains unclear whether attention to faces versus non-faces recruits different neurocognitive processes. In this study, we simultaneously recorded activity from temporal and frontal nodes of the attention network while macaques performed a goal-directed visual search task. V4 and inferotemporal (IT) visual category-selective units, selected during cue presentation, discriminated fixations on targets and distractors during the search, but were differentially engaged by face and house targets. V4 and IT category-selective units also encoded fixation transitions and search dynamics. Compared to distractors, fixations on targets reduced spike-LFP coherence within the temporal cortex. Importantly, target-induced desynchronization between the temporal and prefrontal cortices was only evident for face targets, suggesting that attention to faces differentially engaged the prefrontal cortex. We further revealed bidirectional theta influence between the temporal and prefrontal cortices using Granger causality, which was again disproportionate for faces. Finally, we showed that the search became more efficient with increasing target-induced desynchronization. Together, our results suggest domain specificity for attending to faces and an intricate interplay between visual attention and social processing neural networks.
Significance Statement: Visual attention is a cornerstone of visual perception. This study explores the neurocognitive mechanisms underlying goal-directed visual attention, specifically in the context of social versus non-social stimuli. By simultaneously recording neural activity from temporal and frontal nodes of the attention network in macaques, we elucidated how attentional processes differ when directed towards social versus non-social targets. Our findings revealed distinct neural signatures for the two stimulus classes, suggesting domain specificity in the allocation of attentional resources. Moreover, we demonstrated an intricate interplay between visual attention and social processing neural networks, highlighting the complexity of social cognition in primates. These insights advance our understanding of the neural basis of social attention in primates.
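The spike-LFP and cross-area desynchronization findings above rest on coherence, a frequency-resolved measure of coupling between two signals. A minimal sketch with synthetic signals (hypothetical parameters, not the study's recordings) shows the idea: two "areas" sharing a common theta-band drive are coherent at that frequency but not elsewhere:

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 1000.0                            # sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)           # 10 s of data

# Shared 6 Hz "theta" drive plus independent noise in each area
drive = np.sin(2 * np.pi * 6 * t)
temporal = drive + rng.normal(scale=1.0, size=t.size)
prefrontal = 0.8 * drive + rng.normal(scale=1.0, size=t.size)

# Magnitude-squared coherence via Welch's method
f, Cxy = coherence(temporal, prefrontal, fs=fs, nperseg=1024)
theta_coh = Cxy[np.argmin(np.abs(f - 6))]    # high: shared drive
ctrl_coh = Cxy[np.argmin(np.abs(f - 40))]    # low: independent noise
```

In this picture, the target-induced desynchronization reported for face targets corresponds to a selective drop in theta-band coherence between the two areas; Granger causality extends the same spectral machinery to ask which direction the theta influence flows.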
Abstract: Attention promotes the selection of behaviorally relevant sensory signals from the barrage of sensory information available. Visual attention modulates the gain of neuronal activity in all visual brain areas examined, although magnitudes of gain modulations vary across areas. For example, attention gain magnitudes in the dorsal lateral geniculate nucleus (LGN) and primary visual cortex (V1) vary tremendously across fMRI measurements in humans and electrophysiological recordings in behaving monkeys. We sought to determine whether these discrepancies are due simply to differences in species or measurement, or more nuanced properties unique to each visual brain area. We also explored whether robust and consistent attention effects, comparable to those measured in humans with fMRI, are observable in the LGN or V1 of monkeys. We measured attentional modulation of multiunit activity in the LGN and V1 of macaque monkeys engaged in a contrast change detection task requiring shifts in covert visual spatial attention. Rigorous analyses of LGN and V1 multiunit activity revealed robust and consistent attentional facilitation throughout V1, with magnitudes comparable to those observed with fMRI. Interestingly, attentional modulation in the LGN was consistently negligible. These findings demonstrate that discrepancies in attention effects are not simply due to species or measurement differences. We also examined whether attention effects correlated with the feature selectivity of recorded multiunits. Distinct relationships suggest that attentional modulation of multiunit activity depends upon the unique structure and function of visual brain areas.
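Attentional facilitation of multiunit activity is commonly summarized with a normalized modulation index comparing attended and unattended responses. The sketch below uses this standard index with made-up firing rates (not the study's data) to illustrate the reported pattern of robust V1 modulation alongside negligible LGN modulation:

```python
import numpy as np

def attention_modulation_index(attended, unattended):
    """(A - U) / (A + U): 0 = no modulation, positive = facilitation."""
    attended = np.asarray(attended, dtype=float)
    unattended = np.asarray(unattended, dtype=float)
    return (attended - unattended) / (attended + unattended)

# Hypothetical mean firing rates (spikes/s) for a few multiunits
v1 = attention_modulation_index([52, 60, 48], [45, 50, 44])     # all positive
lgn = attention_modulation_index([30, 41, 25], [30, 40, 25.5])  # near zero
```

Because the index is bounded in [-1, 1] and normalized by overall response level, it allows modulation magnitudes to be compared across areas (and, with appropriate caveats, against percent-signal-change measures from fMRI).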