Selective attention improves sensory processing of relevant information but can also impact the quality of perception. For example, attention increases visual discrimination performance and at the same time boosts the apparent contrast of attended relative to unattended stimuli. Can attention also lead to perceptual distortions of visual representations? Optimal tuning accounts of attention suggest that processing is biased toward “off-tuned” features to maximize the signal-to-noise ratio in favor of the target, especially when targets and distractors are confusable. Here, we tested whether such tuning gives rise to phenomenological changes of visual features. We instructed participants to select a color among other colors in a visual search display and subsequently asked them to judge the appearance of the target color in a 2-alternative forced-choice task. Participants consistently judged the target color to appear more dissimilar from the distractor color in feature space. Critically, the magnitude of these perceptual biases varied systematically with the similarity between target and distractor colors during search, indicating that attentional tuning quickly adapts to current task demands. In control experiments, we ruled out possible non-attentional explanations such as color contrast or memory effects. Overall, our results demonstrate that selective attention warps the representational geometry of color space, resulting in profound perceptual changes across large swaths of feature space. Broadly, these results indicate that efficient attentional selection can come at a perceptual cost by distorting our sensory experience.
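The off-tuned selection idea can be made concrete with a toy channel model (a minimal sketch with illustrative parameter values, not the study's actual model): under Gaussian color tuning and equal noise across channels, the channel that best separates target from distractor sits away from the target on the side opposite the distractor, and the shift grows as the two colors become more confusable.

```python
import numpy as np

def best_channel(target, distractor, sigma=20.0):
    """Return the preferred color (deg) of the channel that best separates
    target from distractor, assuming Gaussian tuning and equal-variance noise."""
    channels = np.linspace(-60.0, 60.0, 2401)  # candidate channel preferences

    def response(stim):
        return np.exp(-(channels - stim) ** 2 / (2.0 * sigma ** 2))

    # With equal noise in every channel, per-channel discriminability is
    # proportional to the response difference between target and distractor.
    snr = response(target) - response(distractor)
    return channels[np.argmax(snr)]

# A confusable distractor (10 deg away) pushes the best channel well past the
# target; a dissimilar one (80 deg away) leaves it essentially on target.
similar = best_channel(0.0, 10.0)
dissimilar = best_channel(0.0, 80.0)
```

With these illustrative numbers, the best channel for the 10° distractor lands roughly 15° on the far side of the target, while the 80° distractor barely moves it, mirroring the similarity-dependent biases described above.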
Much of our world changes smoothly in time, yet the allocation of attention is typically studied with sudden changes – transients. A sizeable lag in selecting feature information is seen when stimuli change smoothly, yet this lag is absent with temporally uncorrelated rapid serial visual presentation (RSVP) stimuli. This suggests that temporal autocorrelation of a feature paradoxically increases the latency at which information is sampled. To test this, participants were asked to report the color of a disk at the moment a cue was presented. Selection latency increased when the disk’s color changed smoothly compared with when it changed randomly. This increase was due to the smooth color change presented after the cue rather than to extrapolated predictions based on the color changes presented before it. These results support an attentional drag theory, whereby attentional engagement is prolonged when features change smoothly. A computational model provides insights into the potential underlying neural mechanisms.
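A selection lag of this kind can be estimated from behavior with a simple grid search (an illustrative sketch, not the authors' computational model; the function names and values are invented): find the lag at which the hue actually shown on screen best matches observers' reports.

```python
import numpy as np

def estimate_latency(hue_at, cue_times, reports, max_lag=0.5, step=0.01):
    """Grid-search the selection lag (s) that best aligns reported hues
    with the hue on screen `lag` seconds after each cue."""
    lags = np.arange(0.0, max_lag, step)
    errors = [np.mean((reports - hue_at(cue_times + lag)) ** 2) for lag in lags]
    return lags[int(np.argmin(errors))]

# Toy data: a hue drifting smoothly at 30 deg/s, reported ~200 ms late.
hue_at = lambda t: 30.0 * t
cues = np.array([1.0, 2.0, 3.0])
reports = hue_at(cues + 0.2)
lag = estimate_latency(hue_at, cues, reports)
```

The same fit applied to smooth versus random color streams would expose the latency difference the abstract describes.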
- Award ID(s): 1734220
- NSF-PAR ID: 10154005
- Publisher / Repository: Nature Publishing Group
- Date Published:
- Journal Name: Nature Communications
- Volume: 11
- Issue: 1
- ISSN: 2041-1723
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
A prominent theoretical framework spanning philosophy, psychology, and neuroscience holds that selective attention penetrates early stages of perceptual processing to alter the subjective visual experience of behaviorally relevant stimuli. For example, searching for a red apple at the grocery store might make the relevant color appear brighter and more saturated compared with seeing the exact same red apple while searching for a yellow banana. In contrast, recent proposals argue that data supporting attention-related changes in appearance reflect decision- and motor-level response biases without concurrent changes in perceptual experience. Here, we tested these accounts by evaluating attentional modulations of EEG responses recorded from male and female human subjects while they compared the perceived contrast of attended and unattended visual stimuli rendered at different levels of physical contrast. We found that attention enhanced the amplitude of the P1 component, an early evoked potential measured over visual cortex. A linking model based on signal detection theory suggests that response gain modulations of the P1 component track attention-induced changes in perceived contrast as measured with behavior. In contrast, attentional cues induced changes in the baseline amplitude of posterior alpha band oscillations (∼9-12 Hz), an effect that best accounts for cue-induced response biases, particularly when no stimuli are presented or when competing stimuli are similar and decisional uncertainty is high. The observation of dissociable neural markers that are linked to changes in subjective appearance and response bias supports a more unified theoretical account and demonstrates an approach to isolate subjective aspects of selective information processing.
SIGNIFICANCE STATEMENT: Does attention alter visual appearance, or does it simply induce response bias? In the present study, we examined these competing accounts using EEG and linking models based on signal detection theory. We found that response gain modulations of the visually evoked P1 component best accounted for attention-induced changes in visual appearance. In contrast, cue-induced baseline shifts in alpha band activity better explained response biases. Together, these results suggest that attention concurrently impacts visual appearance and response bias, and that these processes can be experimentally isolated.
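The response-gain linking idea can be sketched with a standard Naka-Rushton contrast-response function (parameter values here are illustrative, not the paper's fitted model): attention scales the response, and apparent contrast is read out as the physical contrast an unattended stimulus would need to evoke the same response.

```python
def contrast_response(c, rmax=1.0, c50=0.3, n=2.0):
    # Naka-Rushton contrast-response function (monotonic on [0, 1])
    return rmax * c ** n / (c ** n + c50 ** n)

def perceived_contrast(c_attended, gain=1.2):
    """Response-gain reading of apparent contrast: the physical contrast an
    unattended stimulus would need to match the attention-scaled response
    to the attended stimulus (found by bisection)."""
    target = gain * contrast_response(c_attended)  # assumes target <= response(1.0)
    lo, hi = 0.0, 1.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if contrast_response(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

With these illustrative parameters, a 30%-contrast attended stimulus is matched by an unattended stimulus near 37% contrast, i.e., attention boosts apparent contrast; with gain = 1.0 the match is veridical.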
Prominent theories of visual working memory postulate that the capacity to maintain a particular visual feature is fixed. In contrast to these theories, recent studies have demonstrated that meaningful objects are better remembered than simple, nonmeaningful stimuli. Here, we tested whether this is solely because meaningful stimuli can recruit additional features—and thus more storage capacity—or whether simple visual features that are not themselves meaningful can also benefit from being part of a meaningful object. Across five experiments (30 young adults each), we demonstrated that visual working memory capacity for color is greater when colors are part of recognizable real-world objects compared with unrecognizable objects. Our results indicate that meaningful stimuli provide a potent scaffold to help maintain simple visual feature information, possibly because they effectively increase the objects’ distinctiveness from each other and reduce interference.
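Capacity in change-detection experiments of this kind is conventionally estimated with Cowan's K (a standard formula in this literature, not specific to these five experiments): K = N × (hit rate − false-alarm rate).

```python
def cowan_k(hit_rate, false_alarm_rate, set_size):
    """Cowan's K capacity estimate for single-probe change detection:
    K = N * (hit rate - false-alarm rate)."""
    return set_size * (hit_rate - false_alarm_rate)

# e.g. set size 6 with 85% hits and 15% false alarms -> K = 4.2 items
```

A meaningful-object benefit of the kind reported above would appear as a higher K for colors embedded in recognizable objects than for the same colors on unrecognizable ones.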
Context contributes to multiple aspects of human episodic memory, including segmentation and retrieval. The present studies tested whether, in adult male and female mice, context influences the encoding of odors encountered in a single unsupervised sampling session of the type used for the routine acquisition of episodic memories. The three paradigms used differed in complexity (single vs. multiple odor cues) and in the period from sampling to testing. Results show that males consistently encode odors in a context-dependent manner: the mice discriminated novel from previously sampled cues when tested in the chamber of initial cue sampling but not in a distinct yet familiar chamber. This was independent of the interval between cue encounters or the latency from initial sampling to testing. In contrast, female mice acquired both single cues and the elements of multi-cue episodes, but recall of that information was dependent upon the surrounding context only when the cues were presented serially. These results extend the list of episodic memory features expressed by rodents and also introduce a striking and unexpected sex difference in context effects.
Previously rewarded stimuli slow response times (RTs) during visual search, despite being physically non-salient and no longer task-relevant or rewarding. Such value-driven attentional capture (VDAC) has been measured in a training-test paradigm. In the training phase, the search target is rendered in one of two colors (one predicting high reward and the other low reward). In this study, we modified this traditional training phase to include pre-cues that signaled reliable or unreliable information about the trial-to-trial color of the training-phase search target. Reliable pre-cues indicated the upcoming target color with certainty, whereas unreliable pre-cues indicated that the target was equally likely to be one of two distinct colors. Thus, reliable and unreliable pre-cues provided certain and uncertain information, respectively, about the magnitude of the upcoming reward. We then tested for VDAC in a traditional test phase. We found that unreliably pre-cued distractors slowed RTs and drew more initial eye movements during search for the test-phase target, relative to reliably pre-cued distractors, thus providing novel evidence for an influence of information reliability on attentional capture. That said, our experimental manipulation also eliminated value-dependency (i.e., slowed RTs when a high-reward-predicting distractor was present relative to a low-reward-predicting distractor) for both kinds of distractors. Taken together, these results suggest that target-color uncertainty, rather than reward magnitude, played a critical role in modulating the allocation of value-driven attention in this study.
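The two effects contrasted above reduce to simple RT contrasts (a minimal sketch; the condition labels and example values are invented for illustration): overall capture is the RT cost of any reward-associated distractor, and value-dependency is the extra cost of the high-reward over the low-reward distractor.

```python
from statistics import mean

def vdac_indices(rts):
    """rts: dict mapping 'absent', 'low', 'high' to lists of correct-trial
    RTs (ms). Returns (capture cost, value-dependency), both in ms."""
    # capture: any reward-associated distractor present vs. distractor absent
    capture = mean(rts['high'] + rts['low']) - mean(rts['absent'])
    # value-dependency: high-reward vs. low-reward distractor
    value_dependency = mean(rts['high']) - mean(rts['low'])
    return capture, value_dependency

example = {'absent': [500, 510], 'low': [540, 550], 'high': [560, 570]}
```

For the example data this yields a 50 ms capture cost with a 20 ms value-dependency; the pattern reported above would instead show a reliability-driven capture cost with value-dependency near zero.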