The lateralized ERP N2pc component has been shown to be an effective marker of attentional object selection when elicited in a visual search task, specifically reflecting the selection of a target item among distractors. Moreover, when targets are known in advance, the visual search process is guided by representations of target features held in working memory at the time of search, which direct attention to objects with target-matching features. Previous studies have shown that manipulating working memory availability via concurrent tasks or within-task manipulations influences visual search performance and the N2pc. Other studies have indicated that visual (non-spatial) and spatial working memory manipulations make differential contributions to visual search. To investigate this, the current study assessed participants' visual and spatial working memory ability independently of the visual search task to determine whether such individual differences in working memory affect task performance and the N2pc. Participants (n = 205) completed a visual search task to elicit the N2pc and separate visual working memory (VWM) and spatial working memory (SPWM) assessments. Greater SPWM, but not VWM, ability correlated with and predicted higher visual search accuracy and greater N2pc amplitudes. Neither VWM nor SPWM was related to N2pc latency. These results provide additional support for prior behavioral and neural visual search findings that spatial WM availability, whether as an ability of the participant's processing system or based on task demands, plays an important role in efficient visual search.
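For readers unfamiliar with how the N2pc is quantified, the component is conventionally computed as the contralateral-minus-ipsilateral difference wave at posterior electrodes in a post-stimulus window. The sketch below only illustrates that convention; the electrode pair (PO7/PO8), 200–300 ms window, sampling rate, and variable names are assumptions for the example, not details taken from this abstract.

```python
import numpy as np

# Hypothetical epoched EEG: trials x channels x time points, sampled at
# 500 Hz with a 200 ms pre-stimulus baseline (assumed values).
SRATE = 500
BASELINE_MS = 200

def n2pc_amplitude(epochs, ch_index, target_side, window_ms=(200, 300)):
    """Mean contralateral-minus-ipsilateral amplitude at a posterior
    electrode pair (e.g., PO7/PO8) over a post-stimulus window.

    epochs      : ndarray, shape (n_trials, n_channels, n_times)
    ch_index    : dict mapping electrode names to channel indices
    target_side : array of 'left' / 'right' labels, one per trial
    """
    start = int((BASELINE_MS + window_ms[0]) * SRATE / 1000)
    stop = int((BASELINE_MS + window_ms[1]) * SRATE / 1000)

    left_ch = epochs[:, ch_index["PO7"], start:stop]   # left-hemisphere site
    right_ch = epochs[:, ch_index["PO8"], start:stop]  # right-hemisphere site

    # Contralateral = hemisphere opposite the target side; ipsilateral = same side.
    is_left_target = (np.asarray(target_side) == "left")[:, None]
    contra = np.where(is_left_target, right_ch, left_ch)
    ipsi = np.where(is_left_target, left_ch, right_ch)

    # The N2pc is the mean contra-minus-ipsi difference (a negative-going deflection).
    return float((contra - ipsi).mean())
```

Per-participant amplitudes obtained in roughly this way could then be correlated with, or regressed on, separate SPWM and VWM scores, which is the kind of individual-differences analysis the abstract describes.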
Multiple states in visual working memory: Evidence from oculomotor capture by memory-matching distractors
Visual working memory (VWM) representations interact with attentional guidance, but there is controversy over whether multiple VWM items simultaneously influence attentional guidance. Extant studies relied on continuous variables like response times, which can obscure capture – especially if VWM representations cycle through interactive and non-interactive states. Previous conflicting findings regarding guidance under high working memory (WM) load may be due to the use of noisier response time measures that mix capture and non-capture trials. Thus, we employed an oculomotor paradigm to characterize discrete attentional capture events under both high and low VWM load. Participants held one or two colors in memory, then executed a saccade to a target disk. On some trials, a distractor (sometimes VWM-matching) appeared simultaneously with the target. Eye movements were more frequently directed to a VWM-matching than a non-matching distractor for both load conditions. However, oculomotor capture by a VWM-matching distractor occurred less frequently under high compared with low load. These results suggest that attention is automatically guided toward items matching only one of two colors held in memory at a time, indicating that items in VWM may cycle through attention-guiding and not-guiding states when more than one item is held in VWM and the task does not require that multiple items be maintained in an active, attention-guiding state.
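The dependent measure in this paradigm is categorical: whether the first saccade lands on the distractor rather than the target. A minimal sketch of how such capture rates might be tabulated per condition is shown below; the column names and example rows are illustrative assumptions, not the study's data.

```python
import pandas as pd

# Illustrative trial-level data: one row per distractor-present trial.
trials = pd.DataFrame({
    "load":            ["low", "low", "high", "high", "low", "high"],
    "distractor_type": ["match", "nonmatch", "match", "nonmatch", "match", "match"],
    "first_saccade":   ["distractor", "target", "target", "target",
                        "distractor", "distractor"],
})

# Oculomotor capture = first saccade directed to the distractor.
trials["captured"] = trials["first_saccade"] == "distractor"

# Capture rate per memory load and distractor type; the comparisons of interest
# are match vs. non-match within each load, and match-trial capture under
# high vs. low load.
capture_rates = (trials
                 .groupby(["load", "distractor_type"])["captured"]
                 .mean())
print(capture_rates)
```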
- Award ID(s): 1632849
- PAR ID: 10147315
- Date Published:
- Journal Name: Psychonomic Bulletin & Review
- Volume: 26
- ISSN: 1531-5320
- Page Range / eLocation ID: 1340–1346
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Selective attention improves sensory processing of relevant information but can also impact the quality of perception. For example, attention increases visual discrimination performance and at the same time boosts apparent stimulus contrast of attended relative to unattended stimuli. Can attention also lead to perceptual distortions of visual representations? Optimal tuning accounts of attention suggest that processing is biased towards “off-tuned” features to maximize the signal-to-noise ratio in favor of the target, especially when targets and distractors are confusable. Here, we tested whether such tuning gives rise to phenomenological changes of visual features. We instructed participants to select a color among other colors in a visual search display and subsequently asked them to judge the appearance of the target color in a 2-alternative forced choice task. Participants consistently judged the target color to appear more dissimilar from the distractor color in feature space. Critically, the magnitude of these perceptual biases varied systematically with the similarity between target and distractor colors during search, indicating that attentional tuning quickly adapts to current task demands. In control experiments we ruled out possible non-attentional explanations such as color contrast or memory effects. Overall, our results demonstrate that selective attention warps the representational geometry of color space, resulting in profound perceptual changes across large swaths of feature space. Broadly, these results indicate that efficient attentional selection can come at a perceptual cost by distorting our sensory experience.
Previously rewarded stimuli slow response times (RTs) during visual search, despite being physically non-salient and no longer task-relevant or rewarding. Such value-driven attentional capture (VDAC) has been measured in a training-test paradigm. In the training phase, the search target is rendered in one of two colors (one predicting high reward and the other low reward). In this study, we modified this traditional training phase to include pre-cues that signaled reliable or unreliable information about the trial-to-trial color of the training phase search target. Reliable pre-cues indicated the upcoming target color with certainty, whereas unreliable pre-cues indicated the target was equally likely to be one of two distinct colors. Thus reliable and unreliable pre-cues provided certain and uncertain information, respectively, about the magnitude of the upcoming reward. We then tested for VDAC in a traditional test phase. We found that unreliably pre-cued distractors slowed RTs and drew more initial eye movements during search for the test-phase target, relative to reliably pre-cued distractors, thus providing novel evidence for an influence of information reliability on attentional capture. That said, our experimental manipulation also eliminated value-dependency (i.e., slowed RTs when a high-reward-predicting distractor was present relative to a low-reward-predicting distractor) for both kinds of distractors. Taken together, these results suggest that target-color uncertainty, rather than reward magnitude, played a critical role in modulating the allocation of value-driven attention in this study.
Multiple types of memory guide attention: Both long-term memory (LTM) and working memory (WM) effectively guide visual search. Furthermore, both types of memories can capture attention automatically, even when detrimental to performance. It is less clear, however, how LTM and WM cooperate or compete to guide attention in the same task. In a series of behavioral experiments, we show that LTM and WM reliably cooperate to guide attention: Visual search is faster when both memories cue attention to the same spatial location (relative to when only one memory can guide attention). LTM and WM competed to guide attention in more limited circumstances: Competition only occurred when these memories were in different dimensions – particularly when participants searched for a shape and held an accessory color in mind. Finally, we found no evidence for asymmetry in either cooperation or competition: There was no evidence that WM helped (or hindered) LTM-guided search more than the other way around. This lack of asymmetry was found despite differences in LTM-guided and WM-guided search overall, and differences in how two LTMs and two WMs compete or cooperate with each other to guide attention. This work suggests that, even if only one memory is currently task-relevant, WM and LTM can cooperate to guide attention; they can also compete when distracting features are salient enough. This work elucidates interactions between WM and LTM during attentional guidance, adding to the literature on costs and benefits to attention from multiple active memories.
Cognitive processes have been found to contribute substantially to the human errors that lead to construction accidents. Working memory—a cognitive system with a limited capacity that is responsible for temporarily holding information available for processing—plays an important role in reasoning and decision-making. Since eye movements indicate where a worker directs his/her attention, tracking such movements provides a practical way to measure workers’ attention and comprehension of construction hazards. As a departure in construction industry research, this study correlates attentional allocation with working memory to assess workers’ situation awareness under different scenarios that expose workers to various hazards. To achieve this goal, this study merges research linking eye movements and workers’ attention with research focused on working-memory load and decision making, and evaluates what, how, and where a worker distributes his/her attention while performing a task under different working-memory loads. Path analysis models then examined the direct and indirect effects of different working-memory loads on hazard identification performance. The independent variable (working-memory load) is linked to the dependent variable (hazard identification) through the set of mediators (attention metrics). The results showed that the high-memory-load condition delayed workers’ hazard identification. The findings of this study emphasize the important role working memory plays in determining how and why workers in dynamic work environments fail to detect, comprehend, and/or respond to physical risks.
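The path-analytic logic described above decomposes the total effect of working-memory load on hazard identification into a direct effect and an indirect effect carried through the attention-metric mediators. The following is only a minimal single-mediator sketch of that product-of-coefficients logic with placeholder variables and simulated data, not the study's model or data.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200

# Placeholder variables: 0 = low working-memory load, 1 = high load.
load = rng.integers(0, 2, n)
# One attention metric (e.g., dwell time on hazards) as the mediator.
attention = 0.5 * load + rng.normal(size=n)
# Hazard-identification score as the outcome.
hazard_id = -0.3 * load + 0.6 * attention + rng.normal(size=n)

# Path a: load -> mediator.
a = sm.OLS(attention, sm.add_constant(load)).fit().params[1]

# Path b (mediator -> outcome) and c' (direct effect of load),
# estimated with both predictors in the same model.
X = sm.add_constant(np.column_stack([load, attention]))
fit = sm.OLS(hazard_id, X).fit()
c_prime, b = fit.params[1], fit.params[2]

indirect = a * b            # effect of load carried through attention
total = c_prime + indirect  # total effect = direct + indirect
print(f"direct={c_prime:.2f}, indirect={indirect:.2f}, total={total:.2f}")
```

The study uses a set of attention metrics as mediators; the same decomposition extends to each mediator path in a multiple-mediator model.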