Abstract: Perceptual learning can significantly improve visual sensitivity even in fully matured adults. However, the ability to generalize learning to untrained conditions is often limited. While traditionally, perceptual learning is attributed to practice-dependent plasticity mechanisms, recent studies suggest that brief memory reactivations can efficiently improve visual perception, recruiting higher-level brain regions. Here we provide evidence that similar memory reactivation mechanisms promote generalization of offline learning mechanisms. Human participants encoded a visual discrimination task with the target stimulus at retinotopic location A. Then, brief memory reactivations of only five trials each were performed on separate days at location A. Generalization was tested at retinotopic location B. Results indicate remarkable enhancement of location B performance following memory reactivations, pointing to efficient offline generalization mechanisms. A control experiment with no reactivations showed minimal generalization. These findings suggest that reactivation-induced learning further enhances learning efficiency by promoting offline generalization mechanisms to untrained conditions, and can be further tested in additional learning domains, with potential future clinical implications.
Behavioral asymmetries in visual short-term memory occur in retinotopic coordinates
Visual short-term memory (VSTM) is an essential store that creates continuous representations from disjointed visual input. However, severe capacity limits exist, reflecting constraints in supporting brain networks. VSTM performance shows spatial biases predicted by asymmetries in the brain based upon the location of the remembered object. Visual representations are retinotopic, or relative to the location of the representation on the retina. It therefore stands to reason that memory performance may also show retinotopic biases. Here, eye position was manipulated to tease apart retinotopic coordinates from spatiotopic coordinates, or location relative to the external world. Memory performance was measured while participants performed a color change-detection task for items presented across the visual field while subjects fixated a central or peripheral position. VSTM biases reflected the location of the stimulus on the retina, regardless of where the stimulus appeared on the screen. Therefore, spatial biases occur in retinotopic coordinates in VSTM, suggesting a fundamental link between behavioral VSTM measures and visual representations.
- Award ID(s): 1921415
- PAR ID: 10533417
- Publisher / Repository: Springer
- Date Published:
- Journal Name: Attention, Perception, & Psychophysics
- Volume: 85
- Issue: 1
- ISSN: 1943-3921
- Page Range / eLocation ID: 113 to 119
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Successful interaction with the environment requires the ability to flexibly allocate resources to different locations in the visual field. Recent evidence suggests that visual short-term memory (VSTM) resources are distributed asymmetrically across the visual field based upon task demands. Here, we propose that context, rather than the stimulus itself, determines the asymmetrical distribution of VSTM resources. To test whether context modulates the reallocation of resources to the right visual field, task set, defined by memory load, was manipulated to influence visual short-term memory performance. Performance was measured for single-feature objects embedded within predominantly single- or two-feature memory blocks. Therefore, context was varied to determine whether task set directly predicts changes in visual field biases. In accord with the dynamic reallocation of resources hypothesis, task set, rather than aspects of the physical stimulus, drove improvements in performance in the right visual field. Our results show, for the first time, that preparation for upcoming memory demands directly determines how resources are allocated across the visual field.
-
Here, we report on the long-term stability of changes in behavior and brain activity following perceptual learning of conjunctions of simple motion features. Participants were trained for 3 weeks on a visual search task involving the detection of a dot moving in a “v”-shaped target trajectory among inverted “v”-shaped distractor trajectories. The first and last training sessions were carried out during functional magnetic resonance imaging (fMRI). Learning stability was again examined behaviorally and using fMRI 3 years after the end of training. Results show that acquired behavioral improvements were remarkably stable over time and that these changes were specific to trained target and distractor trajectories. A similar pattern was observed on the neuronal level, when the representation of target and distractor stimuli was examined in early retinotopic visual cortex (V1–V3): training enhanced activity for the target relative to the surrounding distractors in the search array and this enhancement persisted after 3 years. However, exchanging target and distractor trajectories abolished both neuronal and behavioral effects, suggesting that training-induced changes in stimulus representation are specific to trained stimulus identities.
-
Abstract: Investigations into how individual neurons encode behavioral variables of interest have revealed specific representations in single neurons, such as place and object cells, as well as a wide range of cells with conjunctive encodings or mixed selectivity. However, as most experiments examine neural activity within individual tasks, it is currently unclear if and how neural representations change across different task contexts. Within this discussion, the medial temporal lobe is particularly salient, as it is known to be important for multiple behaviors, including spatial navigation and memory; however, the relationship between these functions is currently unclear. Here, to investigate how representations in single neurons vary across different task contexts in the medial temporal lobe, we collected and analyzed single‐neuron activity from human participants as they completed a paired‐task session consisting of a passive‐viewing visual working memory task and a spatial navigation and memory task. Five patients contributed 22 paired‐task sessions, which were spike sorted together to allow for the same putative single neurons to be compared between the different tasks. Within each task, we replicated concept‐related activations in the working memory task, as well as target‐location and serial‐position responsive cells in the navigation task. When comparing neuronal activity between tasks, we first established that a significant number of neurons maintained the same kind of representation, responding to stimulus presentations across tasks. Further, we found cells that changed the nature of their representation across tasks, including a significant number of cells that were stimulus responsive in the working memory task and responded to serial position in the spatial task. Overall, our results support a flexible encoding of multiple, distinct aspects of different tasks by single neurons in the human medial temporal lobe, whereby some individual neurons change the nature of their feature coding between task contexts.
-
Our visual system is fundamentally retinotopic. When viewing a stable scene, each eye movement shifts object features and locations on the retina. Thus, sensory representations must be updated, or remapped, across saccades to align presaccadic and postsaccadic inputs. The earliest remapping studies focused on anticipatory, presaccadic shifts of neuronal spatial receptive fields. Over time, it has become clear that there are multiple forms of remapping and that different forms of remapping may be mediated by different neural mechanisms. This review attempts to organize the various forms of remapping into a functional taxonomy based on experimental data and ongoing debates about forward versus convergent remapping, presaccadic versus postsaccadic remapping, and spatial versus attentional remapping. We integrate findings from primate neurophysiological, human neuroimaging and behavioral, and computational modeling studies. We conclude by discussing persistent open questions related to remapping, with specific attention to binding of spatial and featural information during remapping and speculations about remapping's functional significance. Expected final online publication date for the Annual Review of Vision Science, Volume 7 is September 2021. Please see http://www.annualreviews.org/page/journal/pubdates for revised estimates.