In the natural environment, we often form stable perceptual experiences from ambiguous and fleeting sensory inputs. It remains an open question which neural activity underlies the content of perception and which supports perceptual stability. We used a bistable perception paradigm involving ambiguous images to behaviorally dissociate perceptual content from perceptual stability, and magnetoencephalography to measure whole-brain neural dynamics in humans. Combining multivariate decoding and neural state-space analyses, we found frequency-band-specific neural signatures that underlie the content of perception and promote perceptual stability, respectively. Across different types of images, non-oscillatory neural activity in the slow cortical potential (<5 Hz) range supported the content of perception. Perceptual stability was additionally influenced by the amplitude of alpha and beta oscillations. In addition, neural activity underlying perceptual memory, which supports perceptual stability when sensory input is temporarily removed from view, also encodes elapsed time. Together, these results reveal distinct neural mechanisms that support the content versus stability of visual perception.
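To make the decoding approach concrete, here is a minimal Python sketch of how slow-cortical-potential-range activity might be isolated from epoched MEG data and used to decode the reported percept. All names, array shapes, and data are hypothetical stand-ins; the published analysis is substantially richer (time-resolved decoding, state-space trajectories), so this is an illustration of the technique, not the study's pipeline.

```python
# A minimal sketch, assuming hypothetical epoched MEG data: low-pass filter
# into the slow cortical potential (SCP, <5 Hz) range, then decode the
# reported percept from sensor-space activity. Shapes and data are stand-ins.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs = 250                                    # sampling rate (Hz), assumed
n_trials, n_sensors, n_times = 200, 64, 500
X = rng.standard_normal((n_trials, n_sensors, n_times))  # stand-in MEG epochs
y = rng.integers(0, 2, n_trials)            # reported percept (0/1) per trial

def scp_range(data, cutoff=5.0, fs=250, order=4):
    """Zero-phase low-pass filter isolating the SCP (<5 Hz) range."""
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, data, axis=-1)

feats = scp_range(X, fs=fs).mean(axis=-1)   # (trials, sensors) features
clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, feats, y, cv=5).mean()
print(f"SCP-range decoding accuracy: {acc:.2f}")  # ~chance for random data
```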
Neural oscillations promoting perceptual stability and perceptual memory during bistable perception
Abstract: Ambiguous images elicit bistable perception, wherein periods of momentary perceptual stability are interrupted by sudden perceptual switches. When presented intermittently, ambiguous images trigger a perceptual memory trace in the intervening blank periods. Understanding the neural bases of perceptual stability and perceptual memory during bistable perception may hold clues for explaining the apparent stability of visual experience in the natural world, where ambiguous and fleeting images are prevalent. Motivated by recent work showing the involvement of the right inferior frontal gyrus (rIFG) in bistable perception, we conducted a transcranial direct-current stimulation (tDCS) study with a double-blind, within-subject cross-over design to test a potential causal role of rIFG in these processes. Subjects viewed ambiguous images presented continuously or intermittently while EEG was recorded. We did not find any significant tDCS effect on perceptual behavior. However, fluctuations of oscillatory power in the alpha and beta bands predicted perceptual stability, with higher power corresponding to longer percept durations. In addition, higher alpha and beta power predicted enhanced perceptual memory during intermittent viewing. These results reveal a unified neurophysiological mechanism that sustains perceptual stability and perceptual memory when the visual system is faced with ambiguous input.
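A sketch of the kind of power-to-duration analysis this abstract describes, under stated assumptions: band power is taken as the mean Hilbert-envelope power in assumed alpha (8-12 Hz) and beta (13-30 Hz) bands and rank-correlated with percept durations. The percept-locked EEG epochs, durations, and band limits below are illustrative placeholders, not the study's exact parameters.

```python
# Illustrative sketch relating oscillatory band power to percept duration,
# assuming hypothetical percept-locked EEG epochs and duration estimates.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
fs = 250
n_percepts, n_times = 150, fs * 2             # 2 s of EEG per stable percept
eeg = rng.standard_normal((n_percepts, n_times))
durations = rng.exponential(2.0, n_percepts)  # percept durations (s)

def band_power(data, lo, hi, fs, order=4):
    """Mean Hilbert-envelope power within a frequency band."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, data, axis=-1)
    return (np.abs(hilbert(filtered, axis=-1)) ** 2).mean(axis=-1)

alpha = band_power(eeg, 8, 12, fs)            # alpha power per percept
beta = band_power(eeg, 13, 30, fs)            # beta power per percept

# The reported relationship: higher power ~ longer percept durations
for name, power in [("alpha", alpha), ("beta", beta)]:
    rho, p = spearmanr(power, durations)
    print(f"{name}: rho={rho:+.2f}, p={p:.3f}")
```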
- Award ID(s): 1753218
- PAR ID: 10381772
- Publisher / Repository: Nature Publishing Group
- Date Published:
- Journal Name: Scientific Reports
- Volume: 12
- Issue: 1
- ISSN: 2045-2322
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Each view of our environment captures only a subset of our immersive surroundings. Yet, our visual experience feels seamless. A puzzle for human neuroscience is to determine what cognitive mechanisms enable us to overcome our limited field of view and efficiently anticipate new views as we sample our visual surroundings. Here, we tested whether memory-based predictions of upcoming scene views facilitate efficient perceptual judgments across head turns. We tested this hypothesis using immersive, head-mounted virtual reality (VR). After learning a set of immersive real-world environments, participants (n = 101 across 4 experiments) were briefly primed with a single view from a studied environment and then turned left or right to make a perceptual judgment about an adjacent scene view. We found that participants’ perceptual judgments were faster when they were primed with images from the same (vs. neutral or different) environments. Importantly, priming required memory: it only occurred in learned (vs. novel) environments, where the link between adjacent scene views was known. Further, consistent with a role in supporting active vision, priming only occurred in the direction of planned head turns and only benefited judgments for scene views presented in their learned spatiotopic positions. Taken together, we propose that memory-based predictions facilitate rapid perception across large-scale visual actions, such as head and body movements, and may be critical for efficient behavior in complex immersive environments.
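For illustration only, a toy sketch of the core priming contrast (faster judgments after same-environment primes) as a within-subject comparison of mean reaction times. The per-subject RTs are simulated placeholders; the actual study's conditions and analyses go well beyond this paired comparison.

```python
# Toy sketch of the priming contrast: are perceptual judgments faster after
# same-environment primes? Simulated per-subject mean RTs, not real data.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(3)
n_subjects = 101

# Hypothetical mean reaction times (s) per subject and prime condition
rt_same = rng.normal(0.75, 0.10, n_subjects)       # same-environment prime
rt_different = rng.normal(0.82, 0.10, n_subjects)  # different-environment prime

t, p = ttest_rel(rt_same, rt_different)
print(f"same vs. different primes: t({n_subjects - 1})={t:.2f}, p={p:.3f}")
```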
A degraded, black-and-white image of an object, which appears meaningless on first presentation, is easily identified after a single exposure to the original, intact image. This striking example of perceptual learning reflects a rapid (one-trial) change in performance, but the kind of learning that is involved is not known. We asked whether this learning depends on conscious (hippocampus-dependent) memory for the images that have been presented or on an unconscious (hippocampus-independent) change in the perception of images, independently of the ability to remember them. We tested five memory-impaired patients with hippocampal lesions or larger medial temporal lobe (MTL) lesions. In comparison to volunteers, the patients were fully intact at perceptual learning, and their improvement persisted without decrement from 1 d to more than 5 mo. Yet, the patients were impaired at remembering the test format and, even after 1 d, were impaired at remembering the images themselves. To compare perceptual learning and remembering directly, at 7 d after seeing degraded images and their solutions, patients and volunteers took either a naming test or a recognition memory test with these images. The patients improved as much as the volunteers at identifying the degraded images but were severely impaired at remembering them. Notably, the patient with the most severe memory impairment and the largest MTL lesions performed worse than the other patients on the memory tests but was the best at perceptual learning. The findings show that one-trial, long-lasting perceptual learning relies on hippocampus-independent (nondeclarative) memory, independent of any requirement to consciously remember.
Abstract: Perceptual learning can significantly improve visual sensitivity even in fully matured adults. However, the ability to generalize learning to untrained conditions is often limited. While perceptual learning is traditionally attributed to practice-dependent plasticity mechanisms, recent studies suggest that brief memory reactivations can efficiently improve visual perception, recruiting higher-level brain regions. Here we provide evidence that similar memory reactivations promote offline generalization of learning. Human participants encoded a visual discrimination task with the target stimulus at retinotopic location A. Then, brief memory reactivations of only five trials each were performed on separate days at location A. Generalization was tested at retinotopic location B. Results indicate a remarkable enhancement of performance at location B following memory reactivations, pointing to efficient offline generalization mechanisms. A control experiment with no reactivations showed minimal generalization. These findings suggest that reactivation-induced learning enhances learning efficiency by promoting offline generalization to untrained conditions; this approach can be tested in additional learning domains and may have future clinical implications.
Abstract: A listener's interpretation of a given speech sound can vary probabilistically from moment to moment. Previous experience (i.e., the contexts in which one has encountered an ambiguous sound) can further influence the interpretation of speech, a phenomenon known as perceptual learning for speech. This study used multivoxel pattern analysis to query how neural patterns reflect perceptual learning, leveraging archival fMRI data from a lexically guided perceptual learning study conducted by Myers and Mesite [Myers, E. B., & Mesite, L. M. Neural systems underlying perceptual adjustment to non-standard speech tokens. Journal of Memory and Language, 76, 80–93, 2014]. In that study, participants first heard ambiguous /s/–/∫/ blends in either /s/-biased lexical contexts (epi_ode) or /∫/-biased contexts (refre_ing); subsequently, they performed a phonetic categorization task on tokens from an /asi/–/a∫i/ continuum. In the current work, a classifier was trained to distinguish between phonetic categorization trials in which participants heard unambiguous productions of /s/ and those in which they heard unambiguous productions of /∫/. The classifier was able to generalize this training to ambiguous tokens from the middle of the continuum on the basis of individual participants' trial-by-trial perception. We take these findings as evidence that perceptual learning for speech involves neural recalibration, such that the pattern of activation approximates the perceived category. Exploratory analyses showed that left parietal regions (supramarginal and angular gyri) and right temporal regions (superior, middle, and transverse temporal gyri) were most informative for categorization. Overall, our results inform an understanding of how moment-to-moment variability in speech perception is encoded in the brain.
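A schematic sketch of the cross-generalization logic described above: train a classifier on trials with unambiguous productions, then ask whether its predictions on ambiguous tokens track listeners' trial-by-trial reports. The voxel patterns and labels below are simulated placeholders, not the archival fMRI data, and the classifier choice is an assumption.

```python
# Minimal sketch of cross-generalization MVPA: fit on unambiguous trials,
# test whether predictions on ambiguous tokens match reported percepts.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_voxels = 300

# Training set: trials with unambiguous /s/ (0) or /sh/ (1) productions
X_train = rng.standard_normal((120, n_voxels))
y_train = rng.integers(0, 2, 120)

# Test set: ambiguous mid-continuum tokens, with the listener's
# trial-by-trial report of the perceived category
X_ambig = rng.standard_normal((60, n_voxels))
perceived = rng.integers(0, 2, 60)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = clf.predict(X_ambig)

# Generalization: do predicted categories track perceived categories?
match = (pred == perceived).mean()
print(f"classifier-percept agreement on ambiguous tokens: {match:.2f}")
```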