We continue the study, begun in [5], of how concepts with hierarchical structure might be represented in brain-like neural networks, how these representations might be used to recognize the concepts, and how they might be learned. In [5] we considered simple tree-structured concepts and feed-forward layered networks. Here we extend the model in two ways: we allow limited overlap between children of different concepts, and we allow networks to include feedback edges. For these more general cases, we describe and analyze algorithms for recognition and algorithms for learning.
Virtual Reality as a Context for Adaptation
The COVID-19 pandemic has accelerated interest in virtual reality (VR) for education, entertainment, telerehabilitation, and skills training. As the frequency and duration of VR engagement increase (the number of people in the United States using VR at least once per month is forecast to exceed 95 million), it is critical to understand how VR engagement influences brain and behavior. Here, we evaluate the neurophysiological effects of sensory conflicts induced by VR engagement and posit an intriguing hypothesis: the brain processes VR as a unique “context,” leading to the formation and maintenance of independent sensorimotor representations. We discuss known VR-induced sensorimotor adaptations to illustrate how VR might manifest as a context for learning and how technological and human factors might mediate the context-dependency of sensorimotor representations learned in VR.
- PAR ID: 10357673
- Date Published:
- Journal Name: Frontiers in Virtual Reality
- Volume: 2
- ISSN: 2673-4192
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Virtual reality (VR) has high potential to facilitate education. However, the design of many VR learning applications has been criticized for lacking the guidance of explicit and appropriate learning theories. To advance the use of VR in effective instruction, this study proposed a model that extended the cognitive-affective theory of learning with media (CATLM) into a VR learning context and evaluated this model using a structural equation modeling (SEM) approach. Undergraduate students (n = 77) learned about the solar system in a VR environment over three sessions. Overall, the results supported the core principles and assumptions of CATLM in a VR context (CATLM-VR). In addition, the CATLM-VR model illustrated how immersive VR may impact learning. Specifically, immersion had an overall positive impact on user experience and motivation. However, the impact of immersion on cognitive load was uncertain, and that uncertainty made the final learning outcomes less predictable. Enhancing students’ motivation and cognitive engagement may increase learning achievement more directly than increasing the level of immersion, and may be more universally applicable in VR instruction.
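To make the path relations summarized above concrete, below is a minimal sketch of how such a structural model could be specified and fit in Python. The variable names, the specific paths, and the use of the third-party semopy package are illustrative assumptions, not the study's actual CATLM-VR specification, and the data are synthetic stand-ins.

```python
# Illustrative sketch only: variable names, the path structure, and the use of the
# third-party semopy package are assumptions, not the study's actual CATLM-VR model.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(0)
n = 77  # sample size reported in the record

# Synthetic stand-in data; a real analysis would use the measured questionnaire
# and test scores for each participant and session.
immersion = rng.normal(size=n)
user_experience = 0.5 * immersion + rng.normal(scale=0.8, size=n)
motivation = 0.4 * immersion + 0.3 * user_experience + rng.normal(scale=0.8, size=n)
cognitive_load = 0.1 * immersion + rng.normal(scale=1.0, size=n)
learning = 0.5 * motivation - 0.3 * cognitive_load + rng.normal(scale=0.8, size=n)
data = pd.DataFrame(dict(immersion=immersion, user_experience=user_experience,
                         motivation=motivation, cognitive_load=cognitive_load,
                         learning=learning))

# Structural part sketched from the relations summarized above: immersion feeds
# user experience and motivation, its effect on cognitive load is left free,
# and motivation and cognitive load predict learning.
desc = """
user_experience ~ immersion
motivation ~ immersion + user_experience
cognitive_load ~ immersion
learning ~ motivation + cognitive_load
"""

model = semopy.Model(desc)
model.fit(data)
print(model.inspect())           # path estimates and p-values
print(semopy.calc_stats(model))  # fit indices such as CFI and RMSEA
```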
Learned olfactory-guided navigation is a powerful platform for studying how a brain generates goal-directed behaviors. However, the quantitative changes in sensorimotor transformations, and the underlying neural circuit substrates that generate such learning-dependent navigation, remain unclear. Here we investigate learned sensorimotor processing for navigation in the nematode Caenorhabditis elegans by measuring and modeling experience-dependent odor and salt chemotaxis. We then explore the neural basis of learned odor navigation through perturbation experiments. We develop a novel statistical model to characterize how the worm employs two behavioral strategies: a biased random walk and weathervaning. We infer weights on these strategies and characterize the sensorimotor kernels that govern them by fitting our model to the worm’s time-varying navigation trajectories and precise sensory experiences. After olfactory learning, the fitted odor kernels reflect how appetitive- and aversive-trained worms up- and down-regulate both strategies, respectively. The model predicts an animal’s past olfactory learning experience with >90% accuracy given finite observations, outperforming a classical chemotaxis metric. The model trained on natural odors further predicts the animals’ learning-dependent response to optogenetically induced odor perception. Our measurements and model show that behavioral variability is altered by learning: trained worms exhibit less variable navigation than naive ones. Genetically disrupting individual interneuron classes downstream of an odor-sensing neuron reveals that learned navigation strategies are distributed in the network. Together, we present a flexible navigation algorithm that is supported by distributed neural computation in a compact brain.
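The two-strategy model summarized above can be illustrated with a small simulation: a biased random walk whose turning probability is driven by a kernel-filtered odor history, plus a weathervaning term that gradually steers toward the local gradient. The odor landscape, kernel shape, and strategy weights below are assumptions for demonstration, not the fitted quantities from the study.

```python
# Illustrative sketch of a two-strategy chemotaxis model (biased random walk +
# weathervaning). Parameters, kernel shape, and weights are demonstration values.
import numpy as np

rng = np.random.default_rng(0)

def odor(pos, source=np.array([5.0, 0.0])):
    """Assumed smooth odor landscape: a Gaussian bump around a point source."""
    return float(np.exp(-np.sum((pos - source) ** 2) / 10.0))

def simulate(w_brw=1.0, w_wv=1.0, steps=2000, dt=0.1):
    """Simulate one 2D trajectory; w_brw and w_wv weight the two strategies."""
    kernel = np.exp(-np.arange(0, 5, dt))      # assumed exponential odor kernel
    kernel /= kernel.sum()
    pos, heading = np.zeros(2), 0.0
    samples, traj = [], [pos.copy()]
    for _ in range(steps):
        samples.append(odor(pos))
        recent = np.array(samples[-len(kernel):])
        # Kernel-filtered odor history; positive when odor has recently increased.
        drive = np.dot(kernel[:len(recent)], recent[::-1] - recent.mean())
        # Strategy 1: biased random walk -- sharp reorientations become rarer
        # when the filtered odor signal is rising.
        p_turn = 1.0 / (1.0 + np.exp(w_brw * 20.0 * drive))
        if rng.random() < p_turn * dt:
            heading = rng.uniform(-np.pi, np.pi)
        # Strategy 2: weathervaning -- gradual steering toward the local gradient.
        eps = 1e-2
        grad = np.array([odor(pos + np.array([eps, 0.0])) - odor(pos - np.array([eps, 0.0])),
                         odor(pos + np.array([0.0, eps])) - odor(pos - np.array([0.0, eps]))]) / (2 * eps)
        desired = np.arctan2(grad[1], grad[0])
        heading += w_wv * np.sin(desired - heading) * dt
        pos = pos + 0.5 * dt * np.array([np.cos(heading), np.sin(heading)])
        traj.append(pos.copy())
    return np.array(traj)

# In this toy parameterization, appetitive vs. aversive training maps to
# up- vs. down-weighted strategies, echoing the up-/down-regulation described above.
appetitive_traj = simulate(w_brw=1.5, w_wv=1.5)
aversive_traj = simulate(w_brw=0.3, w_wv=0.3)
print(appetitive_traj.shape)  # (2001, 2)
```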
Intelligent Virtual Agents (IVAs) have received enormous attention in recent years due to significant improvements in voice communication technologies and the convergence of different research fields such as Machine Learning, Internet of Things, and Virtual Reality (VR). Interactive conversational IVAs can appear in different forms, such as voice-only or with embodied audio-visual representations showing, for example, human-like, contextually related or generic three-dimensional bodies. In this paper, we analyzed the benefits of different forms of virtual agents in the context of a VR exhibition space. Our results provide positive evidence for large benefits of both embodied and thematically related audio-visual representations of IVAs. We discuss implications and suggestions for content developers to design believable virtual agents in the context of such installations.
It has been postulated that the brain is organized by “metamodal,” sensory-independent cortical modules capable of performing tasks (e.g., word recognition) in both “standard” and novel sensory modalities. Still, this theory has primarily been tested in sensory-deprived individuals, with mixed evidence in neurotypical subjects, thereby limiting its support as a general principle of brain organization. Critically, current theories of metamodal processing do not specify requirements for successful metamodal processing at the level of neural representations. Specification at this level may be particularly important in neurotypical individuals, where novel sensory modalities must interface with existing representations for the standard sense. Here we hypothesized that effective metamodal engagement of a cortical area requires congruence between stimulus representations in the standard and novel sensory modalities in that region. To test this, we first used fMRI to identify bilateral auditory speech representations. We then trained 20 human participants (12 female) to recognize vibrotactile versions of auditory words using one of two auditory-to-vibrotactile algorithms. The vocoded algorithm attempted to match the encoding scheme of auditory speech, while the token-based algorithm did not. Crucially, using fMRI, we found that only in the vocoded group did trained vibrotactile stimuli recruit speech representations in the superior temporal gyrus and lead to increased coupling between them and somatosensory areas. Our results advance our understanding of brain organization by providing new insight into unlocking the brain's metamodal potential, thereby benefiting the design of novel sensory substitution devices that aim to tap into existing processing streams in the brain.
SIGNIFICANCE STATEMENT: It has been proposed that the brain is organized by “metamodal,” sensory-independent modules specialized for performing certain tasks. This idea has inspired therapeutic applications, such as sensory substitution devices, for example, enabling blind individuals “to see” by transforming visual input into soundscapes. Yet other studies have failed to demonstrate metamodal engagement. Here, we tested the hypothesis that metamodal engagement in neurotypical individuals requires matching the encoding schemes between stimuli from the novel and standard sensory modalities. We trained two groups of subjects to recognize words generated by one of two auditory-to-vibrotactile transformations. Critically, only vibrotactile stimuli that were matched to the neural encoding of auditory speech engaged auditory speech areas after training. This suggests that matching encoding schemes is critical to unlocking the brain's metamodal potential.
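As a rough illustration of the "vocoded" idea referenced above (re-expressing the spectro-temporal envelope of speech on a small set of vibrotactile channels), here is a minimal sketch. The band edges, carrier frequencies, channel count, and scipy-based filtering are assumptions for demonstration and are not the auditory-to-vibrotactile algorithm used in the study.

```python
# Illustrative sketch of a vocoder-style auditory-to-vibrotactile mapping:
# band-pass the speech signal, extract each band's amplitude envelope, and
# use the envelopes to modulate low-frequency vibrotactile carriers.
# Band edges, carrier frequencies, and channel count are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def audio_to_vibrotactile(audio, fs, band_edges=(100, 400, 1000, 2500, 6000),
                          carrier_hz=(30, 60, 120, 240)):
    """Map a mono speech waveform to multi-channel vibrotactile drive signals."""
    channels = []
    for lo, hi, fc in zip(band_edges[:-1], band_edges[1:], carrier_hz):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, audio)
        envelope = np.abs(hilbert(band))              # slow amplitude envelope
        t = np.arange(len(audio)) / fs
        channels.append(envelope * np.sin(2 * np.pi * fc * t))
    return np.stack(channels)                         # shape: (n_channels, n_samples)

# Example with synthetic input (a real use would load recorded words).
fs = 16000
audio = np.random.default_rng(0).normal(size=fs)      # 1 s of noise as a stand-in
drive = audio_to_vibrotactile(audio, fs)
print(drive.shape)                                    # (4, 16000)
```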

