
Title: Influence of sensory modality and control dynamics on human path integration
Path integration is a sensorimotor computation that can be used to infer latent dynamical states by integrating self-motion cues. We studied the influence of sensory observation (visual/vestibular) and latent control dynamics (velocity/acceleration) on human path integration using a novel motion-cueing algorithm. Sensory modality and control dynamics were both varied randomly across trials, as participants controlled a joystick to steer to a memorized target location in virtual reality. Visual and vestibular steering cues allowed comparable accuracies only when participants controlled their acceleration, suggesting that vestibular signals, on their own, fail to support accurate path integration in the absence of sustained acceleration. Nevertheless, performance in all conditions reflected a failure to fully adapt to changes in the underlying control dynamics, a result that was well explained by a bias in the dynamics estimation. This work demonstrates how an incorrect internal model of control dynamics affects navigation in volatile environments in spite of continuous sensory feedback.
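To make the distinction between the two latent control regimes concrete, the sketch below contrasts velocity control and acceleration control of a one-dimensional joystick input. This is not the study's motion-cueing algorithm or task code; it is a minimal illustration under assumed first-order dynamics, with arbitrary parameter values (v_max, a_max, dt) and hypothetical function names, showing how the same stick deflection drives motion differently in the two regimes.

```python
import numpy as np

def simulate_trajectory(joystick, dt=0.1, v_max=2.0, a_max=1.0, control="velocity"):
    """Integrate 1-D self-motion under two hypothetical control regimes.

    control="velocity":     stick deflection sets velocity directly.
    control="acceleration": stick deflection sets acceleration, so velocity
                            persists after the stick returns to zero.
    v_max, a_max and dt are arbitrary illustration values, not task parameters.
    """
    v, x = 0.0, 0.0
    positions = []
    for u in joystick:                  # u in [-1, 1]
        if control == "velocity":
            v = v_max * u               # velocity tracks the stick
        else:
            v += a_max * u * dt         # stick commands acceleration
        x += v * dt                     # path integration of velocity
        positions.append(x)
    return np.array(positions)

# Brief forward push, then the stick returns to zero.
stick = np.concatenate([np.ones(20), np.zeros(30)])
print("velocity control, final position:    ",
      simulate_trajectory(stick, control="velocity")[-1])
print("acceleration control, final position:",
      simulate_trajectory(stick, control="acceleration")[-1])
```

In this toy setup, holding the stick under acceleration control produces a sustained nonzero acceleration, whereas under velocity control acceleration occurs only when the stick deflection changes, which is consistent with the abstract's point that vestibular cues supported accurate steering only when participants controlled their acceleration.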
Authors:
Award ID(s):
1707400
Publication Date:
NSF-PAR ID:
10380463
Journal Name:
eLife
Volume:
11
ISSN:
2050-084X
Sponsoring Org:
National Science Foundation
More Like this
  1. Our brain perceives the world by exploiting multisensory cues to extract information about various aspects of external stimuli. Sensory cues from the same stimulus should be integrated to improve perception, and otherwise segregated to distinguish different stimuli. In reality, however, the brain faces the challenge of recognizing stimuli without knowing in advance the sources of the sensory cues. To address this challenge, we propose that the brain conducts integration and segregation concurrently with complementary neurons. Studying the inference of heading direction via visual and vestibular cues, we develop a network model with two reciprocally connected modules modeling interacting visual-vestibular areas. In each module, there are two groups of neurons whose tunings under each sensory cue are either congruent or opposite. We show that congruent neurons implement integration, while opposite neurons compute cue-disparity information for segregation, and the interplay between the two groups of neurons achieves efficient multisensory information processing (a minimal sketch of this congruent/opposite division of labor appears after this list).
  2. Redirected and amplified head movements have the potential to provide more natural interaction with virtual environments (VEs) than controller-based input, which causes large discrepancies between visual and vestibular self-motion cues and leads to increased VR sickness. However, such amplified head movements may also exacerbate VR sickness symptoms compared with no amplification. Several general methods have been introduced to reduce VR sickness for controller-based input inside a VE, including a popular vignetting method that gradually reduces the field of view. In this paper, we investigate the use of vignetting to reduce VR sickness when using amplified head rotations instead of controller-based input. We also investigate whether the induced VR sickness is a result of the user's head acceleration or velocity by introducing two different modes of vignetting, one triggered by acceleration and the other by velocity (see the vignetting sketch after this list). Our dependent measures were VR sickness questionnaires administered before and after the experiment, as well as estimated discomfort levels assessed each minute of the experiment. Our results show differences between a baseline condition without vignetting and the two vignetting methods, generally indicating that the vignetting methods did not succeed in reducing VR sickness for most of the participants and, instead, led to a significant increase. We discuss the results and potential explanations of our findings.
  3. Objective: To determine if a vestibular prosthesis could improve function in subjects with severe vestibular damage and could be used as a scientific tool to investigate central vestibular processing. Background: Damage to the vestibular labyrinth is common and usually permanent. We therefore developed and tested a vestibular implant (VI) designed to mimic the information normally provided by the vestibular labyrinth, to determine if we can reduce vestibular-mediated deficits and study temporal integration of sensory cues in the brain. Design/Methods: Monkeys had electrodes implanted in the semicircular canals of one ear, and then severe bilateral vestibular damage was induced with aminoglycosides. Eye movements, perception, and balance were tested before and after vestibular damage and with the VI activated, which supplied head motion information to the brain via electrical stimulation delivered by the implanted electrodes. Humans also had electrode implantation (done in conjunction with a cochlear implant, CI), and they were tested on a temporal binding psychophysical task. Results: Stimulation provided by the VI in vestibulopathic monkeys improved their balance, perception of spatial orientation, and eye movement responses. Timing experiments in humans using CI and VI stimuli showed that, unlike past experiments that used motion to generate the vestibular signal, CI and VI signals were received by the cerebral cortex with the same latency and were perceived as simultaneous, but this timing perception was highly sensitive to adaptation. Conclusions: The VI improves oculomotor, postural, and perceptual behavior in vestibulopathic monkeys and could prove to be an effective way to improve these functions in patients with permanent labyrinthine damage. Timing experiments show that when novel stimuli are used, the brain synthesizes them in accordance with their arrival at the cortex, but experience can rapidly recalibrate this timing relationship, which may be why normal stimuli that are experienced habitually lack this characteristic.
  4. Limb dominance is evident in many daily activities, leading to the prominent idea that each hemisphere of the brain specializes in controlling different aspects of movement. Past studies suggest that the dominant arm is primarily controlled via an internal model of limb dynamics that enables the nervous system to produce efficient movements. In contrast, the nondominant arm may be primarily controlled via impedance mechanisms that rely on the strong modulation of sensory feedback from individual joints to control limb posture. We tested whether such differences are evident in behavioral responses and stretch reflexes following sudden displacement of the arm during posture control. Experiment 1 applied specific combinations of elbow-shoulder torque perturbations (the same for all participants). Peak joint displacements, return times, end point accuracy, and the directional tuning and amplitude of stretch reflexes in nearly all muscles were not statistically different between the two arms. Experiment 2 induced specific combinations of joint motion (the same for all participants). Again, peak joint displacements, return times, end point accuracy, and the directional tuning and amplitude of stretch reflexes in nearly all muscles did not differ statistically when countering the imposed loads with each arm. Moderate to strong correlations were found between stretch reflexes and behavioral responses to the perturbations with the two arms across both experiments. Collectively, the results do not support the idea that the dominant arm specializes in exploiting internal models and the nondominant arm in impedance control by increasing reflex gains to counter sudden loads imposed on the arms during posture control. NEW & NOTEWORTHY A prominent hypothesis is that the nervous system controls the dominant arm through predictive internal models and the nondominant arm through impedance mechanisms. We tested whether stretch reflexes of muscles in the two arms also display such specialization during posture control. Nearly all behavioral responses and stretch reflexes did not differ statistically but were strongly correlated between the arms. The results indicate individual signatures of feedback control that are common for the two arms.
  5. It is imperative that an animal have the ability to track its own motion within its immediate surroundings; this ability provides the necessary basis for decision making that leads to appropriate behavioral responses. Our goal is to implement insect-like body-tracking capabilities in a robotic controller and have this serve as the first step toward adaptive robotic behavior. In an attempt to tackle the first step of body tracking without GPS or other external information, we have turned to arthropod neurophysiology for inspiration. The insect brain structure called the central complex (CX) is thought to be vital for sensory integration and body position tracking. The mechanisms behind sensory integration are immensely complex, yet they are implemented by an elegant neuronal architecture. Based on this architecture, we assembled a dynamical neural model of the functional core of the central complex, two structures called the protocerebral bridge and the ellipsoid body, in a simulation environment. Using non-spiking neuronal dynamics, our simulation was able to recreate in vivo behavior such as correlating body rotation direction and speed to activity-bump dynamics within the ellipsoid body of the central complex (a minimal sketch of such a bump model appears after this list). This model serves as a first step toward using idiothetic cues for body position tracking and orientation determination, which is critical for homing after exploring new environments and other navigational tasks.
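For item 1 above, the division of labor between congruent and opposite neurons can be sketched with population tuning curves. The code below is only a conceptual illustration under assumed von Mises tuning with arbitrary parameters, not the published network model: congruent units respond to visual and vestibular headings with matched preferences, so their population vector yields an integrated heading estimate, while opposite units have preferences offset by 180 degrees between the two cues, so their peak drive grows with cue disparity.

```python
import numpy as np

def von_mises_tuning(stimulus, preferred, kappa=2.0):
    """Population response of units with von Mises tuning (assumed shape)."""
    return np.exp(kappa * np.cos(stimulus - preferred))

def module_response(theta_vis, theta_vest, n_units=64):
    """Responses of hypothetical congruent and opposite populations.

    Congruent units: same preferred heading for both cues.
    Opposite units:  preferences offset by pi between the two cues.
    """
    prefs = np.linspace(-np.pi, np.pi, n_units, endpoint=False)
    congruent = von_mises_tuning(theta_vis, prefs) + von_mises_tuning(theta_vest, prefs)
    opposite = von_mises_tuning(theta_vis, prefs) + von_mises_tuning(theta_vest, prefs + np.pi)
    return congruent, opposite

def population_vector(prefs, rates):
    """Decode a heading estimate from a population response."""
    return np.angle(np.sum(rates * np.exp(1j * prefs)))

prefs = np.linspace(-np.pi, np.pi, 64, endpoint=False)
for disparity in (0.0, np.pi / 2):
    cong, opp = module_response(theta_vis=disparity, theta_vest=0.0)
    print(f"disparity {disparity:.2f} rad -> "
          f"integrated heading {population_vector(prefs, cong):+.2f} rad, "
          f"opposite-cell peak drive {opp.max():.2f}")
```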
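For item 2 above, the two vignetting modes can be summarized as a mapping from head motion to field-of-view restriction. The function below is a hypothetical sketch, not the paper's implementation: the vignette strength ramps up linearly once head angular velocity (or angular acceleration, depending on the selected mode) exceeds a threshold, and is clamped to a maximum reduction. The threshold, ramp, and max_strength values are illustrative assumptions.

```python
def vignette_strength(angular_velocity, angular_acceleration, mode="velocity",
                      threshold=30.0, ramp=60.0, max_strength=0.6):
    """Hypothetical vignetting rule: map head motion to an FOV reduction in [0, 1].

    mode="velocity":     driven by head angular speed (deg/s).
    mode="acceleration": driven by head angular acceleration (deg/s^2).
    threshold, ramp, and max_strength are illustrative values only.
    """
    signal = abs(angular_velocity) if mode == "velocity" else abs(angular_acceleration)
    strength = (signal - threshold) / ramp          # linear ramp above threshold
    return max(0.0, min(max_strength, strength))    # clamp to [0, max_strength]

# A fast head turn triggers the velocity-based vignette,
# while a slow but jerky movement triggers the acceleration-based one.
print(vignette_strength(90.0, 10.0, mode="velocity"))       # 0.6 (fully vignetted)
print(vignette_strength(20.0, 120.0, mode="acceleration"))  # 0.6
print(vignette_strength(20.0, 10.0, mode="velocity"))       # 0.0 (no vignetting)
```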
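For item 5 above, the core idea of an activity bump in the ellipsoid body that rotates with the body can be captured by a minimal ring-attractor-style update. The code below uses assumed rate dynamics and arbitrary parameters; it is a sketch of the general technique, not the published protocerebral-bridge/ellipsoid-body model: a bump of activity on a ring of heading units is shifted in proportion to an angular-velocity input, so the decoded bump position approximately tracks heading from idiothetic cues alone.

```python
import numpy as np

N = 32                                   # number of heading units on the ring
prefs = np.linspace(0, 2 * np.pi, N, endpoint=False)
activity = np.exp(2.0 * np.cos(prefs))   # initial bump centred on heading 0
activity /= activity.sum()

def shift_bump(activity, d_heading):
    """Rotate the activity bump by d_heading radians (fractional np.roll)."""
    shift = d_heading / (2 * np.pi) * N  # shift measured in ring units
    k = int(np.floor(shift))
    frac = shift - k
    return (1 - frac) * np.roll(activity, k) + frac * np.roll(activity, k + 1)

def decode_heading(activity):
    """Population-vector readout of the bump position."""
    return np.angle(np.sum(activity * np.exp(1j * prefs))) % (2 * np.pi)

# Integrate a constant angular velocity from idiothetic input only.
omega, dt, true_heading = 0.5, 0.1, 0.0   # rad/s, s, rad
for _ in range(100):
    true_heading = (true_heading + omega * dt) % (2 * np.pi)
    activity = shift_bump(activity, omega * dt)

print(f"true heading   : {true_heading:.2f} rad")
print(f"decoded heading: {decode_heading(activity):.2f} rad")
```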