Title: Influence of Visual Augmented Feedback on Walking Speed Perception in Immersive Virtual Reality
Abstract: In virtual reality (VR), established perception–action relationships break down because of conflicting and ambiguous sensorimotor inputs, inducing walking velocity underestimations. Here, we explore the effects of realigning perceptual sensory experiences with physical movements via augmented feedback on the estimation of virtual speed. We hypothesized that providing feedback about speed would lead to concurrent perceptual improvements and that these alterations would persist once the speedometer was removed. Ten young adults used immersive VR to view a virtual hallway translating at a series of fixed speeds. Participants were tasked with matching their walking speed on a self-paced treadmill to the optic flow in the environment. Information regarding walking speed accuracy was provided during augmented feedback trials via a real-time speedometer. We measured the resulting walking velocity errors, as well as kinematic gait parameters. We found that the concordance between the virtual environment and gait speeds was higher when augmented feedback was provided during the trial. Furthermore, we observed retention effects beyond the intervention period, as demonstrated by smaller errors in speed perception accuracy and stronger concordance between perceived and actual speeds. Together, these results highlight a potential role for augmented feedback in guiding gait strategies that deviate from predefined internal models of locomotion.
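The two outcome measures described in the abstract, walking-velocity error relative to the optic-flow target and perceived–actual speed concordance, can be sketched in a few lines. The paper's analysis code is not reproduced here, so the function names and the choice of Lin's concordance correlation coefficient as the concordance metric are illustrative assumptions:

```python
import numpy as np

def concordance_ccc(actual, perceived):
    """Lin's concordance correlation coefficient between two speed series.

    Ranges from -1 to 1; 1 requires agreement in both correlation and scale,
    so a constant over- or underestimation lowers the score."""
    actual = np.asarray(actual, dtype=float)
    perceived = np.asarray(perceived, dtype=float)
    mean_a, mean_p = actual.mean(), perceived.mean()
    var_a, var_p = actual.var(), perceived.var()  # population (ddof=0) variances
    cov_ap = ((actual - mean_a) * (perceived - mean_p)).mean()
    return 2.0 * cov_ap / (var_a + var_p + (mean_a - mean_p) ** 2)

def speed_error(target, walked):
    """Signed percent error of walking speed relative to the optic-flow target;
    negative values indicate walking slower than the virtual scene's translation."""
    target = np.asarray(target, dtype=float)
    walked = np.asarray(walked, dtype=float)
    return 100.0 * (walked - target) / target
```

A uniform speed underestimation (e.g., always walking 0.2 m/s too slowly) leaves the Pearson correlation at 1 but pulls the concordance coefficient below 1, which is why a concordance measure suits this task better than correlation alone.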
Award ID(s):
2239760
PAR ID:
10515367
Author(s) / Creator(s):
Publisher / Repository:
MIT Press
Date Published:
Journal Name:
PRESENCE: Virtual and Augmented Reality
Volume:
32
ISSN:
1531-3263
Page Range / eLocation ID:
53 to 64
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The perception of distance is a complex process that often involves sensory information beyond vision alone. In this work, we investigated whether depth perception based on auditory information can be calibrated, a process by which the perceptual accuracy of depth judgments is improved by providing feedback and then performing corrective actions. We further investigated whether perceptual learning through carryover effects of calibration occurs at different levels of a virtual environment’s visibility, manipulated through different levels of virtual lighting. Users performed an auditory depth judgment task over several trials in which they walked to where they perceived an aural sound to be, yielding absolute estimates of perceived distance. This task was performed in three sequential phases: pretest, calibration, posttest. Feedback on the perceptual accuracy of distance estimates was provided only in the calibration phase, allowing us to study the calibration of auditory depth perception. We employed a 2 (Visibility of virtual environment) × 3 (Phase) × 5 (Target Distance) multi-factorial design, manipulating phase and target distance as within-subjects factors and the visibility of the virtual environment as a between-subjects factor. Our results revealed that users generally tend to underestimate aurally perceived distances in VR, similar to the distance compression effects that commonly occur in visual distance perception in VR. We found that auditory depth estimates, obtained using an absolute measure, can be calibrated to become more accurate through feedback and corrective action. In terms of environment visibility, we find that environments visible enough to reveal their extent may contain visual information that users attune to in scaling aurally perceived depth.
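The pretest–calibration–posttest protocol above reduces to comparing signed distance errors across phases. A minimal sketch, in which the phase labels and tuple layout are illustrative assumptions rather than the study's actual data format:

```python
import numpy as np

def phase_errors(trials):
    """Mean signed distance error (m) per phase for one participant.

    `trials` is a list of (phase, target_distance_m, walked_distance_m) tuples,
    where walked distance is the absolute estimate obtained by walking to the
    perceived sound location. Negative means = underestimation (compression)."""
    by_phase = {}
    for phase, target, walked in trials:
        by_phase.setdefault(phase, []).append(walked - target)
    return {phase: float(np.mean(errs)) for phase, errs in by_phase.items()}
```

Calibration would then show up as the posttest mean error sitting closer to zero than the pretest mean error.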
  2. As applications for virtual reality (VR) and augmented reality (AR) technology increase, it will be important to understand how users perceive their action capabilities in virtual environments. Feedback about actions may help to calibrate perception for action opportunities (affordances) so that action judgments in VR and AR mirror actors’ real abilities. Previous work indicates that walking through a virtual doorway while wielding an object can calibrate the perception of one’s passability through feedback from collisions. In the current study, we aimed to replicate this calibration through feedback using a different paradigm in VR while also testing whether this calibration transfers to AR. Participants held a pole at 45° and made passability judgments in AR (pretest phase). Then, they made passability judgments in VR and received feedback on those judgments by walking through a virtual doorway while holding the pole (calibration phase). Participants then returned to AR to make posttest passability judgments. Results indicate that feedback calibrated participants’ judgments in VR. Moreover, this calibration transferred to the AR environment. In other words, after experiencing feedback in VR, passability judgments in VR and in AR became closer to an actor’s actual ability, which could make training applications in these technologies more effective.
  3. Walking through immersive virtual environments is an important component of Virtual Reality (VR) applications. Prior research has established that users’ gait in virtual and real environments differs; however, little research has evaluated how users’ gait changes as they gain more experience with VR. We conducted experiments measuring novice and experienced subjects’ gait parameters in VR and real environments. Results showed that subjects’ performance in VR and the real world was more similar in the final trials than in the first: the walking dissimilarity present in early trials diminished as subjects completed more trials. We found trial number to be a significant variable affecting walking speed, step length, and trunk angle for both groups of users. While no main effect of expertise was observed, an interaction between expertise and trial number emerged: trunk angle increased over time for novices but decreased for experts.
  4.
    Background: Soft robotic exosuits can facilitate immediate increases in short- and long-distance walking speeds in people with post-stroke hemiparesis. We sought to assess the feasibility and rehabilitative potential of applying propulsion-augmenting exosuits as part of an individualized and progressive training program to retrain faster walking and the underlying propulsive strategy. Methods: A 54-year-old male with chronic hemiparesis completed five daily sessions of Robotic Exosuit Augmented Locomotion (REAL) gait training. REAL training consists of high-intensity, task-specific, and progressively challenging walking practice augmented by a soft robotic exosuit and is designed to facilitate faster walking by way of increased paretic propulsion. Repeated baseline assessments of comfortable walking speed over a 2-year period provided a stable baseline from which the effects of REAL training could be elucidated. Additional outcomes included paretic propulsion, maximum walking speed, and 6-minute walk test distance. Results: Comfortable walking speed was stable at 0.96 m/s prior to training and increased by 0.30 m/s after training. Clinically meaningful increases in maximum walking speed (Δ: 0.30 m/s) and 6-minute walk test distance (Δ: 59 m) were similarly observed. Improvements in paretic peak propulsion (Δ: 2.80 %BW), propulsive power (Δ: 0.41 W/kg), and trailing limb angle (Δ: 6.2 degrees) were observed at comfortable walking speed (p's < 0.05). Likewise, improvements in paretic peak propulsion (Δ: 4.63 %BW) and trailing limb angle (Δ: 4.30 degrees) were observed at maximum walking speed (p's < 0.05). Conclusions: The REAL training program is feasible to implement after stroke and capable of facilitating rapid and meaningful improvements in paretic propulsion, walking speed, and walking distance.
  5. Interpupillary distance (IPD) is the most important parameter for creating user-specific stereo parallax, which in turn is crucial for correct depth perception. This is why contemporary Head-Mounted Displays (HMDs) offer adjustable lenses to adapt to users’ individual IPDs. However, today’s Video See-Through Augmented Reality (VST AR) HMDs use fixed camera placements to reconstruct the stereoscopic view of a user’s environment. This leads to a potential mismatch between individual IPD settings and the fixed Inter-Camera Distance (ICD), which can cause perceptual incongruencies, limiting the usability and, potentially, the applicability of VST AR in depth-sensitive use cases. To investigate this incongruency between IPD and ICD, we conducted a 2 × 3 mixed-factor user study using a near-field, open-loop reaching task comparing distance judgments in Virtual Reality (VR) and VST AR. We also investigated changes in reaching performance through perceptual calibration by incorporating a feedback phase between the pretest and posttest conditions, with a particular focus on the influence of IPD-ICD differences. Our Linear Mixed Model (LMM) analysis showed a significant difference between VR and VST AR, an effect of IPD-ICD mismatch, and a combined effect of both factors. However, subjective measures showed no effect, underscoring the subconscious nature of the perception of VST AR. This novel insight and its consequences are discussed specifically for depth perception tasks in AR, eXtended Reality (XR), and potential use cases.
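The IPD-ICD incongruency described above is straightforward to quantify per participant. A minimal sketch, assuming a fixed 63 mm ICD purely as an illustrative headset value (the actual device's ICD is not given here), alongside the signed reach error used in open-loop distance-judgment tasks:

```python
import numpy as np

def ipd_icd_mismatch(ipd_mm, icd_mm=63.0):
    """Signed mismatch (mm) between a user's IPD and the headset's fixed ICD.

    The 63 mm default is an illustrative assumption, not the studied device's
    value. Positive = the user's eyes sit wider apart than the cameras."""
    return float(ipd_mm) - float(icd_mm)

def reach_error_cm(judged_cm, actual_cm):
    """Signed near-field distance-judgment error per trial;
    negative values indicate underestimation (distance compression)."""
    return np.asarray(judged_cm, dtype=float) - np.asarray(actual_cm, dtype=float)
```

An analysis in the spirit of the study would then regress these reach errors on display condition (VR vs. VST AR) and the per-participant mismatch, with participant as a random-effects grouping factor.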