This research investigates the effect of scaling in virtual reality to improve the reach of users with Parkinson’s disease (PD). People with PD have limited reach, often due to impaired postural stability. We investigated how virtual reality (VR) can improve reach during and after VR exposure. Participants played a VR game in which they smashed water balloons thrown at them by crossing their midsection. The distance from which the balloons were thrown increased or decreased based on success or failure. Their perception of the distance and of their hand was scaled in three counterbalanced conditions: under-scaled (scale = 0.83), not-scaled (scale = 1), and over-scaled (scale = 1.2), where the scale value is the ratio between the virtual reach perceived in the virtual environment (VE) and the actual reach. Six measurements were taken in the study: 1. Real World Reach (pre-exposure), 2. Virtual Reality Baseline Reach, 3. Virtual Reality Not-Scaled Reach, 4. Under-Scaled Reach, 5. Over-Scaled Reach, and 6. Real World Reach (post-exposure). Our results show that scaling a person’s movement in virtual reality can help improve reach. We therefore recommend including a scaling factor in VR games for people with Parkinson’s disease.
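As a rough illustration only (not the authors' implementation), a reach gain of this kind could be applied each rendered frame by scaling the tracked hand's offset from a fixed body origin. The function name, coordinate convention, and body-relative scaling below are assumptions made for the sketch:

```python
# Hypothetical sketch: applying a reach-scaling gain to a tracked hand position.
# The gain is the ratio of perceived virtual reach to actual reach (e.g. 0.83, 1.0, 1.2).

import numpy as np

def scaled_hand_position(real_hand, body_origin, gain):
    """Return a virtual hand position with the reach offset scaled by `gain`.

    real_hand, body_origin: 3D positions in tracker coordinates (metres).
    gain < 1 under-scales perceived reach; gain > 1 over-scales it.
    """
    real_hand = np.asarray(real_hand, dtype=float)
    body_origin = np.asarray(body_origin, dtype=float)
    return body_origin + gain * (real_hand - body_origin)

# Example: a hand 0.5 m in front of the torso, rendered under the three study gains.
origin = [0.0, 1.2, 0.0]
hand = [0.0, 1.2, 0.5]
for gain in (0.83, 1.0, 1.2):
    print(gain, scaled_hand_position(hand, origin, gain))
```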
Perceiving distance in virtual reality: theoretical insights from contemporary technologies
Decades of research have shown that absolute egocentric distance is underestimated in virtual environments (VEs) when compared with the real world. This finding has implications for the use of VEs in applications that require an accurate sense of absolute scale. Fortunately, this underperception of scale can be attenuated by several factors, making perception more similar to (but still not the same as) that of the real world. Here, we examine these factors in two categories: (i) experience inherent to the observer, and (ii) characteristics inherent to the display technology. We analyse how these factors influence the sources of information for absolute distance perception with the goal of understanding how the scale of virtual spaces is calibrated. We identify six types of cues that change with these approaches, contributing both to a theoretical understanding of depth perception in VEs and a call for future research that can benefit from changing technologies. This article is part of the theme issue ‘New approaches to 3D vision’.
- PAR ID: 10478507
- Publisher / Repository: Creem-Regehr et al. 2023
- Date Published:
- Journal Name: Philosophical Transactions of the Royal Society B: Biological Sciences
- Volume: 378
- Issue: 1869
- ISSN: 0962-8436
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
The perception of distance is a complex process that often involves sensory information beyond vision alone. In this work, we investigated whether depth perception based on auditory information can be calibrated, a process by which the perceptual accuracy of depth judgments is improved by providing feedback and then performing corrective actions. We further investigated whether perceptual learning through carryover effects of calibration occurs at different levels of a virtual environment’s visibility, based on different levels of virtual lighting. Users performed an auditory depth judgment task over several trials in which they walked to where they perceived an aural sound to be, yielding absolute estimates of perceived distance. This task was performed in three sequential phases: pretest, calibration, and posttest. Feedback on the perceptual accuracy of distance estimates was provided only in the calibration phase, allowing us to study the calibration of auditory depth perception. We employed a 2 (Visibility of virtual environment) × 3 (Phase) × 5 (Target Distance) multi-factorial design, manipulating phase and target distance as within-subjects factors and the visibility of the virtual environment as a between-subjects factor. Our results revealed that users generally tend to underestimate aurally perceived distances in VR, similar to the distance compression effects that commonly occur in visual distance perception in VR. We found that auditory depth estimates, obtained using an absolute measure, can be calibrated to become more accurate through feedback and corrective action. In terms of environment visibility, we find that environments visible enough to reveal their extent may contain visual information that users attune to in scaling aurally perceived depth.
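For intuition only (this is not the study's analysis), a feedback-driven calibration of this kind can be modelled as fitting a single multiplicative gain on calibration-phase trials and applying it to later estimates. The gain-only model and the example distances below are assumptions for the sketch:

```python
# Hypothetical sketch: fitting a multiplicative calibration gain from feedback trials.
# Assumes a simple gain-only model of auditory distance compression.

import numpy as np

def fit_gain(estimated, actual):
    """Least-squares gain g minimising ||actual - g * estimated||^2."""
    estimated = np.asarray(estimated, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return float(np.dot(estimated, actual) / np.dot(estimated, estimated))

# Calibration-phase trials: walked (estimated) vs. true target distances in metres.
estimated = [2.1, 2.9, 3.6, 4.2, 5.0]   # compressed estimates
actual    = [3.0, 4.0, 5.0, 6.0, 7.0]

g = fit_gain(estimated, actual)
corrected = [g * e for e in estimated]  # gain assumed to carry over to posttest
print(f"gain = {g:.2f}", corrected)
```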
-
Recent research has used virtual environments (VEs), as presented via virtual reality (VR) headsets, to study human behavior in hypothetical fire scenarios. One goal of using VEs in fire scenarios is to elicit patterns of behavior which more closely align to how individuals would react to real fire emergency situations. The present study investigated whether elicited behaviors and perceived risk varied during fire scenarios presented as VEs via two viewing conditions. These included a VR condition, where the VE was rendered as 360-degree videos presented in a VR headset, and a screen condition, where VEs were rendered as fixed-view videos via a computer monitor screen. We predicted that the selection of actions during the scenario would vary between conditions, that participants would rate fires as more dangerous if they developed more quickly and when smoke was rendered as thicker, and that participants would report greater levels of immersion in the VR condition. A total of 159 participants completed a decision-making task where they viewed videos of an incipient fire in a residential building and judged what action to take. Initial action responses to the fire scenarios varied between both viewing and smoke conditions, with those assigned to the thicker smoke and screen conditions being more likely to take protective action. Risk ratings also varied by smoke condition, with evidence of higher perceived risk for thicker smoke. Several factors of self-reported immersion (namely ‘interest’, ‘emotional attachment’, ‘focus of attention’, and ‘flow’) were associated with risk ratings, with perceived presence associated with initial actions. The present study provides evidence that enhancing immersion and perceived risk in a VE contributes to a different pattern of behaviors during simulated fire decision-making tasks. While our investigation only addressed the ideas of presence in an environment, future research should investigate the relative contribution of interactivity and consequences within the environment to further identify how behaviors during simulated fire scenarios are affected by each of these factors.
-
Spatial perception in virtual reality (VR) has been a hot research topic for years. Most of the studies on this topic have focused on visual perception and distance perception; fewer have examined auditory perception and room size perception, although these aspects are important for improving VR experiences. Recently, a number of studies have shown that perception can be calibrated to information that is relevant to the successful completion of everyday tasks in VR (such as distance estimation and spatial perception). Some recent studies have also examined calibration of auditory perception as a way to compensate for the classic distance compression problem in VR. In this paper, we present a calibration method for both visual and auditory room size perception. We conducted experiments to investigate how people perceive the size of a virtual room and how the accuracy of their size perception can be calibrated by manipulating perceptible auditory and visual information in VR. The results show that people were more accurate in perceiving room size by vision than by audition, but that they could still use audition to perceive room size. The results also show that, during calibration, auditory room size perception exhibited learning effects and its accuracy greatly improved after calibration.
-
Funt, Brian; Kingsburgh, Robin (Ed.) Optical see-through AR presents virtual objects to a user through a transparent display that blends them with the real-world environment. This is simultaneously novel and familiar: beam splitters have been used for ghostly visual effects, and yet the mechanism is exactly the same as the reflections in an everyday window. The history of theatrical visual effects leads through a series of vision science experiments and now to research on the perception of transparent AR systems. Still, there is a tension in the perception of AR stimuli: users of AR seem to be able to separate, or scission, the layers of virtual and real, depending on their understanding of the scene and its visual characteristics.