Spatial perception in virtual reality (VR) has been an active research topic for years. Most studies have focused on visual perception and distance perception; fewer have examined auditory perception and room size perception, although these aspects are important for improving VR experiences. Recently, a number of studies have shown that perception can be calibrated to information relevant to the successful completion of everyday tasks in VR, such as distance estimation and spatial perception. Some recent studies have also examined calibration of auditory perception as a way to compensate for the classic distance compression problem in VR. In this paper, we present a calibration method for both visual and auditory room size perception. We conducted experiments to investigate how people perceive the size of a virtual room and how the accuracy of their size perception can be calibrated by manipulating perceptible auditory and visual information in VR. The results show that people perceived room size more accurately through vision than through audition, but that they could still use audition to perceive room size. The results also show that auditory room size perception exhibits learning effects during calibration, and that its accuracy improved greatly after calibration.
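The accuracy improvement described above can be quantified with a simple perceived-to-actual ratio. A minimal sketch, assuming hypothetical pre- and post-calibration estimates (the function names and numbers are illustrative, not values from the study):

```python
def accuracy_ratio(perceived_size, actual_size):
    """Ratio of perceived to actual room size; 1.0 means veridical perception."""
    return perceived_size / actual_size

def improvement(pre_ratio, post_ratio):
    """How much closer the post-test ratio is to 1.0 than the pre-test ratio."""
    return abs(1.0 - pre_ratio) - abs(1.0 - post_ratio)

# Illustrative estimates for a 5 m-wide virtual room.
pre = accuracy_ratio(3.8, 5.0)    # underestimation before feedback
post = accuracy_ratio(4.7, 5.0)   # closer to veridical after calibration
gain = improvement(pre, post)     # positive value -> calibration helped
```

A positive `improvement` value indicates the calibration phase moved judgments toward the true size, regardless of whether the initial error was under- or overestimation.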
An Empirical Evaluation of the Calibration of Auditory Distance Perception under Different Levels of Virtual Environment Visibilities
The perception of distance is a complex process that often involves sensory information beyond vision alone. In this work, we investigated whether depth perception based on auditory information can be calibrated, a process by which the perceptual accuracy of depth judgments is improved by providing feedback and then performing corrective actions. We further investigated whether perceptual learning through carryover effects of calibration occurs under different levels of a virtual environment’s visibility, based on different levels of virtual lighting. Users performed an auditory depth judgment task over several trials in which they walked to where they perceived an aural sound to be, yielding absolute estimates of perceived distance. This task was performed in three sequential phases: pretest, calibration, and posttest. Feedback on the perceptual accuracy of distance estimates was provided only in the calibration phase, allowing us to study the calibration of auditory depth perception. We employed a 2 (Visibility of virtual environment) × 3 (Phase) × 5 (Target Distance) multi-factorial design, manipulating phase and target distance as within-subjects factors and the visibility of the virtual environment as a between-subjects factor. Our results revealed that users generally tend to underestimate aurally perceived distances in VR, similar to the distance compression effects that commonly occur in visual distance perception in VR. We found that auditory depth estimates, obtained using an absolute measure, can be calibrated to become more accurate through feedback and corrective action. In terms of environment visibility, we found that environments visible enough to reveal their extent may contain visual information that users attune to in scaling aurally perceived depth.
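The 2 × 3 × 5 mixed-factorial structure above implies a fixed trial grid per participant: every within-subjects phase crossed with every target distance, under one between-subjects visibility condition. A minimal sketch of generating that grid; the specific distance values and condition labels are illustrative assumptions, not taken from the paper:

```python
from itertools import product

PHASES = ["pretest", "calibration", "posttest"]      # within-subjects (3 levels)
DISTANCES_M = [1.5, 2.5, 3.5, 4.5, 5.5]             # within-subjects (5 levels, illustrative)

def trials_for_participant(visibility, reps=1):
    """All phase x distance combinations one participant completes.

    `visibility` is that participant's between-subjects condition,
    so it is constant across all of their trials.
    """
    return [
        {"visibility": visibility, "phase": phase, "distance": dist}
        for phase, dist in product(PHASES, DISTANCES_M)
        for _ in range(reps)
    ]
```

With one repetition, each participant contributes 3 × 5 = 15 cells; the visibility factor only varies across participants.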
- Award ID(s): 2007435
- PAR ID: 10527850
- Publisher / Repository: IEEE
- Date Published:
- ISSN: 2642-5254
- ISBN: 979-8-3503-7402-5
- Page Range / eLocation ID: 690 to 700
- Format(s): Medium: X
- Location: Orlando, FL, USA
- Sponsoring Org: National Science Foundation
More Like this
Distance compression, the underestimation of egocentric distance to objects, is a common problem in immersive virtual environments (IVEs). Besides visually compensating for the compressed distance, several studies have shown that auditory information can offer an alternative solution to this problem. In particular, reverberation time (RT) has proven to be an effective cue for compensating for distance compression. To further explore the feasibility of using audio information to improve distance perception, we investigate whether users’ egocentric distance perception can be calibrated, and whether the calibrated effect carries over and even persists for a longer duration. We conducted a study to understand the perceptual learning and carryover effects of using RT as a stimulus for perceiving distance in IVEs. The results show that a carryover effect exists after calibration, indicating that people can learn to perceive distances by attuning to reverberation time, and that accuracy remains at a constant level even after 6 months. Our findings could potentially be utilized to improve distance perception in VR systems, as the calibration of auditory distance perception in VR can persist for several months, eventually avoiding the burden of frequent training regimens.
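Reverberation time is commonly approximated by Sabine's formula, RT60 = 0.161 · V / A, where V is room volume in cubic meters and A is total absorption in sabins. The abstract does not state how RT was generated, so this is only a sketch of how a target RT60 could be derived for a simulated room; the room dimensions and absorption value are illustrative:

```python
def rt60_sabine(volume_m3, absorption_sabins):
    """Sabine reverberation time in seconds: RT60 = 0.161 * V / A."""
    if absorption_sabins <= 0:
        raise ValueError("total absorption must be positive")
    return 0.161 * volume_m3 / absorption_sabins

# Illustrative 4 m x 5 m x 3 m room with 20 sabins of total absorption.
rt = rt60_sabine(4 * 5 * 3, 20.0)   # roughly half a second of reverberation
```

Because RT60 grows with volume at fixed absorption, manipulating it gives listeners an acoustic cue that covaries with the spatial extent of the environment, which is what makes it usable as a calibration stimulus.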
As applications for virtual reality (VR) and augmented reality (AR) technology increase, it will be important to understand how users perceive their action capabilities in virtual environments. Feedback about actions may help to calibrate perception for action opportunities (affordances) so that action judgments in VR and AR mirror actors’ real abilities. Previous work indicates that walking through a virtual doorway while wielding an object can calibrate the perception of one’s passability through feedback from collisions. In the current study, we aimed to replicate this calibration through feedback using a different paradigm in VR while also testing whether this calibration transfers to AR. Participants held a pole at 45° and made passability judgments in AR (pretest phase). Then, they made passability judgments in VR and received feedback on those judgments by walking through a virtual doorway while holding the pole (calibration phase). Participants then returned to AR to make posttest passability judgments. Results indicate that feedback calibrated participants’ judgments in VR. Moreover, this calibration transferred to the AR environment. In other words, after experiencing feedback in VR, passability judgments in VR and in AR became closer to an actor’s actual ability, which could make training applications in these technologies more effective.
Interpupillary distance (IPD) is the most important parameter for creating a user-specific stereo parallax, which in turn is crucial for correct depth perception. This is why contemporary Head-Mounted Displays (HMDs) offer adjustable lenses to adapt to users’ individual IPDs. However, today’s Video See-Through Augmented Reality (VST AR) HMDs use fixed camera placements to reconstruct the stereoscopic view of a user’s environment. This leads to a potential mismatch between individual IPD settings and the fixed Inter-Camera Distance (ICD), which can cause perceptual incongruencies, limiting the usability and, potentially, the applicability of VST AR in depth-sensitive use cases. To investigate this incongruency between IPD and ICD, we conducted a 2 × 3 mixed-factor user study using a near-field, open-loop reaching task comparing distance judgments in Virtual Reality (VR) and VST AR. We also investigated changes in reaching performance via perceptual calibration by incorporating a feedback phase between pre- and post-phase conditions, with a particular focus on the influence of IPD-ICD differences. Our Linear Mixed Model (LMM) analysis showed a significant difference between VR and VST AR, an effect of IPD-ICD mismatch, and a combined effect of both factors. However, subjective measures showed no effect, underscoring the subconscious nature of how VST AR is perceived. This novel insight and its consequences are discussed specifically for depth perception tasks in AR, eXtended Reality (XR), and potential use cases.
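A first-order geometric intuition for why the IPD-ICD mismatch matters: stereo disparity captured with a camera baseline of ICD but viewed with an eye baseline of IPD rescales apparent depth roughly by the ratio IPD/ICD. This is a textbook stereo-geometry simplification, not the model used in the study above, and the numbers below are illustrative:

```python
def predicted_depth(true_depth_m, ipd_mm, icd_mm):
    """First-order prediction of perceived depth under an IPD-ICD mismatch.

    Disparity scales with camera baseline (ICD); the viewer interprets it
    with their own baseline (IPD), so perceived depth ~ true * IPD / ICD.
    Assumes small angles and near-fixation content.
    """
    return true_depth_m * (ipd_mm / icd_mm)

# A user with a 60 mm IPD viewing video captured at a fixed 63 mm ICD
# would, to first order, perceive a 1.00 m target as slightly nearer.
perceived = predicted_depth(1.0, ipd_mm=60.0, icd_mm=63.0)
```

When IPD equals ICD the ratio is 1 and depth is predicted to be veridical; a smaller IPD than ICD predicts compression, consistent with the mismatch being a candidate source of the depth differences the study reports.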
As virtual reality (VR) technology sees more use in various fields, there is a greater need to understand how to effectively design dynamic virtual environments. As of now, there is still uncertainty about how well users of a VR system can track moving targets in a virtual space. In this work, we examined the influence of sensory modality and visual feedback on the accuracy of head-gaze moving-target tracking. To this end, a between-subjects study was conducted wherein participants were presented with targets that were visual, auditory, or audiovisual. Each participant performed two blocks of experimental trials, with a calibration block in between. Results indicate that audiovisual targets promoted greater improvement in tracking performance than single-modality targets, and that audio-only targets are more difficult to track than those of other modalities.
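Head-gaze tracking accuracy in tasks like the one above is often summarized as a root-mean-square (RMS) angular error between the gaze direction and the target direction over a trial. A minimal sketch under that assumption; the abstract does not specify its exact accuracy metric, and the sample angles are illustrative:

```python
import math

def rms_angular_error(gaze_angles_deg, target_angles_deg):
    """RMS error (degrees) between per-sample head-gaze and target angles."""
    if len(gaze_angles_deg) != len(target_angles_deg) or not gaze_angles_deg:
        raise ValueError("need two equal-length, non-empty angle sequences")
    squared = [(g - t) ** 2 for g, t in zip(gaze_angles_deg, target_angles_deg)]
    return math.sqrt(sum(squared) / len(squared))

# Perfect tracking yields zero error; larger deviations inflate the RMS.
err = rms_angular_error([10.0, 12.5, 15.0], [10.0, 13.0, 14.0])
```

Comparing this metric across the visual, auditory, and audiovisual conditions is one straightforward way to operationalize the performance differences the study reports.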