Title: Room Size Perception in Virtual Reality by Means of Sound and Vision: The Role of Perception-Action Calibration
Spatial perception in virtual reality (VR) has been an active research topic for years. Most studies have focused on visual perception and distance perception; fewer have examined auditory perception and room size perception, although these aspects are important for improving VR experiences. Recently, a number of studies have shown that perception can be calibrated to information that is relevant to the successful completion of everyday tasks in VR (such as distance estimation and spatial perception). Some recent studies have also examined calibration of auditory perception as a way to compensate for the classic distance compression problem in VR. In this paper, we present a calibration method for both visual and auditory room size perception. We conducted experiments to investigate how people perceive the size of a virtual room and how the accuracy of their size perception can be calibrated by manipulating perceptible auditory and visual information in VR. The results show that people perceived room size more accurately through vision than through audition, but that they could still use audition to perceive room size. The results also show that auditory room size perception exhibited learning effects during calibration and that its accuracy improved greatly after calibration.
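The key quantity in a calibration study of this kind is how closely size judgments match the true room dimensions before and after feedback. The Python sketch below is a rough illustration only, with hypothetical trial values, of computing a judged-to-actual accuracy ratio per experimental phase; it does not reproduce the paper's own measures or analysis.

```python
# A minimal sketch, not from the paper: scoring room-size judgments across
# pretest/calibration/posttest phases. All names and numbers are hypothetical.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Trial:
    actual_size_m: float   # true room dimension in meters (e.g., wall-to-wall depth)
    judged_size_m: float   # participant's size estimate
    phase: str             # "pretest", "calibration", or "posttest"

def accuracy_ratio(trials, phase):
    """Mean judged/actual ratio for one phase; 1.0 means perfectly calibrated."""
    selected = [t for t in trials if t.phase == phase]
    return mean(t.judged_size_m / t.actual_size_m for t in selected)

trials = [
    Trial(6.0, 4.8, "pretest"), Trial(6.0, 5.1, "pretest"),
    Trial(6.0, 5.9, "posttest"), Trial(6.0, 6.1, "posttest"),
]
print(round(accuracy_ratio(trials, "pretest"), 2))   # below 1.0: size underestimated
print(round(accuracy_ratio(trials, "posttest"), 2))  # near 1.0 after calibration feedback
```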
Award ID(s):
2007435
PAR ID:
10621242
Author(s) / Creator(s):
Publisher / Repository:
IEEE
Date Published:
ISBN:
979-8-3315-1647-5
Page Range / eLocation ID:
426 to 435
Subject(s) / Keyword(s):
Auditory Room Size Perception; Calibration; VR
Format(s):
Medium: X
Location:
Bellevue, WA, USA
Sponsoring Org:
National Science Foundation
More Like this
  1. Distance compression, which refers to the underestimation of egocentric distance to objects, is a common problem in immersive virtual environments (IVEs). Besides visually compensating for the compressed distance, several studies have shown that auditory information can be an alternative solution to this problem. In particular, reverberation time (RT) has proven to be an effective means of compensating for distance compression. To further explore the feasibility of using audio information to improve distance perception, we investigate whether users' egocentric distance perception can be calibrated, and whether the calibration effect carries over and is sustained over a longer duration. We conducted a study to understand the perceptual learning and carryover effects of using RT as the stimulus for perceiving distance in IVEs. The results show that a carryover effect exists after calibration, which indicates that people can learn to perceive distances by attuning to reverberation time, and that accuracy remains at a constant level even after six months. Our findings could potentially be used to improve distance perception in VR systems, since the calibration of auditory distance perception in VR can be sustained for several months, eventually avoiding the burden of frequent training regimens.
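The abstract does not specify how the reverberation stimuli were produced; as background, the sketch below uses Sabine's classic formula (RT60 = 0.161·V/A) to show how room volume and surface absorption determine reverberation time. The room dimensions and absorption coefficients are hypothetical, not taken from the study.

```python
# Sketch only: Sabine's formula RT60 = 0.161 * V / A relates reverberation time to
# room volume V (m^3) and total absorption A (m^2 sabins). The rooms and absorption
# coefficients below are hypothetical.

def rt60_sabine(length_m, width_m, height_m, absorption_coeff):
    """Reverberation time (s) of a box-shaped room with uniform surface absorption."""
    volume = length_m * width_m * height_m
    surface = 2 * (length_m * width_m + length_m * height_m + width_m * height_m)
    return 0.161 * volume / (surface * absorption_coeff)

# A larger or more reflective room yields a longer RT60, which listeners can use
# as a cue to room size and, indirectly, to source distance.
print(round(rt60_sabine(5, 4, 3, 0.30), 2))    # small, absorptive room -> short RT60
print(round(rt60_sabine(15, 10, 6, 0.10), 2))  # large, reflective room -> much longer RT60
```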
  2. The perception of distance is a complex process that often involves sensory information beyond vision alone. In this work, we investigated whether depth perception based on auditory information can be calibrated, a process by which the perceptual accuracy of depth judgments is improved by providing feedback and then performing corrective actions. We further investigated whether perceptual learning through carryover effects of calibration occurs at different levels of a virtual environment's visibility, based on different levels of virtual lighting. Users performed an auditory depth judgment task over several trials in which they walked to where they perceived a sound to be, yielding absolute estimates of perceived distance. This task was performed in three sequential phases: pretest, calibration, and posttest. Feedback on the perceptual accuracy of distance estimates was provided only in the calibration phase, allowing us to study the calibration of auditory depth perception. We employed a 2 (Visibility of virtual environment) × 3 (Phase) × 5 (Target Distance) multi-factorial design, manipulating phase and target distance as within-subjects factors and the visibility of the virtual environment as a between-subjects factor. Our results revealed that users generally tend to underestimate aurally perceived distances in VR, similar to the distance compression effects that commonly occur in visual distance perception in VR. We found that auditory depth estimates, obtained using an absolute measure, can be calibrated to become more accurate through feedback and corrective action. In terms of environment visibility, we find that environments visible enough to reveal their extent may contain visual information that users attune to when scaling aurally perceived depth.
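As a minimal illustration of the absolute (blind-walking) response measure described above, the sketch below computes a walked-distance estimate and its signed error against the target distance; the positions and distances are made up, not taken from the study.

```python
# Sketch with hypothetical values: in a blind-walking response, the absolute distance
# estimate is the distance actually walked toward the perceived sound source; the
# signed error (walked minus target) is negative when distance is compressed.
import math

def walked_distance(start_xz, stop_xz):
    """Horizontal distance (m) between where the walk started and where it stopped."""
    return math.hypot(stop_xz[0] - start_xz[0], stop_xz[1] - start_xz[1])

target_distance_m = 4.0
estimate_m = walked_distance((0.0, 0.0), (0.1, 3.2))
signed_error_m = estimate_m - target_distance_m
print(round(estimate_m, 2), round(signed_error_m, 2))  # e.g. 3.2 and -0.8 (underestimation)
```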
  3. As applications for virtual reality (VR) and augmented reality (AR) technology increase, it will be important to understand how users perceive their action capabilities in virtual environments. Feedback about actions may help to calibrate perception for action opportunities (affordances) so that action judgments in VR and AR mirror actors' real abilities. Previous work indicates that walking through a virtual doorway while wielding an object can calibrate the perception of one's passability through feedback from collisions. In the current study, we aimed to replicate this calibration through feedback using a different paradigm in VR while also testing whether this calibration transfers to AR. Participants held a pole at 45° and made passability judgments in AR (pretest phase). Then, they made passability judgments in VR and received feedback on those judgments by walking through a virtual doorway while holding the pole (calibration phase). Participants then returned to AR to make posttest passability judgments. Results indicate that feedback calibrated participants' judgments in VR. Moreover, this calibration transferred to the AR environment. In other words, after experiencing feedback in VR, passability judgments in VR and in AR became closer to an actor's actual ability, which could make training applications in these technologies more effective.
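A minimal sketch of the underlying geometry follows, under the assumption that the pole's horizontal extent is its length times cos(45°) and that a small safety margin applies; the widths, lengths, and margin are illustrative, not the paper's model.

```python
# Sketch under stated assumptions (not the paper's model): the pole is tilted 45 degrees
# in the frontal plane, so its horizontal extent is length * cos(45 deg); the doorway is
# judged passable when it exceeds the actor-plus-pole frontal width plus a small margin.
import math

def effective_width(shoulder_width_m, pole_length_m, tilt_deg=45.0):
    pole_horizontal = pole_length_m * math.cos(math.radians(tilt_deg))
    return max(shoulder_width_m, pole_horizontal)

def is_passable(doorway_width_m, shoulder_width_m, pole_length_m, margin_m=0.05):
    return doorway_width_m >= effective_width(shoulder_width_m, pole_length_m) + margin_m

# Collision feedback during calibration should pull judgments toward this action boundary.
print(is_passable(0.80, 0.45, 1.2))  # False: a 1.2 m pole projects about 0.85 m horizontally
print(is_passable(0.95, 0.45, 1.2))  # True
```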
  4. As applications for virtual reality (VR) and augmented reality (AR) technology increase, there is a greater need to understand how to effectively design dynamic virtual environments. At present, there is still uncertainty about how well users of a VR system can track moving targets in a virtual space. In this work, we examined the influence of sensory modality and visual feedback on the accuracy of head-gaze moving-target tracking. To this end, a between-subjects study was conducted wherein participants were presented with targets that were visual, auditory, or audiovisual. Each participant performed two blocks of experimental trials, with a calibration block in between. Results indicate that audiovisual targets promoted greater improvement in tracking performance than single-modality targets, and that audio-only targets are more difficult to track than targets of other modalities.
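One common way to score head-gaze tracking of a moving target is the angular error between the gaze direction and the head-to-target direction; the sketch below is an illustrative computation with made-up positions, not the study's actual metric.

```python
# Illustrative sketch: per-frame angular error between the head-gaze direction and the
# head-to-target direction, a common accuracy measure for moving-target tracking.
import numpy as np

def angular_error_deg(head_pos, gaze_dir, target_pos):
    """Angle in degrees between the gaze vector and the vector from head to target."""
    to_target = np.asarray(target_pos, dtype=float) - np.asarray(head_pos, dtype=float)
    to_target /= np.linalg.norm(to_target)
    gaze = np.asarray(gaze_dir, dtype=float)
    gaze /= np.linalg.norm(gaze)
    cos_angle = np.clip(np.dot(gaze, to_target), -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_angle)))

# Averaging this error over a trial gives one tracking score per target modality.
print(round(angular_error_deg((0, 1.7, 0), (0, 0, 1), (0.5, 1.7, 5.0)), 2))  # about 5.71 deg
```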
  5. Technological advancements and increased access have prompted the adoption of head-mounted display based virtual reality (VR) for neuroscientific research, manual skill training, and neurological rehabilitation. Applications that focus on manual interaction within the virtual environment (VE), especially haptic-free VR, critically depend on virtual hand-object collision detection. Knowledge about how multisensory integration related to hand-object collisions affects perception-action dynamics and reach-to-grasp coordination is needed to enhance the immersiveness of interactive VR. Here, we explored whether and to what extent sensory substitution for haptic feedback of hand-object collision (visual, audio, or audiovisual) and collider size (the size of spherical pointers representing the fingertips) influence reach-to-grasp kinematics. In Study 1, visual, auditory, and combined feedback were compared as sensory substitutes to indicate the successful grasp of a virtual object during reach-to-grasp actions. In Study 2, participants reached to grasp virtual objects using spherical colliders of different diameters to test whether virtual collider size affects reach-to-grasp. Our data indicate that collider size, but not sensory feedback modality, significantly affected the kinematics of grasping. Larger colliders led to a smaller size-normalized peak aperture. We discuss this finding in the context of a possible influence of spherical collider size on the perception of the virtual object's size and hence effects on motor planning of reach-to-grasp. Critically, reach-to-grasp spatiotemporal coordination patterns were robust to manipulations of sensory feedback modality and spherical collider size, suggesting that the nervous system adjusted the reach (transport) component commensurately with the changes in the grasp (aperture) component. These results have important implications for research, commercial, industrial, and clinical applications of VR.
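As a minimal sketch of the hand-object collision detection such applications depend on (assuming spherical fingertip colliders and a spherical object, with made-up positions and radii), the following shows how a larger collider radius registers contact earlier.

```python
# Sketch, not the study's implementation: a sphere-sphere overlap test between fingertip
# colliders and a spherical virtual object, the kind of check haptic-free VR uses to
# decide when a grasp has made contact. Positions, radii, and sizes are hypothetical.
import math

def spheres_collide(center_a, radius_a, center_b, radius_b):
    """True when the two spheres overlap."""
    return math.dist(center_a, center_b) <= radius_a + radius_b

object_center, object_radius = (0.0, 1.0, 0.300), 0.035      # 7 cm virtual object
thumb_tip, index_tip = (0.0, 1.0, 0.255), (0.0, 1.0, 0.345)  # fingertip positions (m)

for collider_radius in (0.005, 0.020):  # larger colliders register contact sooner
    grasped = (spheres_collide(thumb_tip, collider_radius, object_center, object_radius)
               and spheres_collide(index_tip, collider_radius, object_center, object_radius))
    print(collider_radius, grasped)     # 0.005 -> False, 0.020 -> True
```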