Title: Investigating the Carryover Effects of Calibration of Size Perception in Augmented Reality to the Real World
Many AR applications require users to perceive, estimate, and calibrate to the size of objects presented in the scene. Distortions in size perception in AR could influence how effectively skills transfer from AR to the real world. We investigated the after-effects, or carryover effects, of calibrating size perception in AR to the real world (RW) by providing feedback and an opportunity for participants to correct their judgments in AR. In an empirical evaluation, we employed a three-phase experiment design. In the pretest phase, participants estimated the size of target objects in RW, concurrently using both verbal reports and physical judgments, to establish a baseline. In the calibration phase, they estimated the size of targets in AR, received feedback, and subsequently corrected their judgments. In the posttest phase, participants again estimated the size of target objects in the real world. Our findings revealed that the carryover effects of calibration successfully transferred from AR to RW for both verbal reports and physical judgments.
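The pretest/calibration/posttest logic described above can be sketched as a simple signed-error comparison: measure real-world estimation error before and after calibration in AR, where a shrinking error magnitude indicates carryover. All names and numbers below are hypothetical placeholders for illustration, not data from the study.

```python
# Hypothetical sketch of the three-phase carryover analysis: compare mean
# signed size-estimation error in the real world before (pretest) and after
# (posttest) calibration in AR. All values are made-up placeholders.
from statistics import mean

def signed_error(estimates, true_size):
    """Mean signed error as a fraction of the true size;
    negative values indicate underestimation."""
    return mean((e - true_size) / true_size for e in estimates)

true_size = 10.0  # cm, hypothetical target size
pretest  = [8.1, 8.5, 7.9, 8.4]   # RW estimates before AR calibration
posttest = [9.6, 9.8, 9.5, 9.9]   # RW estimates after AR calibration

carryover = abs(signed_error(pretest, true_size)) - abs(signed_error(posttest, true_size))
print(f"pretest error:  {signed_error(pretest, true_size):+.2%}")
print(f"posttest error: {signed_error(posttest, true_size):+.2%}")
print(f"error reduction carried over to RW: {carryover:.2%}")
```

A positive `carryover` value means real-world judgments became more accurate after feedback in AR, which is the pattern the abstract reports.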
Award ID(s):
2007435
PAR ID:
10621240
Author(s) / Creator(s):
; ; ;
Publisher / Repository:
IEEE
Date Published:
ISBN:
979-8-3315-1647-5
Page Range / eLocation ID:
806 to 815
Subject(s) / Keyword(s):
Augmented Reality Size Perception Perceptuomotor Calibration Perception-Action Empirical Evaluation
Format(s):
Medium: X
Location:
Bellevue, WA, USA
Sponsoring Org:
National Science Foundation
More Like This
  1. This empirical evaluation investigated how size perception differs between OST AR and the real world, focusing on two judgment methods: verbal reports and physical judgments. Using a within-subjects experimental design, participants viewed target objects of different sizes in both AR and real-world conditions and estimated their sizes using verbal and physical judgment methods across multiple trials. The study addressed two key hypotheses: (H1) that size perception in AR would differ from the real world, potentially due to rendering limitations in OST-HMDs, and (H2) that verbal reports and physical judgments would yield different levels of accuracy due to the distinct cognitive and perceptual processes involved in each method. Our findings supported both hypotheses, revealing key differences in size perception between the two judgment methods and viewing conditions. Participants consistently underestimated object sizes when using verbal reports in both AR and real-world conditions, with more pronounced errors in AR. In contrast, physical judgments yielded more accurate size estimates under both viewing conditions. Notably, the accuracy of verbal reports decreased as target sizes increased, a trend that was particularly evident in AR. These results underscore the perceptual challenges associated with verbal size judgments in AR and their potential limitations in applications requiring precise size estimations. By highlighting the differences in accuracy and consistency between verbal and physical judgment methods, this study contributes to a deeper understanding of size perception in OST AR and real-world contexts.
  2. As applications for virtual reality (VR) and augmented reality (AR) technology increase, it will be important to understand how users perceive their action capabilities in virtual environments. Feedback about actions may help to calibrate perception for action opportunities (affordances) so that action judgments in VR and AR mirror actors' real abilities. Previous work indicates that walking through a virtual doorway while wielding an object can calibrate the perception of one's passability through feedback from collisions. In the current study, we aimed to replicate this calibration through feedback using a different paradigm in VR while also testing whether this calibration transfers to AR. Participants held a pole at 45° and made passability judgments in AR (pretest phase). Then, they made passability judgments in VR and received feedback on those judgments by walking through a virtual doorway while holding the pole (calibration phase). Participants then returned to AR to make posttest passability judgments. Results indicate that feedback calibrated participants' judgments in VR. Moreover, this calibration transferred to the AR environment. In other words, after experiencing feedback in VR, passability judgments in both VR and AR became closer to an actor's actual ability, which could make training applications in these technologies more effective.
  3. The perception of distance is a complex process that often involves sensory information beyond that of just vision. In this work, we investigated whether depth perception based on auditory information can be calibrated, a process by which the perceptual accuracy of depth judgments can be improved by providing feedback and then performing corrective actions. We further investigated whether perceptual learning through carryover effects of calibration occurs at different levels of a virtual environment's visibility, based on different levels of virtual lighting. Users performed an auditory depth judgment task over several trials in which they walked to where they perceived an aural sound to be, yielding absolute estimates of perceived distance. This task was performed in three sequential phases: pretest, calibration, and posttest. Feedback on the perceptual accuracy of distance estimates was only provided in the calibration phase, allowing us to study the calibration of auditory depth perception. We employed a 2 (Visibility of virtual environment) × 3 (Phase) × 5 (Target Distance) multi-factorial design, manipulating phase and target distance as within-subjects factors and the visibility of the virtual environment as a between-subjects factor. Our results revealed that users generally tend to underestimate aurally perceived distances in VR, similar to the distance compression effects that commonly occur in visual distance perception in VR. We found that auditory depth estimates, obtained using an absolute measure, can be calibrated to become more accurate through feedback and corrective action. In terms of environment visibility, we find that environments visible enough to reveal their extent may contain visual information that users attune to in scaling aurally perceived depth.
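The 2 × 3 × 5 mixed design described above can be illustrated by enumerating the trial cells each participant experiences: visibility is between-subjects, so one participant sees all phase × distance combinations under a single visibility level. Factor names and the assignment scheme below are hypothetical placeholders.

```python
# Hypothetical sketch of enumerating conditions for the 2 x 3 x 5 mixed
# design: visibility is between-subjects, phase and distance are
# within-subjects. Level names and values are placeholders.
from itertools import product

VISIBILITY = ["low", "high"]                          # between-subjects factor
PHASES     = ["pretest", "calibration", "posttest"]   # within-subjects factor
DISTANCES  = [1.0, 2.0, 3.0, 4.0, 5.0]                # meters, placeholder values

def trials_for(participant_id: int):
    """Within-subjects cells for one participant; visibility is assigned
    by alternating participant IDs (a simplified placeholder scheme)."""
    vis = VISIBILITY[participant_id % 2]
    return [(vis, phase, d) for phase, d in product(PHASES, DISTANCES)]

cells = trials_for(0)
print(len(cells))  # 3 phases x 5 distances = 15 within-subjects cells
```

Each participant thus contributes 15 cells, and the visibility comparison is made across the two participant groups.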
  4. Impossible spaces have been used to increase the amount of virtual space available for real walking within a constrained physical space. In this technique, multiple virtual rooms are allowed to occupy overlapping portions of the physical space, in a way that is not possible in real Euclidean space. Prior work has explored detection thresholds for impossible spaces; however, very little work has considered other aspects of how impossible spaces alter participants' perception of spatial relationships within virtual environments. In this paper, we present a within-subjects study (n = 30) investigating how impossible spaces altered participants' perceptions of the locations of objects placed in different rooms. Participants explored three layouts with varying amounts of overlap between rooms and then pointed in the direction of various objects they had been tasked to locate. Significantly more error was observed when pointing at objects in overlapping spaces as compared to the non-overlapping layout. Further analysis suggests that participants pointed towards where objects would be located in the non-overlapping layout, regardless of how much overlap was present. This suggests that, when participants are not aware that any manipulation is present, they automatically adapt their representation of the spaces based on judgments of relative size and visible constraints on the size of the whole system.
  5. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fueled by a neural-network-based model, our interface can decode finger-wise forces in real time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings pushing research towards more realistic physicality in future VR/AR.
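The decoding idea in abstract 5, mapping forearm EMG activations to per-finger forces, can be sketched with a much simpler stand-in model. The paper uses a neural network; plain ridge regression on synthetic data is substituted here only to keep the example self-contained, and every value below is fabricated for illustration.

```python
# Illustrative-only sketch of decoding finger forces from forearm EMG:
# learn a linear map from rectified EMG channel features to per-finger
# force. Ridge regression stands in for the paper's neural network.
# All data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels, n_fingers = 500, 8, 5

# Synthetic "true" mixing from muscle activations to finger forces.
W_true = rng.normal(size=(n_channels, n_fingers))
emg = np.abs(rng.normal(size=(n_samples, n_channels)))        # rectified EMG
forces = emg @ W_true + 0.01 * rng.normal(size=(n_samples, n_fingers))

# Ridge regression: W = (X^T X + lam * I)^-1 X^T Y
lam = 1e-3
W_hat = np.linalg.solve(emg.T @ emg + lam * np.eye(n_channels), emg.T @ forces)

pred = emg @ W_hat
rel_err = np.mean(np.abs(pred - forces)) / np.mean(np.abs(forces))
print(f"mean relative decoding error: {rel_err:.1%}")
```

In practice, EMG-to-force mappings are nonlinear and user-dependent, which is why the paper's learned model and its per-user calibration step matter; this sketch only shows the supervised-regression framing of the problem.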