Many AR applications require users to perceive, estimate, and calibrate to the size of objects presented in the scene. Distortions in size perception in AR could influence how effectively skills transfer from AR to the real world. We investigated the carryover effects of calibrating size perception in AR to the real world (RW) by providing feedback and an opportunity for participants to correct their judgments in AR. In an empirical evaluation, we employed a three-phase experiment design. In the pretest phase, participants made baseline size estimates of target objects in RW, concurrently using both verbal reports and physical judgments. In the calibration phase, they estimated the size of targets in AR, received feedback, and subsequently corrected their judgments. Finally, participants made size estimates of target objects in the real world. Our findings revealed that the carryover effects of calibration successfully transferred from AR to RW for both verbal reports and physical judgments.
Calibrated Passability Perception in Virtual Reality Transfers to Augmented Reality
As applications for virtual reality (VR) and augmented reality (AR) technology increase, it will be important to understand how users perceive their action capabilities in virtual environments. Feedback about actions may help to calibrate perception for action opportunities (affordances) so that action judgments in VR and AR mirror actors’ real abilities. Previous work indicates that walking through a virtual doorway while wielding an object can calibrate the perception of one’s passability through feedback from collisions. In the current study, we aimed to replicate this calibration through feedback using a different paradigm in VR while also testing whether this calibration transfers to AR. Participants held a pole at 45° and made passability judgments in AR (pretest phase). Then, they made passability judgments in VR and received feedback on those judgments by walking through a virtual doorway while holding the pole (calibration phase). Participants then returned to AR to make posttest passability judgments. Results indicate that feedback calibrated participants’ judgments in VR. Moreover, this calibration transferred to the AR environment. In other words, after experiencing feedback in VR, passability judgments in VR and in AR became closer to an actor’s actual ability, which could make training applications in these technologies more effective.
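The pretest–calibration–posttest logic described in this abstract can be illustrated with a short sketch. All numbers and variable names below are hypothetical, not the authors' data or analysis code; the sketch simply shows how one might quantify whether judgments moved closer to an actor's actual critical aperture after feedback:

```python
import numpy as np

# Hypothetical example: judged minimum passable aperture widths (cm)
# for one actor whose actual critical aperture is 75 cm.
actual_critical_width = 75.0
pretest_judgments = np.array([88.0, 85.0, 90.0, 87.0])   # AR, before feedback
posttest_judgments = np.array([78.0, 80.0, 77.0, 79.0])  # AR, after VR calibration

def mean_signed_error(judgments, actual):
    """Positive values = requiring a wider aperture than actually needed."""
    return float(np.mean(judgments - actual))

pre_error = mean_signed_error(pretest_judgments, actual_critical_width)
post_error = mean_signed_error(posttest_judgments, actual_critical_width)
print(f"pretest error:  {pre_error:+.1f} cm")   # +12.5 cm
print(f"posttest error: {post_error:+.1f} cm")  # +3.5 cm, closer to 0 = better calibrated
```

A shrinking signed error from pretest to posttest is one simple way to operationalize the "judgments became closer to an actor's actual ability" finding.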
- Award ID(s):
- 1763254
- PAR ID:
- 10478503
- Publisher / Repository:
- Gagnon et al. 2023
- Date Published:
- Journal Name:
- ACM Transactions on Applied Perception
- Volume:
- 20
- Issue:
- 4
- ISSN:
- 1544-3558
- Page Range / eLocation ID:
- 1 to 16
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Today, augmented reality (AR) is most easily experienced through a mobile device such as a modern smartphone. For AR to be useful for applications such as training, it is important to understand how people perceive interactions with virtual objects presented to them via mobile AR. In this paper, we investigated two judgments of action capabilities (affordances) with virtual objects presented through smartphones: passing through an aperture and stepping over a gap. Our goals were to 1) determine if people can reliably scale these judgments to their body dimensions or capabilities and 2) explore whether cues presented in the context of the action could change their judgments. Assessments of perceived action capabilities were made in a pre/post-test design in which observers judged their affordances towards virtual objects prior to seeing an AR cue denoting their body dimension/capability, while viewing the cue, and after seeing the cue. Different patterns of results were found for the two affordances. For passing through, estimates became closer to shoulder width in the post-cue compared to the pre-cue block. For gap stepping, estimates were closer to actual stepping capability while viewing the cue, but did not persist when the cue was no longer present. Overall, our findings show that mobile smartphones can be used to assess perceived action capabilities with virtual targets and that AR cues can influence the perception of action capabilities in these devices. Our work provides a foundation for future studies investigating perception with the use of mobile AR with smartphones.
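The body-scaled analysis this abstract alludes to (judgments relative to shoulder width) can be sketched briefly. The values and names here are illustrative assumptions, not the study's data:

```python
import numpy as np

# Hypothetical pre-cue / post-cue aperture judgments (cm) for one observer,
# expressed relative to shoulder width as in body-scaled affordance analyses.
shoulder_width = 45.0
pre_cue_judgments = np.array([60.0, 58.0, 62.0])
post_cue_judgments = np.array([50.0, 48.0, 49.0])

def body_scaled_ratio(judgments, body_dim):
    # Mean judged passable width divided by the relevant body dimension;
    # values approaching 1.0 mean judgments approach the body dimension itself.
    return float(np.mean(judgments) / body_dim)

print(f"pre-cue ratio:  {body_scaled_ratio(pre_cue_judgments, shoulder_width):.2f}")
print(f"post-cue ratio: {body_scaled_ratio(post_cue_judgments, shoulder_width):.2f}")
```

A post-cue ratio closer to 1.0 would correspond to the reported pattern of pass-through estimates moving toward shoulder width after viewing the cue.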
-
In virtual reality (VR), established perception–action relationships break down because of conflicting and ambiguous sensorimotor inputs, inducing walking velocity underestimations. Here, we explore the effects of realigning perceptual sensory experiences with physical movements via augmented feedback on the estimation of virtual speed. We hypothesized that providing feedback about speed would lead to concurrent perceptual improvements and that these alterations would persist once the speedometer was removed. Ten young adults used immersive VR to view a virtual hallway translating at a series of fixed speeds. Participants were tasked with matching their walking speed on a self-paced treadmill to the optic flow in the environment. Information regarding walking speed accuracy was provided during augmented feedback trials via a real-time speedometer. We measured resulting walking velocity errors, as well as kinematic gait parameters. We found that the concordance between the virtual environment and gait speeds was higher when augmented feedback was provided during the trial. Furthermore, we observed retention effects beyond the intervention period via demonstrated smaller errors in speed perception accuracy and stronger concordance between perceived and actual speeds. Together, these results highlight a potential role for augmented feedback in guiding gait strategies that deviate away from predefined internal models of locomotion.
-
Interpupillary distance (IPD) is the most important parameter for creating a user-specific stereo parallax, which in turn is crucial for correct depth perception. This is why contemporary Head-Mounted Displays (HMDs) offer adjustable lenses to adapt to users’ individual IPDs. However, today’s Video See-Through Augmented Reality (VST AR) HMDs use fixed camera placements to reconstruct the stereoscopic view of a user’s environment. This leads to a potential mismatch between individual IPD settings and the fixed Inter-Camera Distances (ICD), which can produce perceptual incongruencies, limiting the usability and, potentially, the applicability of VST AR in depth-sensitive use cases. To investigate this incongruency between IPD and ICD, we conducted a 2 × 3 mixed-factor design user study using a near-field, open-loop reaching task comparing distance judgments in Virtual Reality (VR) and VST AR. We also investigated changes in reaching performance via perceptual calibration by incorporating a feedback phase between pre- and post-phase conditions, with a particular focus on the influence of IPD-ICD differences. Our Linear Mixed Model (LMM) analysis showed a significant difference between VR and VST AR, an effect of IPD-ICD mismatch, and a combined effect of both factors. However, subjective measures showed no effect, underlining the subconscious nature of the perception of VST AR. This novel insight and its consequences are discussed specifically for depth perception tasks in AR, eXtended Reality (XR), and potential use cases.
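A linear mixed model of the kind described in this abstract can be sketched with `statsmodels`. The data below are synthetic and the factor coding is an assumption (here, IPD-ICD mismatch is treated as a between-subjects covariate and display type as a within-subjects factor); this is an analogous analysis in spirit, not the authors' actual model:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic illustration only: 12 participants, IPD-ICD mismatch (mm) varied
# between subjects, display (VR vs. VST AR) varied within subjects,
# 4 reaching trials per cell; 'error' is a hypothetical reach-distance error.
rng = np.random.default_rng(42)
rows = []
for p in range(12):
    mismatch = [-6.0, 0.0, 6.0][p % 3]      # assumed IPD-ICD difference levels
    p_offset = rng.normal(0, 0.5)           # random participant intercept
    for display in ("VR", "VST_AR"):
        for _ in range(4):
            err = (0.8 * (display == "VST_AR")   # simulated display effect
                   + 0.1 * mismatch + p_offset + rng.normal(0, 1.0))
            rows.append({"participant": p, "display": display,
                         "mismatch": mismatch, "error": err})
df = pd.DataFrame(rows)

# Random intercept per participant; fixed effects for display, mismatch,
# and their interaction.
model = smf.mixedlm("error ~ display * mismatch", df, groups=df["participant"])
result = model.fit()
print(result.params.round(2))
```

The `display * mismatch` term tests the "combined effect of both factors" the abstract reports, with participant as the grouping (random-effects) variable.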
-
The perception of distance is a complex process that often involves sensory information beyond that of just vision. In this work, we investigated if depth perception based on auditory information can be calibrated, a process by which perceptual accuracy of depth judgments can be improved by providing feedback and then performing corrective actions. We further investigated if perceptual learning through carryover effects of calibration occurs at different levels of a virtual environment’s visibility based on different levels of virtual lighting. Users performed an auditory depth judgment task over several trials in which they walked to where they perceived an aural sound to be, yielding absolute estimates of perceived distance. This task was performed in three sequential phases: pretest, calibration, posttest. Feedback on the perceptual accuracy of distance estimates was only provided in the calibration phase, allowing us to study the calibration of auditory depth perception. We employed a 2 (Visibility of virtual environment) × 3 (Phase) × 5 (Target Distance) multi-factorial design, manipulating the phase and target distance as within-subjects factors, and the visibility of the virtual environment as a between-subjects factor. Our results revealed that users generally tend to underestimate aurally perceived distances in VR, similar to the distance compression effects that commonly occur in visual distance perception in VR. We found that auditory depth estimates, obtained using an absolute measure, can be calibrated to become more accurate through feedback and corrective action. In terms of environment visibility, we find that environments visible enough to reveal their extent may contain visual information that users attune to in scaling aurally perceived depth.
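The distance-compression effect this abstract describes is commonly summarized as the ratio of perceived to actual distance. The sketch below uses hypothetical walked-distance estimates (not the study's data) to show how such a ratio would capture calibration from pretest to posttest:

```python
import numpy as np

# Hypothetical walked-distance estimates (m) at five target distances,
# before and after feedback; all numbers are illustrative only.
target_distances = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
pretest_walked   = np.array([1.6, 2.3, 3.1, 3.8, 4.5])  # compressed estimates
posttest_walked  = np.array([1.9, 2.9, 3.8, 4.8, 5.7])  # after calibration

def compression_ratio(perceived, actual):
    """Mean perceived/actual ratio; values below 1.0 indicate underestimation."""
    return float(np.mean(perceived / actual))

print(f"pretest ratio:  {compression_ratio(pretest_walked, target_distances):.2f}")
print(f"posttest ratio: {compression_ratio(posttest_walked, target_distances):.2f}")
```

A ratio moving from well below 1.0 toward 1.0 after the calibration phase would correspond to the reported reduction in auditory distance underestimation.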