This content will become publicly available on March 8, 2026

Title: Comparing Absolute Size Perception in Optical See-Through Augmented Reality and Real World Viewing Using Verbal and Physical Judgments
This empirical evaluation aimed to investigate how size perception differs between OST AR and the real world, focusing on two judgment methods: verbal reports and physical judgments. Using a within-subjects experimental design, participants viewed target objects of different sizes in both AR and real-world conditions and estimated their sizes using verbal and physical judgment methods across multiple trials. The study addressed two key hypotheses: (H1) that size perception in AR would differ from the real world, potentially due to rendering limitations in OST-HMDs, and (H2) that verbal reports and physical judgments would yield different levels of accuracy due to the distinct cognitive and perceptual processes involved in each method. Our findings supported these hypotheses, revealing key differences in size perception between the two judgment methods and viewing conditions. Participants consistently underestimated object sizes when using verbal reports in both AR and real-world conditions, with more pronounced errors in AR. In contrast, physical judgments yielded more accurate size estimates under both viewing conditions. Notably, the accuracy of verbal reports decreased as target sizes increased, a trend that was particularly evident in AR. These results underscore the perceptual challenges associated with verbal size judgments in AR and their potential limitations in applications requiring precise size estimations. By highlighting the differences in accuracy and consistency between verbal and physical judgment methods, this study contributes to a deeper understanding of size perception in OST AR and real-world contexts.
Award ID(s):
2007435
PAR ID:
10621239
Author(s) / Creator(s):
; ; ;
Publisher / Repository:
IEEE
Date Published:
ISBN:
979-8-3315-3645-9
Page Range / eLocation ID:
307 to 317
Subject(s) / Keyword(s):
Augmented Reality; Absolute Size Perception; Perception-action; User Studies
Format(s):
Medium: X
Location:
Saint Malo, France
Sponsoring Org:
National Science Foundation
More Like This
  1. Many AR applications require users to perceive, estimate, and calibrate to the size of objects presented in the scene. Distortions in size perception in AR could potentially influence the effectiveness of skills transferred from AR to the real world. We investigated the after-effects, or carry-over effects, of calibration of size perception in AR to the real world (RW) by providing feedback and an opportunity for participants to correct their judgments in AR. In an empirical evaluation, we employed a three-phase experiment design. In the pretest phase, participants made size estimates of target objects concurrently using both verbal reports and physical judgments in the RW as a baseline. In the calibration phase, they estimated the size of targets, received feedback, and subsequently corrected their judgments. Finally, participants made size estimates of target objects in the real world. Our findings revealed that the carry-over effects of calibration successfully transferred from AR to RW for both verbal report and physical judgment methods.
  2. Cho, Isaac; Hoermann, Simon; Krösl, Katharina; Zielasko, Daniel; Cidota, Marina (Ed.)
    An important research question in optical see-through (OST) augmented reality (AR) is, how accurately and precisely can a virtual object's real-world location be perceived? Previously, a method was developed to measure the perceived three-dimensional location of virtual objects in OST AR. In this research, a replication study is reported, which examined whether the perceived location of virtual objects is biased in the direction of the dominant eye. The successful replication analysis suggests that perceptual accuracy is not biased in the direction of the dominant eye. Compared to the previous study's findings, overall perceptual accuracy increased, and precision was similar.
  3. Rogowitz, Bernice E; Pappas, Thrasyvoulos N (Ed.)
    Augmented reality (AR) combines elements of the real world with additional virtual content, creating a blended viewing environment. Optical see-through AR (OST-AR) accomplishes this by using a transparent beam splitter to overlay virtual elements over a user's view of the real world. However, the inherent see-through nature of OST-AR carries challenges for color appearance, especially around the appearance of darker and less chromatic objects. When displaying human faces—a promising application of AR technology—these challenges disproportionately affect darker skin tones, making them appear more transparent than lighter skin tones. Still, some transparency in the rendered object may not be entirely negative; people's evaluations of transparency when interacting with other humans in AR-mediated modalities are not yet fully understood. In this work, two psychophysical experiments were conducted to assess how people evaluate OST-AR transparency across several characteristics, including different skin tones, object types, lighting conditions, and display types. The results provide a scale of perceived transparency, allowing comparisons to transparency on conventional emissive displays. The results also demonstrate how AR transparency impacts perceptions of object preference and fit within the environment. These results reveal several areas in need of further attention, particularly regarding darker skin tones, lighter ambient lighting, and the display of human faces more generally. This work may be useful in guiding the development of OST-AR technology, and it emphasizes the importance of AR design goals, perception of human faces, and optimizing visual appearance in extended reality systems.
  4. Each view of our environment captures only a subset of our immersive surroundings. Yet, our visual experience feels seamless. A puzzle for human neuroscience is to determine what cognitive mechanisms enable us to overcome our limited field of view and efficiently anticipate new views as we sample our visual surroundings. Here, we tested whether memory-based predictions of upcoming scene views facilitate efficient perceptual judgments across head turns. We tested this hypothesis using immersive, head-mounted virtual reality (VR). After learning a set of immersive real-world environments, participants (n = 101 across 4 experiments) were briefly primed with a single view from a studied environment and then turned left or right to make a perceptual judgment about an adjacent scene view. We found that participants’ perceptual judgments were faster when they were primed with images from the same (vs. neutral or different) environments. Importantly, priming required memory: it only occurred in learned (vs. novel) environments, where the link between adjacent scene views was known. Further, consistent with a role in supporting active vision, priming only occurred in the direction of planned head turns and only benefited judgments for scene views presented in their learned spatiotopic positions. Taken together, we propose that memory-based predictions facilitate rapid perception across large-scale visual actions, such as head and body movements, and may be critical for efficient behavior in complex immersive environments. 
  5. Optical see-through augmented reality (OST-AR) is a developing technology with exciting applications including medicine, industry, education, and entertainment. OST-AR creates a mix of virtual and real using an optical combiner that blends images and graphics with the real-world environment. Such an overlay of visual information is simultaneously futuristic and familiar: like the sci-fi navigation and communication interfaces in movies, but also much like banal reflections in glass windows. OST-AR's transparent displays cause background bleed-through, which distorts color and contrast, yet virtual content is usually easily understandable. Perceptual scission, or the cognitive separation of layers, is an important mechanism, influenced by transparency, depth, parallax, and more, that helps us see what is real and what is virtual. In examples from Pepper's Ghost, veiling luminance, mixed material modes, window shopping, and today's OST-AR systems, transparency and scission provide surprising—and ordinary—results. Ongoing psychophysical research is addressing perceived characteristics of color, material, and images in OST-AR, testing and harnessing the perceptual effects of transparency and scission. The results help both to understand the underlying visual mechanisms and to improve tomorrow's AR systems.