Title: A Replication Study to Measure the Perceived Three-Dimensional Location of Virtual Objects in Optical See Through Augmented Reality
An important research question in optical see-through (OST) augmented reality (AR) is, how accurately and precisely can a virtual object’s real-world location be perceived? Previously, a method was developed to measure the perceived three-dimensional location of virtual objects in OST AR. In this research, a replication study is reported, which examined whether the perceived locations of virtual objects are biased in the direction of the dominant eye. The successful replication analysis suggests that perceptual accuracy is not biased in the direction of the dominant eye. Compared to the previous study’s findings, overall perceptual accuracy increased, and precision was similar.
Award ID(s):
1937565
PAR ID:
10390190
Author(s) / Creator(s):
; ; ;
Editor(s):
Cho, Isaac; Hoermann, Simon; Krösl, Katharina; Zielasko, Daniel; Cidota, Marina
Date Published:
Journal Name:
IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops
Page Range / eLocation ID:
796 to 797
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Optical see-through Augmented Reality (OST-AR) is a developing technology with exciting applications including medicine, industry, education, and entertainment. OST-AR creates a mix of virtual and real using an optical combiner that blends images and graphics with the real-world environment. Such an overlay of visual information is simultaneously futuristic and familiar: like the sci-fi navigation and communication interfaces in movies, but also much like banal reflections in glass windows. OST-AR’s transparent displays cause background bleed-through, which distorts color and contrast, yet virtual content is usually easily understandable. Perceptual scission, or the cognitive separation of layers, is an important mechanism, influenced by transparency, depth, parallax, and more, that helps us see what is real and what is virtual. In examples from Pepper’s Ghost, veiling luminance, mixed material modes, window shopping, and today’s OST-AR systems, transparency and scission provide surprising – and ordinary – results. Ongoing psychophysical research is addressing perceived characteristics of color, material, and images in OST-AR, testing and harnessing the perceptual effects of transparency and scission. Results help both to understand the visual mechanisms and to improve tomorrow’s AR systems.
  2. For optical see-through augmented reality (AR), a new method for measuring the perceived three-dimensional location of virtual objects is presented, where participants verbally report a virtual object’s location relative to both a vertical and a horizontal grid. The method is tested with a small (1.95 × 1.95 × 1.95 cm) virtual object at distances of 50 to 80 cm, viewed through a Microsoft HoloLens 1st-generation AR display. Two experiments examine two different virtual object designs, whether turning in a circle between reported object locations disrupts HoloLens tracking, and whether accuracy errors, including a rightward bias and underestimated depth, might be due to systematic errors that are restricted to a particular display. Turning in a circle did not disrupt HoloLens tracking, and testing with a second display did not suggest systematic errors restricted to a particular display. Instead, the experiments are consistent with the hypothesis that, when looking downwards at a horizontal plane, HoloLens 1st-generation displays exhibit a systematic rightward perceptual bias. Precision analysis suggests that the method could measure the perceived location of a virtual object to an accuracy of less than 1 mm.
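    For readers interested in how accuracy and precision might be summarized from such grid-based location reports, the following is a minimal illustrative sketch, not the authors’ analysis code: it assumes each trial yields a reported 3D position in centimeters that is compared against the true rendered position, and all function names, variable names, and numbers are hypothetical.

    ```python
    import numpy as np

    def accuracy_and_precision(reported_cm, true_cm):
        """Summarize perceived-location judgments for one virtual object position.

        reported_cm: (n_trials, 3) array of reported x, y, z positions in cm
        true_cm:     (3,) array giving the true rendered position in cm

        Accuracy  = mean signed error per axis (bias, e.g. a rightward shift in x).
        Precision = standard deviation of the reports per axis (spread of judgments).
        """
        reported_cm = np.asarray(reported_cm, dtype=float)
        errors = reported_cm - np.asarray(true_cm, dtype=float)
        accuracy = errors.mean(axis=0)                # signed bias in x, y, z
        precision = reported_cm.std(axis=0, ddof=1)   # variability of the judgments
        return accuracy, precision

    # Hypothetical example: ten reports of an object rendered at (0, 0, 60) cm,
    # built to show a small rightward (positive x) bias and underestimated depth (z).
    rng = np.random.default_rng(0)
    reports = np.array([0.4, 0.0, 58.0]) + rng.normal(0.0, 0.3, size=(10, 3))
    bias, spread = accuracy_and_precision(reports, true_cm=[0.0, 0.0, 60.0])
    print("accuracy (cm):", bias)
    print("precision (cm):", spread)
    ```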
  3. This empirical evaluation aimed to investigate how size perception differs between OST AR and the real world, focusing on two judgment methods: verbal reports and physical judgments. Using a within-subjects experimental design, participants viewed target objects of different sizes in both AR and real-world conditions and estimated their sizes using verbal and physical judgment methods across multiple trials. The study addressed two key hypotheses: (H1) that size perception in AR would differ from the real world, potentially due to rendering limitations in OST-HMDs, and (H2) that verbal reports and physical judgments would yield different levels of accuracy due to distinct cognitive and perceptual processes involved in each method. Our findings supported these hypotheses, revealing key differences in size perception between the two judgment methods and viewing conditions. Participants consistently underestimated object sizes when using verbal reports in both AR and real-world conditions, with more pronounced errors in AR. In contrast, physical judgments yielded more accurate size estimates under both viewing conditions. Notably, the accuracy of verbal reports decreased as target sizes increased, a trend that was particularly evident in AR. These results underscore the perceptual challenges associated with verbal size judgments in AR and their potential limitations in applications requiring precise size estimations. By highlighting the differences in accuracy and consistency between verbal and physical judgment methods, this study contributes to a deeper understanding of size perception in OST AR and real-world contexts.
  4. Rogowitz, Bernice E; Pappas, Thrasyvoulos N (Ed.)
    Augmented reality (AR) combines elements of the real world with additional virtual content, creating a blended viewing environment. Optical see-through AR (OST-AR) accomplishes this by using a transparent beam splitter to overlay virtual elements over a user’s view of the real world. However, the inherent see-through nature of OST-AR carries challenges for color appearance, especially around the appearance of darker and less chromatic objects. When displaying human faces—a promising application of AR technology—these challenges disproportionately affect darker skin tones, making them appear more transparent than lighter skin tones. Still, some transparency in the rendered object may not be entirely negative; people’s evaluations of transparency when interacting with other humans in AR-mediated modalities are not yet fully understood. In this work, two psychophysical experiments were conducted to assess how people evaluate OST-AR transparency across several characteristics including different skin tones, object types, lighting conditions, and display types. The results provide a scale of perceived transparency allowing comparisons to transparency for conventional emissive displays. The results also demonstrate how AR transparency impacts perceptions of object preference and fit within the environment. These results reveal several areas with need for further attention, particularly regarding darker skin tones, lighter ambient lighting, and displaying human faces more generally. This work may be useful in guiding the development of OST-AR technology, and emphasizes the importance of AR design goals, perception of human faces, and optimizing visual appearance in extended reality systems. 
  5. Near-eye display systems for augmented reality (AR) aim to seamlessly merge virtual content with the user’s view of the real world. A substantial limitation of current systems is that they only present virtual content over a limited portion of the user’s natural field of view (FOV). This limitation reduces the immersion and utility of these systems. Thus, it is essential to quantify FOV coverage in AR systems and understand how to maximize it. It is straightforward to determine the FOV coverage for monocular AR systems based on the system architecture. However, stereoscopic AR systems that present 3D virtual content create a more complicated scenario because the two eyes’ views do not always completely overlap. The introduction of partial binocular overlap in stereoscopic systems can potentially expand the perceived horizontal FOV coverage, but it can also introduce perceptual nonuniformity artifacts. In this article, we first review the principles of binocular FOV overlap for natural vision and for stereoscopic display systems. We report the results of a set of perceptual studies that examine how different amounts and types of horizontal binocular overlap in stereoscopic AR systems influence the perception of nonuniformity across the FOV. We then describe how to quantify the horizontal FOV in stereoscopic AR when taking 3D content into account. We show that all stereoscopic AR systems result in a variable horizontal FOV coverage and variable amounts of binocular overlap depending on fixation distance. Taken together, these results provide a framework for optimizing perceived FOV coverage and minimizing perceptual artifacts in stereoscopic AR systems for different use cases.
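    As a rough illustration of why binocular overlap and total coverage vary with content distance, the following sketch (not taken from the article) computes the horizontal binocular overlap and total coverage on a frontal plane, under the simplifying assumption of parallel, per-eye-centered display frusta; all names and numbers are hypothetical.

    ```python
    import math

    def binocular_coverage_deg(fov_per_eye_deg, ipd_cm, plane_distance_cm):
        """Horizontal binocular overlap and total coverage on a frontal plane.

        Assumes each eye's display frustum points straight ahead and is centered
        on that eye (a simplification; real OST-AR systems may use converged or
        partially overlapped frusta). Angles are measured from a cyclopean point
        midway between the eyes.
        """
        half = math.radians(fov_per_eye_deg / 2.0)
        half_width = plane_distance_cm * math.tan(half)    # monocular half-extent on the plane
        overlap_half = max(half_width - ipd_cm / 2.0, 0.0)  # seen by both eyes
        union_half = half_width + ipd_cm / 2.0              # seen by at least one eye
        overlap_deg = 2.0 * math.degrees(math.atan2(overlap_half, plane_distance_cm))
        total_deg = 2.0 * math.degrees(math.atan2(union_half, plane_distance_cm))
        return overlap_deg, total_deg

    # Hypothetical numbers: 40 deg per eye, 6.3 cm IPD, content planes at 30 cm vs 200 cm.
    for d in (30.0, 200.0):
        overlap, total = binocular_coverage_deg(40.0, 6.3, d)
        print(f"plane at {d:.0f} cm: overlap {overlap:.1f} deg, total coverage {total:.1f} deg")
    ```

    Under these assumptions, near content yields less binocular overlap but slightly wider total coverage than far content, and both converge toward the per-eye FOV as distance grows, which is one simple way to see why coverage in stereoscopic AR cannot be described by a single fixed number.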