


Award ID contains: 1937565


  1. Triangulation by walking is a method for measuring perceived distance in which observers walk a triangular path. It has been used at action space distances of approximately 1.5 to 30 meters. In this work, a conceptual replication of these triangulation by walking methods is discussed and evaluated for measuring the perceived distance of an object seen through a window set into a wall. The motivation is to use triangulation by walking to study how perceived distance operates when augmented reality (AR) is used to visualize objects located behind opaque surfaces, an AR application termed “x-ray vision.” This paper reports on experiences replicating the implementation of triangulation by walking reported by Fukusima, Da Silva, and Loomis (1997). Their method was conceptually replicated in both outdoor and indoor settings, and was further extended to measure perceived distances of objects seen through a wall. These extensions are discussed in some detail, focusing on the modifications to the triangulation by walking method as well as the ramifications of these changes. Problems arising from using triangular geometry to calculate perceived target locations are also introduced, and an alternate method is proposed that diminishes these problematic effects.
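The core geometric step in triangulation by walking is intersecting two sight lines: the observer views the target from one point, walks to a second point, and faces (or walks toward) the target again; the perceived target location is the intersection of the two implied lines. A minimal sketch in the ground plane, assuming points and unit directions as 2-D vectors (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def triangulate(origin_a, dir_a, origin_b, dir_b):
    """Intersect two sight lines, each given as an origin point and a
    direction vector in the ground plane, and return the implied
    perceived target location."""
    origin_a, dir_a = np.asarray(origin_a, float), np.asarray(dir_a, float)
    origin_b, dir_b = np.asarray(origin_b, float), np.asarray(dir_b, float)
    # Solve origin_a + t*dir_a == origin_b + s*dir_b for (t, s).
    A = np.column_stack([dir_a, -dir_b])
    t, _ = np.linalg.solve(A, origin_b - origin_a)
    return origin_a + t * dir_a
```

For example, an observer at the origin facing a target 10 m straight ahead who then walks 2 m to the right and faces the same target yields sight lines that intersect at (0, 10). Small errors in the second facing direction shift this intersection, which is one reason the paper discusses problems with the triangular geometry.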
  2. Blascheck, Tanja ; Bradshaw, Jessica ; Vrzakova, Hana (Ed.)
    Virtual Reality (VR) technology has advanced to include eye tracking, allowing novel research, such as investigating how our visual system coordinates eye movements with changes in perceptual depth. The purpose of this study was to examine whether eye tracking could track perceptual depth changes during a visual discrimination task. We derived two depth-dependent variables from eye tracker data: eye vergence angle (EVA) and interpupillary distance (IPD). As hypothesized, our results revealed that shifting gaze from near to far depth significantly decreased EVA and increased IPD, while the opposite pattern was observed when shifting from far to near. Importantly, the amount of change in these variables tracked closely with relative changes in perceptual depth, supporting the hypothesis that eye tracker data may be used to infer real-time changes in perceptual depth in VR. Our method could serve as a new tool to adaptively render information based on depth and improve the VR user experience.
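The depth dependence of eye vergence angle follows from simple geometry: with symmetric fixation, the two visual axes converge on the target, so the vergence angle shrinks as fixation depth grows. A minimal sketch of that relationship, assuming a fixed interpupillary distance and fixation on the midline (illustrative values, not the study's data):

```python
import math

def vergence_angle_deg(ipd_m, depth_m):
    """Eye vergence angle in degrees for fixation at depth_m meters,
    assuming symmetric fixation and interpupillary distance ipd_m.
    Each eye rotates atan((ipd/2)/depth) toward the midline."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / depth_m))
```

With a typical 63 mm IPD, fixating at 0.5 m demands roughly a 7.2° vergence angle, while fixating at 5 m demands well under 1°, consistent with the reported decrease in EVA when gaze shifts from near to far.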
  3. Cho, Isaac ; Hoermann, Simon ; Krösl, Katharina ; Zielasko, Daniel ; Cidota, Marina (Ed.)
    Accurate and usable x-ray vision is a significant goal in augmented reality (AR) development. X-ray vision, or the ability to comprehend location and object information when it is presented through an opaque barrier, needs to successfully convey scene information to be a viable use case for AR. Further, this investigation should be performed in an ecologically valid context in order to best test x-ray vision. This research seeks to experimentally evaluate the perceived object location of stimuli presented with x-ray vision, as compared to real-world perceived object location through a window, at action space distances of 1.5 to 15 meters. 
  4. Cho, Isaac ; Hoermann, Simon ; Krösl, Katharina ; Zielasko, Daniel ; Cidota, Marina (Ed.)
    An important research question in optical see-through (OST) augmented reality (AR) is, how accurately and precisely can a virtual object’s real world location be perceived? Previously, a method was developed to measure the perceived three-dimensional location of virtual objects in OST AR. In this research, a replication study is reported, which examined whether the perceived location of virtual objects is biased in the direction of the dominant eye. The successful replication analysis suggests that perceptual accuracy is not biased in the direction of the dominant eye. Compared to the previous study’s findings, overall perceptual accuracy increased, and precision was similar.
  5. Many augmented reality (AR) applications require observers to shift their gaze between AR and real-world content. To date, commercial optical see-through (OST) AR displays have presented content at either a single focal distance or a small number of fixed focal distances, while real-world stimuli can occur at a variety of focal distances. Therefore, when shifting gaze between AR and real-world content, observers must often change their eye’s accommodative state in order to view the new content in sharp focus. When performed repetitively, this can reduce task performance and increase eye fatigue. However, these effects may be underreported, because past research has not yet considered the potential additional effect of distracting real-world backgrounds. An experimental method that analyzes background effects is presented, using a text-based visual search task that requires integrating information presented in both AR and the real world. An experiment is reported, which examined the effect of a distracting background versus a blank background, at focal switching distances of 0, 1.33, 2.0, and 3.33 meters. Qualitatively, a majority of the participants reported that the distracting background made the task more difficult and fatiguing. Quantitatively, increasing the focal switching distance reduced task performance and increased eye fatigue. However, changing the background between blank and distracting did not result in significant measured differences. Suggestions are given for further efforts to examine background effects.
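Accommodative demand is commonly expressed in diopters, the reciprocal of focal distance in meters, and one way to quantify a focal switch is the dioptric difference between the two focal distances. A minimal sketch under that assumption (the function name is illustrative, and this summary does not specify whether the reported switching distances are metric or dioptric differences):

```python
def accommodation_switch_diopters(d1_m, d2_m):
    """Change in accommodative demand, in diopters, when shifting
    focus between distances d1_m and d2_m (both in meters, > 0).
    Demand at distance d is 1/d diopters, so the switch magnitude
    is the absolute dioptric difference."""
    return abs(1.0 / d1_m - 1.0 / d2_m)
```

For example, switching between AR content at 2.0 m and real-world text at 0.5 m is a 1.5-diopter change, while switching between 2.0 m and 3.0 m is only about 0.17 diopters, even though the metric gap is the same; this is why focal switches involving near content are the more fatiguing ones.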
  6. For optical see-through augmented reality (AR), a new method for measuring the perceived three-dimensional location of virtual objects is presented, where participants verbally report a virtual object’s location relative to both a vertical and horizontal grid. The method is tested with a small (1.95 × 1.95 × 1.95 cm) virtual object at distances of 50 to 80 cm, viewed through a Microsoft HoloLens 1st generation AR display. Two experiments examine two different virtual object designs, whether turning in a circle between reported object locations disrupts HoloLens tracking, and whether accuracy errors, including a rightward bias and underestimated depth, might be due to systematic errors that are restricted to a particular display. Turning in a circle did not disrupt HoloLens tracking, and testing with a second display did not suggest systematic errors restricted to a particular display. Instead, the experiments are consistent with the hypothesis that, when looking downwards at a horizontal plane, HoloLens 1st generation displays exhibit a systematic rightward perceptual bias. Precision analysis suggests that the method could measure the perceived location of a virtual object within an accuracy of less than 1 mm.
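The grid-based verbal report method lends itself to a simple coordinate reconstruction: grid readings are converted to a 3-D point, and accuracy is the offset from the object's true location. A minimal sketch, assuming hypothetical (column, row) readings and a hypothetical cell size — the actual grid spacing and axis conventions are not given in this summary:

```python
import numpy as np

# Hypothetical grid cell size in cm; the real spacing is not stated here.
CELL_CM = 0.5

def perceived_location(h_col, h_row, v_row):
    """Map verbal grid readings to a 3-D point in cm, assuming the
    horizontal grid gives lateral position (h_col) and depth (h_row)
    and the vertical grid gives height (v_row)."""
    return np.array([h_col, v_row, h_row], dtype=float) * CELL_CM

def location_error(perceived, actual):
    """Signed per-axis error and Euclidean error, both in cm."""
    diff = np.asarray(perceived, float) - np.asarray(actual, float)
    return diff, float(np.linalg.norm(diff))
```

Averaging such errors across trials gives the accuracy (mean offset, e.g. the rightward bias) while their spread gives the precision (the sub-millimeter figure the abstract cites).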
  7. Usable x-ray vision has long been a goal in augmented reality research and development. X-ray vision, or the ability to view and understand information presented through an opaque barrier, would be eminently useful across a variety of domains. Unfortunately, however, the effect of x-ray vision on situation awareness, an operator's understanding of a task or environment, has not been significantly studied. This is an important question; if x-ray vision does not increase situation awareness, of what use is it? Thus, we have developed an x-ray vision system, in order to investigate situation awareness in the context of action space distances.
  8. For optical see-through augmented reality (AR), a new method for measuring the perceived three-dimensional location of a small virtual object is presented, where participants verbally report the virtual object's location relative to both a horizontal and vertical grid. The method is tested with a Microsoft HoloLens AR display, and examines two different virtual object designs, whether turning in a circle between reported object locations disrupts HoloLens tracking, and whether accuracy errors found with a HoloLens display might be due to systematic errors that are restricted to that particular display. Turning in a circle did not disrupt HoloLens tracking, and a second HoloLens did not suggest systematic errors restricted to a specific display. The proposed method could measure the perceived location of a virtual object to a precision of ~1 mm.
  9. In room-clearing tasks, SWAT team members suffer from a lack of initial environmental information: knowledge about what is in a room and what relevance or threat level it represents for mission parameters. Normally this gap in situation awareness is rectified only upon room entry, forcing SWAT team members to rely on quick responses and near-instinctual reactions. This can lead to dangerously escalating situations or missed important information which, in turn, can increase the likelihood of injury and even mortality. Thus, we present an x-ray vision system for the dynamic scanning and display of room content, using a robotic platform to mitigate operator risk. This system maps a room using a robot-equipped stereo depth camera and, using an augmented reality (AR) system, presents the resulting geographic information according to the perspective of each officer. This intervention has the potential to notably lower risk and increase officer situation awareness, all while team members are in the relative safety of cover. With these potential stakes, it is important to test the viability of this system natively and in an operational SWAT team context.
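Presenting robot-mapped room geometry "according to the perspective of each officer" amounts to a per-officer frame transform: points mapped in a shared world frame are re-expressed in each officer's tracked head frame before rendering. A minimal sketch, assuming the head pose is given as a rotation matrix and translation in the world frame (names and interface are illustrative, not from the system described):

```python
import numpy as np

def to_officer_frame(p_world, R_head, t_head):
    """Express a world-frame point in one officer's head frame.

    R_head, t_head give the head's pose in the world frame, so the
    world-to-head transform is the inverse: R^T * (p - t)."""
    p_world = np.asarray(p_world, float)
    return R_head.T @ (p_world - t_head)
```

Running this transform over the room's mapped point cloud, once per officer per frame, is what lets each AR display show the same scanned geometry from that officer's own viewpoint.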