- Award ID(s):
- 2238313
- PAR ID:
- 10510409
- Publisher / Repository:
- Springer
- Date Published:
- Journal Name:
- In: Bebis, G., et al. Advances in Visual Computing. ISVC 2023. Lecture Notes in Computer Science
- ISBN:
- 978-3-031-47966-3
- Subject(s) / Keyword(s):
- Augmented Reality · Virtual Reality · Immersive Analytics · User Interaction
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
- In a seminal article on augmented reality (AR) [7], Ron Azuma defines AR as a variation of virtual reality (VR), which completely immerses a user inside a synthetic environment. Azuma says “In contrast, AR allows the user to see the real world, with virtual objects superimposed upon or composited with the real world” [7] (emphasis added). Typically, a user wears a tracked stereoscopic head-mounted display (HMD) or holds a smartphone, showing the real world through optical or video means, with superimposed graphics that provide the appearance of virtual content related to and registered with the real world. While AR has been around since the 1960s [72], it is experiencing a renaissance of development and consumer interest. With exciting products from Microsoft (HoloLens), Metavision (Meta 2), and others; Apple’s AR Developer’s Kit (ARKit); and well-funded startups like Magic Leap [54], the future looks even brighter, with the expectation that AR technologies will be absorbed into our daily lives and strongly influence our society in the foreseeable future.
- Augmented reality (AR) is a technology that integrates 3D virtual objects into the physical world in real time, while virtual reality (VR) is a technology that immerses users in an interactive 3D virtual environment. The rapid development of AR and VR technologies has reshaped how people interact with the physical world. This presentation will outline the results from two AR and one Web-based VR coastal engineering projects, motivating the next stage in the development of the augmented reality package for coastal students, engineers, and planners.
- Near-eye display systems for augmented reality (AR) aim to seamlessly merge virtual content with the user’s view of the real world. A substantial limitation of current systems is that they present virtual content over only a limited portion of the user’s natural field of view (FOV). This limitation reduces the immersion and utility of these systems. Thus, it is essential to quantify FOV coverage in AR systems and understand how to maximize it. It is straightforward to determine the FOV coverage for monocular AR systems based on the system architecture. However, stereoscopic AR systems that present 3D virtual content create a more complicated scenario because the two eyes’ views do not always completely overlap. The introduction of partial binocular overlap in stereoscopic systems can potentially expand the perceived horizontal FOV coverage, but it can also introduce perceptual nonuniformity artifacts. In this article, we first review the principles of binocular FOV overlap for natural vision and for stereoscopic display systems. We report the results of a set of perceptual studies that examine how different amounts and types of horizontal binocular overlap in stereoscopic AR systems influence the perception of nonuniformity across the FOV. We then describe how to quantify the horizontal FOV in stereoscopic AR when taking 3D content into account. We show that all stereoscopic AR systems result in variable horizontal FOV coverage and variable amounts of binocular overlap depending on fixation distance (a simplified geometric sketch of this dependence follows this list). Taken together, these results provide a framework for optimizing perceived FOV coverage and minimizing perceptual artifacts in stereoscopic AR systems for different use cases.
- Virtual reality (VR) is a relatively new and rapidly growing field that is becoming accessible to the larger research community as well as commercially available for entertainment. Relatively cheap, commercially available head-mounted displays (HMDs) are the largest reason for this increase in availability. This work uses Unity and an HMD to create a VR environment that displays a 360° video of a pre-recorded patient handoff between a nurse and a doctor. The VR environment went through several designs during development; this work discusses each stage of the design and the unique challenges we encountered. It also discusses the implementation of the user study and the visualization of the collected eye tracking data (an illustrative gaze-mapping sketch follows this list).
- Demand is growing for markerless augmented reality (AR) experiences, but designers of the real-world spaces that host them still have to rely on inexact, qualitative guidelines about the visual environment to try to facilitate accurate pose tracking. Furthermore, the need for visual texture to support markerless AR is often at odds with human aesthetic preferences, and understanding how to balance these competing requirements is challenging due to the siloed nature of the relevant research areas. To address this, we present an integrated design methodology for AR spaces that incorporates both tracking and human factors into the design process. On the tracking side, we develop the first VI-SLAM evaluation technique that combines the flexibility and control of virtual environments with real inertial data. We use it to perform systematic, quantitative experiments on the effect of visual texture on pose estimation accuracy; through 2000 trials in 20 environments, we reveal the impact of both texture complexity and edge strength (illustrative examples of such texture metrics are sketched after this list). On the human side, we show how virtual reality (VR) can be used to evaluate user satisfaction with environments, and we highlight how this can be tailored to AR research and use cases. Finally, we demonstrate our integrated design methodology with a case study on AR museum design, in which we conduct both VI-SLAM evaluations and a VR-based user study of four different museum environments.
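For the stereoscopic FOV abstract above, here is a minimal geometric sketch (Python) of how total horizontal FOV coverage and binocular overlap can both vary with fixation distance. It assumes a deliberately simplified model: per-eye nasal and temporal half-angles, a fixed interpupillary distance (IPD), and projection onto a fronto-parallel plane at the fixation distance. The function name, parameters, and default values are illustrative assumptions, not taken from the paper.

```python
import math

def horizontal_coverage(nasal_deg, temporal_deg, ipd_m=0.063, fixation_m=2.0):
    """Estimate perceived horizontal FOV and binocular overlap (degrees)
    for a stereoscopic display, using a simplified projection onto a
    fronto-parallel plane at the fixation distance."""
    d = fixation_m
    half_ipd = ipd_m / 2.0
    nasal = d * math.tan(math.radians(nasal_deg))        # nasal extent on the plane
    temporal = d * math.tan(math.radians(temporal_deg))  # temporal extent on the plane

    # Horizontal interval (metres on the plane) covered by each eye.
    left_eye = (-half_ipd - temporal, -half_ipd + nasal)   # left eye at x = -IPD/2
    right_eye = (half_ipd - nasal, half_ipd + temporal)    # right eye at x = +IPD/2

    # Union -> total coverage; intersection -> binocular overlap.
    total = (min(left_eye[0], right_eye[0]), max(left_eye[1], right_eye[1]))
    overlap = (max(left_eye[0], right_eye[0]), min(left_eye[1], right_eye[1]))

    def span_deg(lo, hi):
        if hi <= lo:
            return 0.0
        # Angle subtended at a cyclopean viewpoint midway between the eyes.
        return math.degrees(math.atan2(hi, d) - math.atan2(lo, d))

    return span_deg(*total), span_deg(*overlap)

# Both coverage and overlap change with fixation distance.
for dist in (0.5, 1.0, 2.0, 10.0):
    fov, ov = horizontal_coverage(nasal_deg=20, temporal_deg=20, fixation_m=dist)
    print(f"{dist:5.1f} m: total FOV {fov:5.1f} deg, overlap {ov:5.1f} deg")
```

In this toy model the total coverage grows and the binocular overlap shrinks as fixation moves closer, which is the qualitative fixation-distance dependence the abstract describes.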
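For the 360° video eye-tracking abstract above, a minimal sketch of one common way gaze samples can be related to a 360° video: mapping a unit gaze direction to pixel coordinates in an equirectangular frame. The coordinate convention and function name here are assumptions for illustration only; the authors' implementation is built in Unity and may differ.

```python
import math

def gaze_to_equirect(gaze_dir, width, height):
    """Map a unit gaze direction (x right, y up, z forward) to pixel
    coordinates in an equirectangular 360-degree video frame.
    Axis convention is an illustrative assumption."""
    x, y, z = gaze_dir
    lon = math.atan2(x, z)                       # yaw in [-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, y)))      # pitch in [-pi/2, pi/2]
    u = (lon / (2 * math.pi) + 0.5) * width
    v = (0.5 - lat / math.pi) * height
    return u, v

# Example: gaze straight ahead lands at the centre of a 3840 x 1920 frame.
print(gaze_to_equirect((0.0, 0.0, 1.0), 3840, 1920))  # -> (1920.0, 960.0)
```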
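For the markerless AR design abstract above, a minimal NumPy sketch of the kind of image statistics that could serve as texture complexity and edge strength measures: grayscale histogram entropy and mean gradient magnitude. These are illustrative stand-ins, not the metrics used in the paper.

```python
import numpy as np

def edge_strength(gray):
    """Mean gradient magnitude of a grayscale image (H x W, floats in [0, 1])."""
    gy, gx = np.gradient(gray)
    return float(np.mean(np.hypot(gx, gy)))

def texture_complexity(gray, bins=64):
    """Shannon entropy (bits) of the grayscale intensity histogram."""
    hist, _ = np.histogram(gray, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# A random-noise "texture" scores high on both metrics; a flat wall scores near zero.
rng = np.random.default_rng(0)
noisy = rng.random((240, 320))
flat = np.full((240, 320), 0.5)
for name, img in [("noisy", noisy), ("flat", flat)]:
    print(name, round(edge_strength(img), 3), round(texture_complexity(img), 3))
```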