Lighting understanding plays an important role in virtual object composition, including mobile augmented reality (AR) applications. Prior work often targets recovering lighting from the physical environment to support photorealistic AR rendering. Because the common workflow is to use a back-facing camera to capture the physical world for overlaying virtual objects, we refer to this usage pattern as back-facing AR. However, existing methods often fall short in supporting emerging front-facing mobile AR applications, e.g., virtual try-on, where a user leverages a front-facing camera to explore the effect of products of different styles (e.g., glasses or hats). This lack of support can be attributed to the unique challenges of obtaining 360° HDR environment maps, an ideal lighting representation, from the front-facing camera with existing techniques. In this paper, we propose leveraging dual-camera streaming to generate a high-quality environment map by combining multi-view lighting reconstruction and parametric directional lighting estimation. Our preliminary results show improved rendering quality using a dual-camera setup for front-facing AR compared to a commercial solution.
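The abstract's core idea of fusing a multi-view reconstruction with a parametric directional light can be sketched as a toy in Python. This is an illustrative sketch, not the paper's implementation: the equirectangular resolution, the coverage mask, and the cosine-lobe model for the directional light are all assumptions introduced here.

```python
# Illustrative sketch (assumed model, not the paper's method): fuse a
# partially reconstructed HDR environment map with a parametric
# directional light estimate into one equirectangular 360° map.
import numpy as np

H, W = 64, 128  # equirectangular resolution (rows = latitude, cols = longitude)

def directional_lobe(direction, intensity, sharpness=32.0):
    """Rasterize a parametric directional light as a cosine-lobe HDR map."""
    v = np.linspace(np.pi / 2, -np.pi / 2, H)          # latitude per row
    u = np.linspace(-np.pi, np.pi, W, endpoint=False)  # longitude per column
    lat, lon = np.meshgrid(v, u, indexing="ij")
    # Unit view direction for every pixel of the equirectangular map.
    dirs = np.stack([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)], axis=-1)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    cos = np.clip(dirs @ d, 0.0, None)
    return intensity * cos[..., None] ** sharpness     # H x W x 1

def fuse(reconstructed, mask, light_dir, light_intensity):
    """Keep the multi-view reconstruction where it has coverage; fill the
    unobserved pixels with the parametric directional estimate."""
    lobe = np.broadcast_to(directional_lobe(light_dir, light_intensity),
                           reconstructed.shape)
    return np.where(mask[..., None], reconstructed, lobe)

# Hypothetical inputs: one observed hemisphere and its coverage mask.
recon = np.zeros((H, W, 3)); recon[:, : W // 2] = 0.8
mask = np.zeros((H, W), dtype=bool); mask[:, : W // 2] = True
env = fuse(recon, mask, light_dir=[0.0, 1.0, 0.0], light_intensity=5.0)
```

The sketch keeps reconstructed radiance wherever the cameras observed the scene and falls back to the analytic light lobe elsewhere, which is one plausible reading of "combining multi-view lighting reconstruction and parametric directional lighting estimation."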
Designing a Multitasking Interface for Object-aware AR applications
Many researchers and industry professionals believe Augmented Reality (AR) to be the next step in personal computing. However, the idea of an always-on, context-aware AR device presents new and unique challenges to the way users organize multiple streams of information. What does multitasking look like, and when should applications be tied to specific elements in the environment? In this exploratory study, we look at one such element, physical objects, and explore an object-centric approach to multitasking in AR. We developed three prototype applications that operate on a subset of objects in a simulated test environment. We performed a pilot study of our multitasking solution with a novice user, a domain expert, and a system expert to develop insights into the future of AR application design.
- PAR ID: 10332221
- Date Published:
- Journal Name: 2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
- Page Range / eLocation ID: 39 to 40
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Though virtual reality (VR) has matured considerably in recent years, the general public, especially the blind and visually impaired (BVI) population, still cannot enjoy the benefits of VR. Current VR accessibility applications have been developed either on expensive head-mounted displays or with extra accessories and mechanisms, which are either inaccessible or inconvenient for BVI individuals. In this paper, we present a mobile VR app that enables BVI users to access a virtual environment on an iPhone in order to build their skills of perceiving and recognizing the virtual environment and the virtual objects in it. The app uses the iPhone on a selfie stick to simulate a long cane in VR, and applies Augmented Reality (AR) techniques to track the iPhone’s real-time poses in an empty space of the real world, which are then synchronized to the long cane in the VR environment. Because it integrates VR and AR, we call it the Mixed Reality cane (MR Cane), which provides BVI users with auditory and vibrotactile feedback whenever the virtual cane comes in contact with objects in VR. Thus, the MR Cane allows BVI individuals to interact with virtual objects and identify their approximate sizes and locations in the virtual environment. We performed preliminary user studies with blindfolded participants to investigate the effectiveness of the proposed mobile approach, and the results indicate that the MR Cane can effectively help BVI individuals understand interaction with virtual objects and explore 3D virtual environments. The MR Cane concept can be extended to new applications in navigation, training, and entertainment for BVI individuals without significant additional effort.
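The pose-to-feedback loop the abstract describes can be sketched in a few lines of Python. Everything here is a hypothetical stand-in for illustration: the cane length, the obstacle representation, and the feedback strings are assumptions, not the app's actual code.

```python
# Illustrative sketch (assumed names, not the MR Cane app's code): extend
# the AR-tracked phone pose into a virtual cane tip and trigger
# audio/haptic cues when the tip touches a virtual object.
import math
from dataclasses import dataclass

CANE_LENGTH = 1.2  # meters; the selfie stick + phone stand in for a long cane


@dataclass
class Obstacle:
    name: str
    center: tuple       # (x, y, z) in the shared AR/VR frame
    radius: float       # bounding-sphere radius for contact tests


def cane_tip(phone_pos, forward):
    """Extend the tracked phone position along its forward axis."""
    n = math.sqrt(sum(c * c for c in forward))
    return tuple(p + CANE_LENGTH * f / n for p, f in zip(phone_pos, forward))


def contacts(tip, obstacles):
    """Return every virtual object the cane tip is currently touching."""
    return [obs.name for obs in obstacles
            if math.dist(tip, obs.center) <= obs.radius]


def feedback(hits):
    """Stand-in for the app's auditory + vibrotactile cues."""
    return [f"vibrate+announce: {name}" for name in hits]


scene = [Obstacle("bench", (0.0, 0.0, 1.5), 0.4),
         Obstacle("trash can", (1.0, 0.0, 1.0), 0.3)]
tip = cane_tip(phone_pos=(0.0, 1.0, 0.3), forward=(0.0, -0.6, 0.8))
print(feedback(contacts(tip, scene)))
```

On each AR tracking update, a loop like this would recompute the tip and re-run the contact test, firing vibration and speech only on newly touched objects.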
-
Augmented Reality (AR) technology offers the possibility of experiencing virtual images alongside physical objects and provides high-quality hands-on experiences in an engineering lab environment. However, students still need help navigating educational content in AR environments due to mismatches between computer-generated 3D images and actual physical objects. This limitation can significantly influence their learning processes and workload in AR learning. In addition, a lack of student awareness of their own learning process in AR environments can negatively impact their performance improvement. To overcome these challenges, we introduced a virtual instructor in each AR module and asked a metacognitive question to improve students’ metacognitive skills. The results showed that student workload was significantly reduced when a virtual instructor guided students during AR learning. There is also a significant correlation between learning performance and workload when students are overconfident. The outcome of this study provides knowledge to improve AR learning environments in higher-education settings.
-
Augmented reality (AR) is an efficient form of delivering spatial information and has great potential for training workers. However, AR is still not widely used for such scenarios due to the technical skills and expertise required to create interactive AR content. We developed ProcessAR, an AR-based system to develop 2D/3D content that captures subject matter expert’s (SMEs) environment-object interactions in situ. The design space for ProcessAR was identified from formative interviews with AR programming experts and SMEs, alongside a comparative design study with SMEs and novice users. To enable smooth workflows, ProcessAR locates and identifies different tools/objects through computer vision within the workspace when the author looks at them. We explored additional features such as embedding 2D videos with detected objects and user-adaptive triggers. A final user evaluation comparing ProcessAR and a baseline AR authoring environment showed that, according to our qualitative questionnaire, users preferred ProcessAR.
-
Providing students with hands-on construction experiences enables them to apply conceptual knowledge to practical applications, but the high costs associated with this form of learning limit access to it. Therefore, this paper explores the use of augmented reality (AR) to enable students in a conventional classroom or lab setting to interact with virtual objects similar to how they would if they were physically constructing building components. More specifically, the authors tasked student participants with virtually constructing a wood-framed wall through AR with a Microsoft HoloLens. Participants were video-recorded and their behaviors were analyzed. Subsequently, observed behaviors in AR were analyzed and compared to expected behaviors in the physical environment. It was observed that students performing the tasks tended to mimic behaviors found in the physical environment in how they managed the virtual materials, leveraged physical tools in conjunction with virtual materials, and in their ability to recognize and fix mistakes. Some of the finer interactions observed with the virtual materials were found to be unique to the virtual environment, such as moving objects from a distance. Overall, these findings contribute to the understanding of how AR may be leveraged in classrooms to provide learning experiences that yield similar outcomes to those provided in more resource-intensive physical construction site environments.