As visualization makes the leap to mobile and situated settings, where data is increasingly integrated with the physical world using mixed reality, there is a corresponding need for effectively managing the immersed user's view of situated visualizations. In this paper we present an analysis of view management techniques for situated 3D visualizations in handheld augmented reality: a shadowbox, a world‐in‐miniature metaphor, and an interactive tour. We validate these view management solutions through a concrete implementation of all techniques within a situated visualization framework built using a web‐based augmented reality visualization toolkit, and present results from a user study in augmented reality accessed using handheld mobile devices.
- NSF-PAR ID: 10426652
- Publisher / Repository: Wiley-Blackwell
- Journal Name: Computer Graphics Forum
- Volume: 42
- Issue: 3
- ISSN: 0167-7055
- Page Range / eLocation ID: p. 349-360
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Emergency response, navigation, and evacuation are essential to effective rescue and safety management. Situational awareness is a key ingredient when fire responders or emergency response personnel respond to an emergency: they must quickly assess the layout of a building or campus upon entry. The occupants of a building or campus also need situational awareness for navigation and emergency response. We have developed an integrated situational awareness mobile augmented reality (AR) application for smart campus planning, management, and emergency response. Through the visualization of integrated geographic information systems and real-time data analysis, our mobile application provides insight into operational implications and offers information to support effective decision-making. Using existing building features, the authors demonstrate how the mobile AR application provides contextualized 3D visualizations that promote and support spatial knowledge acquisition and cognitive mapping, thereby enhancing situational awareness. A limited user study was conducted to test the effectiveness of the proposed mobile AR application using the Mobile Phone Usability Questionnaire (MPUQ) framework. The results show that the mobile AR application was relatively easy to use and can be considered a useful application for navigation and evacuation.
- Feel the Globe: Enhancing the Perception of Immersive Spherical Visualizations with Tangible Proxies. Recent developments in the commercialization of virtual reality (VR) open up many opportunities for enhancing human interaction with three-dimensional objects and visualizations. Spherical visualizations allow for convenient exploration of certain types of data. Our tangible sphere, exactly aligned with the sphere visualizations shown in VR, implements a very natural way of interaction and utilizes senses and skills trained in the real world. In a lab study, we investigate the perceptual effects of actually holding a virtual spherical visualization in one's hands. As use cases, we focus on surface visualizations that benefit from or require a rounded shape. We compared the use of two differently sized acrylic glass spheres to a related interaction technique that uses VR controllers as proxies. On the one hand, our work is motivated by the ability to create in VR a tangible, lightweight, handheld spherical display that can hardly be realized in reality. On the other hand, gaining insights into the impact of a fully tangible embodiment of a virtual object on task performance, comprehension of patterns, and user behavior is important in its own right. After describing the implementation, we discuss the advantages and disadvantages of our approach, taking into account different handheld spherical displays utilizing outside and inside projection.
- This poster presents the use of augmented reality (AR) and virtual reality (VR) to tackle four of the "14 Grand Challenges for Engineering in the 21st Century" identified by the National Academy of Engineering. AR and VR are technologies of the present and the future. AR creates a composite view by adding digital content to a real-world view, often using the camera of a smartphone, while VR creates an immersive view in which the user's view is cut off from the real world. The 14 challenges identify areas of science and technology that are achievable and sustainable to help people and the planet prosper. The four challenges tackled using AR/VR applications in this poster are: enhance virtual reality, advance personalized learning, provide access to clean water, and make solar energy affordable. The solar system VR application targets two of the challenges: (1) enhance virtual reality and (2) advance personalized learning. It helps the user visualize and understand our solar system through a VR headset, providing an immersive 360-degree view in which the user can use controllers to interact with information about celestial bodies and teleport to different points in space for a closer look at the planets and the Sun, with six degrees of freedom. The AR water application addresses the challenge "provide access to clean water": it shows information on drinking-water accessibility and the eco-friendly use of bottles over plastic cups within the department buildings at Auburn University. The user sees an augmented view of drinking-water information on a smartphone; whenever the camera is pointed at a building, the application renders a composite view with the drinking-water information associated with that building. The Sun-path visualization AR application addresses the challenge "make solar energy affordable". It helps the user visualize the Sun's path at a selected time and location: the path is augmented in the device's camera view when the user points the camera at the sky, and the application reports solar altitude and azimuth as well as sunrise and sunset times for a selected day. This information can aid effective solar panel placement. Using AR and VR to tackle these challenges enhances the user experience: the information is well curated and easily visualized, and thus readily understandable by the end user. The use of AR and VR for these kinds of engineering challenges therefore looks promising.
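The sun-path abstract above reports solar altitude and azimuth for a chosen time and location but does not describe its implementation. As an illustrative sketch only, both angles can be computed from latitude, longitude, and UTC time with the standard low-precision solar ephemeris (the Astronomical Almanac approximation, accurate to roughly half a degree), which is plausibly what such an app would use:

```python
import math
from datetime import datetime, timezone

def solar_position(lat_deg, lon_deg, when_utc):
    """Approximate solar altitude and azimuth in degrees (azimuth measured
    clockwise from north), using a low-precision ephemeris (~0.5 deg)."""
    # Days since the J2000.0 epoch (2000-01-01 12:00 UTC)
    epoch = datetime(2000, 1, 1, 12, tzinfo=timezone.utc)
    d = (when_utc - epoch).total_seconds() / 86400.0

    # Sun's mean anomaly g and mean longitude q, then ecliptic longitude
    g = math.radians((357.529 + 0.98560028 * d) % 360.0)
    q = (280.459 + 0.98564736 * d) % 360.0
    lam = math.radians(q + 1.915 * math.sin(g) + 0.020 * math.sin(2 * g))
    eps = math.radians(23.439 - 3.6e-7 * d)  # obliquity of the ecliptic

    # Equatorial coordinates: right ascension and declination
    ra = math.atan2(math.cos(eps) * math.sin(lam), math.cos(lam))
    dec = math.asin(math.sin(eps) * math.sin(lam))

    # Greenwich mean sidereal time (hours) -> local hour angle (radians)
    gmst = (18.697374558 + 24.06570982441908 * d) % 24.0
    ha = math.radians((gmst + lon_deg / 15.0) * 15.0) - ra

    lat = math.radians(lat_deg)
    sin_alt = (math.sin(lat) * math.sin(dec)
               + math.cos(lat) * math.cos(dec) * math.cos(ha))
    alt = math.asin(max(-1.0, min(1.0, sin_alt)))

    cos_az = ((math.sin(dec) - math.sin(alt) * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if math.sin(ha) > 0:  # sun west of the local meridian
        az = 2.0 * math.pi - az
    return math.degrees(alt), math.degrees(az)

# Near the March 2024 equinox at the equator, the noon Sun is nearly overhead
alt, az = solar_position(0.0, 0.0,
                         datetime(2024, 3, 20, 12, 0, tzinfo=timezone.utc))
```

Sunrise and sunset times (which the app also reports) can then be found by searching for the times at which the altitude crosses zero.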
- We investigated how families experienced immersion as they collaboratively made sense of geologic time and geoscience processes during a place-based, learning-on-the-move (LOTM) experience mediated by a mobile augmented reality (MAR) app. Our team developed an MAR app, Time Explorers, that focused on how rock-water interactions shaped Appalachia over millions of years. Data were collected at the Children's Garden at the Arboretum at Penn State. Data sources were videos of app usage, point-of-view camera recordings with audio capturing family conversations, and interviews with 17 families (51 people). The analytical technique was interaction analysis, in which episodes of family sense-making were identified and developed into qualitative vignettes focused on how immersion did or did not support learning about geoscience and geologic time. We analyzed how design elements supported sensory, actional, narrative, and social immersion through photo-taking, discussion prompts, and augmented reality visualizations. Findings showed that sensory and social immersion supported sense-making conversations and observational inquiry, while narrative and actional immersion supported deep family engagement with the geoscience content. At many micro-sites of learning, families engaged in multiple immersive processes in which conversations, observational inquiry, and deep engagement with the geoscience came together during LOTM. This analysis contributes to the CSCL literature on theory related to LOTM in outdoor informal settings, while also providing design conjectures within an immersive, family-centered, place-based LOTM framework.
- Lighting understanding plays an important role in virtual object composition, including in mobile augmented reality (AR) applications. Prior work often targets recovering lighting from the physical environment to support photorealistic AR rendering. Because the common workflow is to use a back-facing camera to capture the physical world for overlaying virtual objects, we refer to this usage pattern as back-facing AR. However, existing methods often fall short in supporting emerging front-facing mobile AR applications, e.g., virtual try-on, where a user leverages a front-facing camera to explore the effect of various products (e.g., glasses or hats) of different styles. This lack of support can be attributed to the unique challenges of obtaining a 360° HDR environment map, an ideal lighting representation, from the front-facing camera with existing techniques. In this paper, we propose to leverage dual-camera streaming to generate a high-quality environment map by combining multi-view lighting reconstruction and parametric directional lighting estimation. Our preliminary results show improved rendering quality using a dual-camera setup for front-facing AR compared to a commercial solution.
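The dual-camera abstract above fuses multi-view lighting reconstruction with a parametric directional-light estimate into a 360° environment map, but its pipeline is not detailed here. As a sketch of the shared representation only, the hypothetical helpers below map a unit light direction to equirectangular environment-map pixel coordinates, which is the kind of conversion needed to write a parametric directional estimate into (or compare it against) a reconstructed map:

```python
import math

def dir_to_equirect(d, width, height):
    """Map a 3D direction (x right, y up, z forward) to pixel coordinates
    in an equirectangular 360-degree environment map."""
    x, y, z = d
    n = math.sqrt(x * x + y * y + z * z)  # normalize to a unit vector
    x, y, z = x / n, y / n, z / n
    u = 0.5 + math.atan2(x, z) / (2.0 * math.pi)     # longitude -> [0, 1)
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi  # colatitude -> [0, 1]
    return u * (width - 1), v * (height - 1)

def splat_directional_light(env_map, direction, rgb):
    """Write a parametric directional-light estimate into an HDR environment
    map stored as a height x width x 3 nested list. Illustrative only: a
    real pipeline would splat a smooth lobe, not a single pixel."""
    h, w = len(env_map), len(env_map[0])
    px, py = dir_to_equirect(direction, w, h)
    env_map[int(round(py))][int(round(px))] = list(rgb)
    return env_map
```

With this convention the straight-up direction (0, 1, 0) lands on the top row of the map, and the forward direction (0, 0, 1) lands at the map's center.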