
Title: The Rise of Allocentric Interfaces and the Collapse of the Virtuality Continuum
The popular concepts of Virtual Reality (VR) and Augmented Reality (AR) arose from our ability to interact with objects and environments that appear to be real, but are not. One of the most powerful aspects of these paradigms is the ability of virtual entities to embody a richness of behavior and appearance that we perceive as compatible with reality, and yet unconstrained by reality. The freedom to be or do almost anything helps to reinforce the notion that such virtual entities are inherently distinct from the real world—as if they were magical. This independent magical status is reinforced by the typical need for the use of “magic glasses” (head-worn displays) and “magic wands” (spatial interaction devices) that are ceremoniously bestowed on a chosen few. For those individuals, the experience is inherently egocentric in nature—the sights and sounds effectively emanate from the magic glasses, not the real world, and unlike the magic we are accustomed to from cinema, the virtual entities are unable to affect the real world. This separation of real and virtual is also inherent in our related conceptual frameworks, such as Milgram’s Virtuality Continuum, where the real and virtual are explicitly distinguished and mixed. While these frameworks are indeed conceptual, we often feel the need to position our systems and research somewhere in the continuum, further reinforcing the notion that real and virtual are distinct. The very structures of our professional societies, our research communities, our journals, and our conferences tend to solidify the evolutionary separation of the virtual from the real. However, independent forces are emerging that could reshape our notions of what is real and virtual, and transform our sense of what it means to interact with technology. First, even within the VR/AR communities, as the appearance and behavioral realism of virtual entities improves, virtual experiences will become more real.
Second, as domains such as artificial intelligence, robotics, and the Internet of Things (IoT) mature and permeate throughout our lives, experiences with real things will become more virtual. The convergence of these various domains has the potential to transform the egocentric magical nature of VR/AR into more pervasive allocentric magical experiences and interfaces that interact with and can affect the real world. This transformation will blur traditional technological boundaries such that experiences will no longer be distinguished as real or virtual, and our sense for what is natural will evolve to include what we once remember as cinematic magic.
Journal Name: Symposium on Spatial User Interaction
Page Range or eLocation-ID: 192 to 192
Sponsoring Org: National Science Foundation
More Like this
  1. In a seminal article on augmented reality (AR) [7], Ron Azuma defines AR as a variation of virtual reality (VR), which completely immerses a user inside a synthetic environment. Azuma says “In contrast, AR allows the user to see the real world, with virtual objects superimposed upon or composited with the real world” [7] (emphasis added). Typically, a user wears a tracked stereoscopic head-mounted display (HMD) or holds a smartphone, showing the real world through optical or video means, with superimposed graphics that provide the appearance of virtual content that is related to and registered with the real world. While AR has been around since the 1960s [72], it is experiencing a renaissance of development and consumer interest. With exciting products from Microsoft (HoloLens), Metavision (Meta 2), and others; Apple’s AR Developer’s Kit (ARKit); and well-funded startups like Magic Leap [54], the future looks even brighter: AR technologies are expected to be absorbed into our daily lives and to strongly influence our society in the foreseeable future.
  2. This poster presents the use of Augmented Reality (AR) and Virtual Reality (VR) to tackle 4 of the “14 Grand Challenges for Engineering in the 21st Century” identified by the National Academy of Engineering. AR and VR are technologies of the present and the future. AR creates a composite view by adding digital content to a real-world view, often using a smartphone camera, while VR creates an immersive view in which the user is often cut off from the real world. The 14 challenges identify areas of science and technology that are achievable and sustainable, to assist people and the planet to prosper. The 4 challenges tackled by the AR/VR applications in this poster are: Enhance virtual reality, Advance personalized learning, Provide access to clean water, and Make solar energy affordable. The solar system VR application targets two of the engineering challenges: (1) Enhance virtual reality and (2) Advance personalized learning. It helps the user visualize and understand our solar system through a VR headset, providing an immersive 360-degree view of the solar system in which the user can use controllers to interact with information about celestial bodies and teleport to different points in space for a closer look at the planets and the Sun. The user has six degrees of freedom. The AR water application tackles the engineering challenge “Provide access to clean water”. It shows information on drinking-water accessibility and the eco-friendly use of bottles over plastic cups within the department buildings at Auburn University. The user of the application sees an augmented view of drinking-water information on a smartphone: every time the user points the smartphone camera towards a building, the application renders a composite view with the drinking-water information associated with that building.
The Sun path visualization AR application tackles the engineering challenge “Make solar energy affordable”. The application helps the user visualize the sun path at a selected time and location: when the user points the device camera towards the sky, the sun path is augmented in the camera view. The application provides information on sun altitude and azimuth, as well as sunrise and sunset data for a selected day. This information can aid the user with effective solar panel placement. Using AR and VR technology to tackle these challenges enhances the user experience: the information from these applications is better curated and more easily visualized, and thus readily understandable by the end user. Therefore, using AR and VR technology to tackle these types of engineering challenges looks promising.
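The altitude and azimuth values such an app displays can be computed from standard approximations. The sketch below is a rough illustration (the function name, the Cooper-style declination formula, and roughly one-degree accuracy are assumptions on our part, not the poster's actual implementation), deriving sun position from declination and hour angle:

```python
from math import sin, cos, asin, acos, radians, degrees, pi

def solar_position(lat_deg, day_of_year, solar_hour):
    """Approximate solar altitude and azimuth in degrees.

    lat_deg: observer latitude; day_of_year: 1-365;
    solar_hour: local solar time in hours (12.0 = solar noon).
    Azimuth is measured clockwise from north (undefined at the zenith).
    """
    # Solar declination (common textbook approximation).
    decl = radians(-23.44 * cos(radians(360.0 / 365.0 * (day_of_year + 10))))
    lat = radians(lat_deg)
    hour_angle = radians(15.0 * (solar_hour - 12.0))  # 15 degrees per hour

    # Altitude from the standard spherical-astronomy relation.
    sin_alt = sin(lat) * sin(decl) + cos(lat) * cos(decl) * cos(hour_angle)
    alt = asin(max(-1.0, min(1.0, sin_alt)))

    # Azimuth; clamp guards against rounding just outside [-1, 1].
    cos_az = (sin(decl) - sin(alt) * sin(lat)) / (cos(alt) * cos(lat))
    az = acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:          # afternoon: mirror to the western half
        az = 2 * pi - az
    return degrees(alt), degrees(az)
```

For example, at 40° N latitude at solar noon near the March equinox (day 80), this returns an altitude of about 49.5° with the sun due south (azimuth about 180°).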
  3. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.
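The decoding pipeline described in that abstract can be illustrated in miniature. The sketch below uses hypothetical data shapes, and a linear ridge regressor stands in for the paper's neural network; it maps windowed forearm-EMG signals to per-channel RMS amplitude features and regresses finger-wise forces from them:

```python
import numpy as np

def rms_features(emg_windows):
    """emg_windows: (n_windows, n_channels, window_len) raw EMG samples.
    Returns (n_windows, n_channels) root-mean-square amplitude features,
    a common EMG activation measure."""
    return np.sqrt(np.mean(emg_windows ** 2, axis=2))

def fit_force_decoder(features, forces, reg=1e-3):
    """Ridge regression from EMG features to finger-wise forces.
    features: (n, n_channels); forces: (n, n_fingers).
    Returns weights including a bias row."""
    X = np.hstack([features, np.ones((len(features), 1))])  # append bias
    W = np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]), X.T @ forces)
    return W

def decode_forces(features, W):
    """Predict finger-wise forces for new feature windows."""
    X = np.hstack([features, np.ones((len(features), 1))])
    return X @ W
```

In the actual system a neural network in this position captures nonlinear muscle-to-force mappings; the linear stand-in only shows the feature-extraction-then-regression structure.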
  4. A solid understanding of electromagnetic (E&M) theory is key to the education of electrical engineering students. However, these concepts are notoriously challenging for students to learn, due to the difficulty of grasping abstractions such as the electric force as an invisible force acting at a distance, or how electromagnetic radiation permeates and propagates in space. Building the physical intuition to manipulate these abstractions requires means to visualize them in three-dimensional space. This project involves the development of 3D visualizations of abstract E&M concepts in Virtual Reality (VR), in an immersive, exploratory, and engaging environment. VR provides the means of exploration, to construct visuals and manipulable objects that represent knowledge. This leads to a constructivist way of learning, in the sense that students build their own knowledge from meaningful experiences. In addition, the VR labs avoid the cost of hands-on labs by recreating the experiments and experiences on Virtual Reality platforms. The development of the VR labs for E&M courses involves four distinct phases: (I) Lab Design, (II) Experience Design, (III) Software Development, and (IV) User Testing. During phase I, the learning goals and possible outcomes are clearly defined, to provide context for the VR laboratory experience and to identify possible technical constraints pertaining to the specific laboratory exercise. During phase II, the environment (the world) the player (user) will experience is designed, along with foundational elements such as ways of navigation, key actions, and immersion elements. During phase III, the software is developed as part of the course projects for the Virtual Reality course taught in the Computer Science Department at the same university, or as part of independent research projects involving engineering students.
This reflects the strong educational impact of this project, as it allows students to contribute to the educational experiences of their peers. During phase IV, the VR experiences are played by different types of audiences that fit the player type. The team collects feedback and, if needed, implements changes. The pilot VR Lab, introduced as an additional instructional tool for the E&M course during Fall 2019, engaged over 100 students in the program; in addition to the regular lectures, students attended one hour per week in the E&M VR lab. Student competencies in conceptual understanding of electromagnetism topics are measured via formative and summative assessments. To evaluate the effectiveness of VR learning, each lab is followed by a 10-minute multiple-choice test designed to measure conceptual understanding of the various topics, rather than the ability to simply manipulate equations. This paper discusses the implementation and pedagogy of the Virtual Reality laboratory experiences for visualizing concepts in E&M, with examples of specific labs, as well as challenges and student feedback on the new approach. We also discuss the integration of the 3D visualizations into lab exercises, and the design of the student assessment tools used to assess knowledge gain when the VR technology is employed.
  5. Though virtual reality (VR) has reached a certain level of maturity in recent years, the general public, especially the population of the blind and visually impaired (BVI), still cannot enjoy the benefits provided by VR. Current VR accessibility applications have been developed either on expensive head-mounted displays or with extra accessories and mechanisms, which are either inaccessible or inconvenient for BVI individuals. In this paper, we present a mobile VR app that enables BVI users to access a virtual environment on an iPhone, in order to build their skills of perception and recognition of the virtual environment and the virtual objects in it. The app uses an iPhone on a selfie stick to simulate a long cane in VR, and applies Augmented Reality (AR) techniques to track the iPhone’s real-time poses in an empty space of the real world, which are then synchronized to the long cane in the VR environment. Due to this use of mixed reality (the integration of VR and AR), we call it the Mixed Reality Cane (MR Cane); it provides BVI users auditory and vibrotactile feedback whenever the virtual cane comes in contact with objects in VR. Thus, the MR Cane allows BVI individuals to interact with virtual objects and identify the approximate sizes and locations of objects in the virtual environment. We performed preliminary user studies with blindfolded participants to investigate the effectiveness of the proposed mobile approach, and the results indicate that the MR Cane could effectively help BVI individuals understand interaction with virtual objects and explore 3D virtual environments. The MR Cane concept can be extended to new applications in navigation, training, and entertainment for BVI individuals without significant additional effort.
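The core loop that abstract describes, extending the tracked phone pose into a virtual cane tip and triggering feedback on contact, can be sketched as follows. This is a simplified illustration with hypothetical names and sphere-shaped obstacles, not the authors' implementation:

```python
import math
from dataclasses import dataclass

@dataclass
class Sphere:
    """A virtual obstacle, approximated as a sphere for contact tests."""
    x: float
    y: float
    z: float
    radius: float
    label: str

def cane_tip(phone_pos, phone_dir, cane_length=1.2):
    """Extend the AR-tracked phone position along its unit pointing
    direction to get the virtual cane tip, mimicking a long cane held
    on a selfie stick."""
    return tuple(p + cane_length * d for p, d in zip(phone_pos, phone_dir))

def contacts(tip, objects):
    """Return labels of virtual objects the cane tip is touching.
    In the app, each hit would trigger auditory and vibrotactile
    feedback identifying the object."""
    hits = []
    for obj in objects:
        if math.dist(tip, (obj.x, obj.y, obj.z)) <= obj.radius:
            hits.append(obj.label)
    return hits
```

Each frame, the AR tracker supplies a fresh pose; re-running `cane_tip` and `contacts` per frame yields the synchronized cane-sweeping behavior the paper evaluates.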