Title: Sphere in Hand: Exploring Tangible Interaction with Immersive Spherical Visualizations
The emerging possibilities of data analysis and exploration in virtual reality raise the question of how users can best be supported during such interactions. Spherical visualizations allow for convenient exploration of certain types of data. Our tangible sphere, exactly aligned with the sphere visualizations shown in VR, implements a very natural way of interaction and utilizes senses and skills trained in the real world. This work is motivated by the prospect of creating in VR a low-cost, tangible, robust, handheld spherical display that would be difficult or impossible to implement as a physical display. Our concept also allows us to gain insights into the impact of a fully tangible embodiment of a virtual object on task performance, comprehension of patterns, and user behavior. After a description of the implementation, we discuss the advantages and disadvantages of our approach, taking into account different handheld spherical displays utilizing outside and inside projection.
Award ID(s): 1748392
NSF-PAR ID: 10146469
Author(s) / Creator(s): ; ; ;
Journal Name: IEEE Virtual Reality 2019
Page Range / eLocation ID: 912 to 913
Format(s): Medium: X
Sponsoring Org: National Science Foundation

More Like this
  1. Recent developments in the commercialization of virtual reality open up many opportunities for enhancing human interaction with three-dimensional objects and visualizations. Spherical visualizations allow for convenient exploration of certain types of data. Our tangible sphere, exactly aligned with the sphere visualizations shown in VR, implements a very natural way of interaction and utilizes senses and skills trained in the real world. In a lab study, we investigate the effects of the perception of actually holding a virtual spherical visualization in one's hands. As use cases, we focus on surface visualizations that benefit from or require a rounded shape. We compare the usage of two differently sized acrylic glass spheres to a related interaction technique that utilizes VR controllers as proxies. On the one hand, our work is motivated by the ability to create in VR a tangible, lightweight, handheld spherical display that can hardly be realized as a physical device. On the other hand, gaining insights into the impact of a fully tangible embodiment of a virtual object on task performance, comprehension of patterns, and user behavior is important in its own right. After a description of the implementation, we discuss the advantages and disadvantages of our approach, taking into account different handheld spherical displays utilizing outside and inside projection.
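The core mechanism in the abstracts above — keeping a virtual sphere exactly aligned with a tracked physical one — amounts to copying the tracker pose into the rendered model's transform every frame. Here is a minimal sketch in Python/NumPy; the function names and the (w, x, y, z) quaternion convention are illustrative assumptions, not code from the paper:

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def sphere_model_matrix(position, orientation, radius):
    """Build a 4x4 model matrix that places the virtual sphere at the
    tracked pose of the physical sphere, scaled to its radius."""
    m = np.eye(4)
    m[:3, :3] = quat_to_matrix(orientation) * radius  # rotation + uniform scale
    m[:3, 3] = position                               # translation
    return m

# Each frame: read the tracker pose and update the rendered sphere.
pose_pos = np.array([0.1, 1.2, -0.4])      # metres, from the tracking system
pose_rot = np.array([1.0, 0.0, 0.0, 0.0])  # identity quaternion (w, x, y, z)
model = sphere_model_matrix(pose_pos, pose_rot, radius=0.15)
```

In a real setup the pose would come from the headset's tracking API each frame, and a fixed calibration offset between the tracker mount and the sphere's center would be composed into this matrix before rendering.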
  2. In this paper, we explore how a familiarly shaped object can serve as a physical proxy for manipulating virtual objects in Augmented Reality (AR) environments. Using the example of a tangible, handheld sphere, we demonstrate how irregularly shaped virtual objects can be selected, transformed, and released. After a brief description of the implementation of the tangible proxy, we present a buttonless interaction technique suited to the characteristics of the sphere. In a user study (N = 30), we compare our approach with three controller-based methods that increasingly rely on physical buttons. As a use case, we focus on an alignment task that had to be completed in mid-air as well as on a flat surface. Results show that our concept has advantages over two of the controller-based methods regarding task completion time and user ratings. Our findings inform research on integrating tangible interaction into AR experiences.
  3. Technological advancements and increased access have prompted the adoption of head-mounted display based virtual reality (VR) for neuroscientific research, manual skill training, and neurological rehabilitation. Applications that focus on manual interaction within the virtual environment (VE), especially haptic-free VR, critically depend on virtual hand-object collision detection. Knowledge about how multisensory integration related to hand-object collisions affects perception-action dynamics and reach-to-grasp coordination is needed to enhance the immersiveness of interactive VR. Here, we explored whether and to what extent sensory substitution for haptic feedback of hand-object collision (visual, audio, or audiovisual) and collider size (size of spherical pointers representing the fingertips) influences reach-to-grasp kinematics. In Study 1, visual, auditory, or combined feedback was compared as a sensory substitute to indicate the successful grasp of a virtual object during reach-to-grasp actions. In Study 2, participants reached to grasp virtual objects using spherical colliders of different diameters to test if virtual collider size impacts reach-to-grasp. Our data indicate that collider size but not sensory feedback modality significantly affected the kinematics of grasping. Larger colliders led to a smaller size-normalized peak aperture. We discuss this finding in the context of a possible influence of spherical collider size on the perception of the virtual object's size and hence effects on motor planning of reach-to-grasp. Critically, reach-to-grasp spatiotemporal coordination patterns were robust to manipulations of sensory feedback modality and spherical collider size, suggesting that the nervous system adjusted the reach (transport) component commensurately to the changes in the grasp (aperture) component. These results have important implications for research, commercial, industrial, and clinical applications of VR.
  4. The goal of this research is to provide much-needed empirical data on how the fidelity of popular hand-gesture-tracking-based pointing metaphors, compared with commodity controller-based input, affects efficiency and the speed-accuracy tradeoff in users' spatial selection during personal-space interactions in VR. We conducted two experiments in which participants selected spherical targets arranged in a circle in personal space, i.e., in the near field within their maximum arm's reach, in VR. Both experiments required participants to select the targets either with a VR controller or with their dominant hand's index finger, which was tracked with one of two popular contemporary tracking methods. In the first experiment, the targets were arranged in a flat circle in accordance with the ISO 9241-9 Fitts' law standard, and the simulation selected random combinations of 3 target amplitudes and 3 target widths. Targets were centered around the users' eye level, and the arrangement was placed at the 60%, 75%, or 90% depth plane of the users' maximum arm's reach. In experiment 2, the targets varied randomly in depth from one plane to another within the same configuration of 13 targets in a trial set, resembling button selection in hierarchical menus at differing depth planes in the near field. The study was conducted using the HTC Vive head-mounted display with three input conditions: a VR controller (HTC Vive), low-fidelity virtual pointing (Leap Motion), or high-fidelity virtual pointing (a tracked VR glove). Our results revealed that low-fidelity pointing performed worse than both high-fidelity pointing and the VR controller. Overall, target selection performance was worse in depth planes closer to the maximum arm's reach than at middle and nearer distances.
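Abstract 4 evaluates selection with the ISO 9241-9 Fitts' law methodology, where the manipulated quantities (target amplitude A and width W) combine into an index of difficulty, and performance is summarized as throughput. A small sketch of those two standard formulas follows; the Shannon formulation is the one ISO 9241-9 uses, but the specific amplitude, width, and timing values below are invented for illustration:

```python
import math

def index_of_difficulty(amplitude, width):
    """Shannon formulation of Fitts' index of difficulty in bits,
    as used by the ISO 9241-9 pointing-evaluation standard."""
    return math.log2(amplitude / width + 1)

def throughput(amplitude, width, movement_time):
    """Throughput in bits/s for one amplitude/width condition."""
    return index_of_difficulty(amplitude, width) / movement_time

# Hypothetical condition: 30 cm amplitude, 2 cm targets, 0.9 s mean selection time.
id_bits = index_of_difficulty(0.30, 0.02)  # log2(16) = 4.0 bits
tp = throughput(0.30, 0.02, 0.9)           # bits per second
```

In a full ISO 9241-9 analysis the effective width (derived from the spread of selection endpoints) would replace the nominal width, and throughput would be averaged across the amplitude/width conditions per participant.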
  5. A solid understanding of electromagnetic (E&M) theory is key to the education of electrical engineering students. However, these concepts are notoriously challenging for students to learn, owing to the difficulty of grasping abstractions such as the electric force acting invisibly at a distance, or electromagnetic radiation permeating and propagating through space. Building the physical intuition needed to manipulate these abstractions requires means to visualize them in three-dimensional space. This project develops 3D visualizations of abstract E&M concepts in Virtual Reality (VR), in an immersive, exploratory, and engaging environment. VR provides the means of exploration, allowing visuals and manipulable objects to be constructed to represent knowledge. This leads to a constructivist way of learning, in the sense that students build their own knowledge from meaningful experiences. In addition, the VR labs offset the cost of hands-on labs by recreating the experiments and experiences on virtual reality platforms. The development of the VR labs for E&M courses involves four distinct phases: (I) Lab Design, (II) Experience Design, (III) Software Development, and (IV) User Testing. During phase I, the learning goals and possible outcomes are clearly defined to provide context for the VR laboratory experience and to identify technical constraints pertaining to the specific laboratory exercise. During phase II, the environment (the world) the player (user) will experience is designed, along with foundational elements such as navigation, key actions, and immersion. During phase III, the software is developed, either as part of the course projects for the Virtual Reality course taught in the Computer Science Department at the same university or as part of independent research projects involving engineering students. This reflects the strong educational impact of the project, as it allows students to contribute to the educational experiences of their peers. During phase IV, the VR experiences are played by audiences that fit the player type; the team collects feedback and, if needed, implements changes. The pilot VR lab, introduced as an additional instructional tool for the E&M course in Fall 2019, engaged over 100 students, who attended one hour per week in the E&M VR lab in addition to the regular lectures. Student competencies in conceptual understanding of electromagnetism topics are measured via formative and summative assessments. To evaluate the effectiveness of VR learning, each lab is followed by a 10-minute multiple-choice test designed to measure conceptual understanding of the various topics rather than the ability to simply manipulate equations. This paper discusses the implementation and pedagogy of the VR laboratory experiences for visualizing concepts in E&M, with examples for specific labs, as well as challenges and student feedback on the new approach. We also discuss the integration of the 3D visualizations into lab exercises and the design of the student assessment tools used to assess knowledge gain when the VR technology is employed.