- Award ID(s):
- 1652210
- PAR ID:
- 10394435
- Date Published:
- Journal Name:
- ACM Transactions on Graphics
- ISSN:
- 0730-0301
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
-
Feel the Globe: Enhancing the Perception of Immersive Spherical Visualizations with Tangible Proxies

Recent developments in the commercialization of virtual reality open up many opportunities for enhancing human interaction with three-dimensional objects and visualizations. Spherical visualizations allow for convenient exploration of certain types of data. Our tangible sphere, exactly aligned with the sphere visualizations shown in VR, implements a very natural way of interaction and utilizes senses and skills trained in the real world. In a lab study, we investigate the effects of the perception of actually holding a virtual spherical visualization in one's hands. As use cases, we focus on surface visualizations that benefit from or require a rounded shape. We compared the usage of two differently sized acrylic glass spheres to a related interaction technique that utilizes VR controllers as proxies. On the one hand, our work is motivated by the ability to create in VR a tangible, lightweight, handheld spherical display that can hardly be realized in reality. On the other hand, gaining insights about the impact of a fully tangible embodiment of a virtual object on task performance, comprehension of patterns, and user behavior is important in its own right. After a description of the implementation, we discuss the advantages and disadvantages of our approach, taking into account different handheld spherical displays utilizing outside and inside projection.
-
In virtual reality (VR), established perception–action relationships break down because of conflicting and ambiguous sensorimotor inputs, inducing walking-velocity underestimations. Here, we explore the effects of realigning perceptual sensory experiences with physical movements via augmented feedback on the estimation of virtual speed. We hypothesized that providing feedback about speed would lead to concurrent perceptual improvements and that these alterations would persist once the speedometer was removed. Ten young adults used immersive VR to view a virtual hallway translating at a series of fixed speeds. Participants were tasked with matching their walking speed on a self-paced treadmill to the optic flow in the environment. Information regarding walking-speed accuracy was provided during augmented feedback trials via a real-time speedometer. We measured the resulting walking-velocity errors, as well as kinematic gait parameters. We found that the concordance between the virtual-environment and gait speeds was higher when augmented feedback was provided during the trial. Furthermore, we observed retention effects beyond the intervention period, demonstrated by smaller errors in speed-perception accuracy and stronger concordance between perceived and actual speeds. Together, these results highlight a potential role for augmented feedback in guiding gait strategies that deviate from predefined internal models of locomotion.
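The speed-matching task above can be quantified in a straightforward way: compare each walked speed against the optic-flow speed it was meant to match. The sketch below is illustrative only, with hypothetical speed values; it is not the authors' analysis code, and the error metric shown (signed error and mean absolute error) is a common-sense stand-in for whatever measures the study actually used.

```python
# Minimal sketch (not the study's code): quantifying how closely
# self-paced treadmill speeds match the optic-flow speeds shown in VR.
# All speed values below are hypothetical examples, not study data.

def speed_errors(optic_flow_speeds, walking_speeds):
    """Signed error (m/s): walked speed minus virtual speed.

    Negative values indicate underestimation of virtual speed.
    """
    return [w - v for v, w in zip(optic_flow_speeds, walking_speeds)]

def mean_absolute_error(optic_flow_speeds, walking_speeds):
    """Average magnitude of the speed-matching error."""
    errors = speed_errors(optic_flow_speeds, walking_speeds)
    return sum(abs(e) for e in errors) / len(errors)

virtual = [0.8, 1.0, 1.2, 1.4]   # fixed hallway translation speeds (m/s)
walked  = [0.7, 0.9, 1.1, 1.2]   # participant's matched speeds (m/s)

# Consistent underestimation shows up as negative signed errors.
print(round(mean_absolute_error(virtual, walked), 3))  # 0.125
```

A per-trial version of such an error is also what a real-time speedometer would need to display to close the feedback loop.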
-
While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR.
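The abstract describes decoding finger-wise forces from forearm EMG with a neural network. The sketch below shows only the generic shape of such a pipeline: a standard root-mean-square feature over signal windows, followed by a stand-in linear read-out. The window size, weights, and sample values are all hypothetical, and the linear read-out is an assumption for illustration; the paper itself uses a neural-network model.

```python
# Illustrative sketch only: generic EMG-to-force pipeline shape.
# RMS windowing is a standard EMG feature; the linear read-out below
# is a hypothetical stand-in for the paper's neural-network decoder.

import math

def rms(window):
    """Root-mean-square amplitude of one EMG window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def sliding_rms(signal, window_size=4):
    """RMS features over non-overlapping windows of the raw signal."""
    return [rms(signal[i:i + window_size])
            for i in range(0, len(signal) - window_size + 1, window_size)]

def decode_force(features, weights, bias=0.0):
    """Stand-in linear read-out (hypothetical; the paper uses a NN)."""
    return sum(f * w for f, w in zip(features, weights)) + bias

emg = [0.0, 0.5, -0.5, 0.0, 1.0, -1.0, 1.0, -1.0]  # fake raw samples
features = sliding_rms(emg)   # one RMS value per 4-sample window
force_estimate = decode_force(features, weights=[0.5, 0.25])
```

In practice, the feature vector would feed a trained model per finger, with a short per-user calibration pass as the abstract mentions.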
-
Computer-mediated collaboration has long been a core research interest in CSCW and HCI. As online social spaces continue to evolve towards more immersive and higher-fidelity experiences, more research is still needed to investigate how emerging novel technology may foster and support new and more nuanced forms and experiences of collaboration in virtual environments. Using 30 interviews, this paper focuses on what people may collaborate on and how they collaborate in social Virtual Reality (VR). We broaden current studies on computer-mediated collaboration by highlighting the importance of embodiment for co-presence and communication, replicating offline collaborative activities, and supporting the seamless interplay of work, play, and mundane experiences in everyday lives for experiencing and conceptualizing collaboration in emerging virtual environments. We also propose potential design implications that could further support everyday collaborative activities in social VR.
-
The goal of this research is to provide much-needed empirical data on how the fidelity of popular hand-gesture-tracked pointing metaphors, versus commodity controller-based input, affects the efficiency and speed–accuracy tradeoff of users' spatial selection in personal-space interactions in VR. We conduct two experiments in which participants select spherical targets arranged in a circle in personal space, or near-field within their maximum arm's-reach distance, in VR. Both experiments required participants to select the targets with either a VR controller or with their dominant hand's index finger, which was tracked with one of two popular contemporary tracking methods. In the first experiment, the targets were arranged in a flat circle in accordance with the ISO 9241-9 Fitts' law standard, and the simulation selected random combinations of 3 target amplitudes and 3 target widths. Targets were centered at the users' eye level, and the arrangement was placed at either the 60%, 75%, or 90% depth plane of the users' maximum arm's reach. In experiment 2, the targets varied in depth randomly from one depth plane to another within the same configuration of 13 targets within a trial set, which resembled a button-selection task in hierarchical menus at differing depth planes in the near-field. The study was conducted using the HTC Vive head-mounted display, with three conditions: a VR controller (HTC Vive), low-fidelity virtual pointing (Leap Motion), or high-fidelity virtual pointing (a tracked VR glove). Our results revealed that low-fidelity pointing performed worse than both high-fidelity pointing and the VR controller. Overall, target-selection performance was found to be worse at depth planes closer to the maximum arm's reach, as compared to the middle and nearer distances.
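The ISO 9241-9 task mentioned above indexes each amplitude/width pairing by its Fitts' law index of difficulty. The sketch below computes the Shannon formulation of that index for a hypothetical 3×3 design; the abstract does not report the actual amplitude and width values used, so the numbers here are illustrative only.

```python
# Sketch: Fitts' law index of difficulty (Shannon formulation), the
# standard difficulty measure for the ISO 9241-9 multidirectional
# tapping task. The 3x3 amplitude/width values are hypothetical;
# the abstract does not report the values the study used.

import math

def index_of_difficulty(amplitude, width):
    """Shannon formulation: ID = log2(A / W + 1), in bits."""
    return math.log2(amplitude / width + 1)

amplitudes = [0.20, 0.30, 0.40]  # target distances (m), hypothetical
widths     = [0.02, 0.04, 0.08]  # target diameters (m), hypothetical

for a in amplitudes:
    for w in widths:
        print(f"A={a:.2f} W={w:.2f} "
              f"ID={index_of_difficulty(a, w):.2f} bits")
```

Crossing 3 amplitudes with 3 widths yields nine difficulty levels; larger amplitudes and smaller widths both raise the index, which is why the widest spread of A/W ratios is typically chosen.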