

Title: How Important are Detailed Hand Motions for Communication for a Virtual Character Through the Lens of Charades?
Detailed hand motions play an important role in face-to-face communication: they emphasize points, describe objects, clarify concepts, or replace words altogether. While shared virtual reality (VR) spaces are becoming more popular, in most cases these spaces do not capture and display accurate hand motions. In this paper, we investigate the consequences of such errors in hand and finger motions on comprehension, character perception, social presence, and user comfort. We conduct three perceptual experiments in which participants guess words and movie titles based on motion-captured movements. We introduce errors and alterations to the hand movements and apply techniques to synthesize or correct hand motions. We collect data from more than 1000 Amazon Mechanical Turk participants in two large experiments and conduct a third experiment in VR. As results might differ depending on the virtual character used, we investigate all effects on two virtual characters of different levels of realism. We furthermore investigate the effects of clip length in our experiments. Among other results, we show that the absence of finger motion significantly reduces comprehension and negatively affects people's perception of a virtual character and their social presence. Adding some hand motions, even random ones, attenuates some of these effects on the perception of the virtual character and on social presence, but it does not necessarily improve comprehension. Slightly inaccurate or erroneous hand motions are sufficient to achieve the same level of comprehension as accurate hand motions; they might, however, still affect the viewers' impression of a character. Finally, jittering hand motions should be avoided as they significantly decrease user comfort.
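To make the error conditions concrete, here is a minimal sketch of how absent, jittering, and random finger motion could be produced from captured joint-angle data. The function names and the array layout are illustrative assumptions; the paper's actual synthesis and correction techniques are not shown here.

```python
import numpy as np

# Assumed layout: `angles` holds finger joint rotations as Euler angles in
# radians, shape (frames, joints, 3). Purely illustrative, not the paper's code.

def freeze_fingers(angles):
    """'No finger motion' condition: hold all joints at the first frame's pose."""
    return np.repeat(angles[:1], len(angles), axis=0)

def add_jitter(angles, sigma=0.05, seed=0):
    """'Jittering' condition: independent per-frame Gaussian noise (radians)."""
    rng = np.random.default_rng(seed)
    return angles + rng.normal(0.0, sigma, size=angles.shape)

def random_motion(angles, sigma=0.3, seed=0):
    """'Random motion' condition: replace the data with smoothed noise around
    the first frame's pose, so the hand moves but carries no meaning."""
    rng = np.random.default_rng(seed)
    noise = rng.normal(0.0, sigma, size=(len(angles), angles[0].size))
    kernel = np.ones(15) / 15.0  # moving average over time for smoothness
    smooth = np.apply_along_axis(
        lambda column: np.convolve(column, kernel, mode="same"), 0, noise)
    return angles[:1] + smooth.reshape(angles.shape)
```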
Award ID(s):
1652210
PAR ID:
10394435
Author(s) / Creator(s):
Date Published:
Journal Name:
ACM Transactions on Graphics
ISSN:
0730-0301
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Recent developments in the commercialization of virtual reality open up many opportunities for enhancing human interaction with three-dimensional objects and visualizations. Spherical visualizations allow for convenient exploration of certain types of data. Our tangible sphere, exactly aligned with the spherical visualization shown in VR, implements a very natural way of interacting and utilizes senses and skills trained in the real world. In a lab study, we investigate the effects of the perception of actually holding a virtual spherical visualization in one's hands. As use cases, we focus on surface visualizations that benefit from or require a rounded shape. We compared the usage of two differently sized acrylic glass spheres to a related interaction technique that uses VR controllers as proxies. On the one hand, our work is motivated by the ability to create in VR a tangible, lightweight, handheld spherical display that can hardly be realized in reality. On the other hand, gaining insights into the impact of a fully tangible embodiment of a virtual object on task performance, comprehension of patterns, and user behavior is important in its own right. After describing the implementation, we discuss the advantages and disadvantages of our approach, taking into account different handheld spherical displays utilizing outside and inside projection.
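The exact alignment between the acrylic sphere and its virtual counterpart can be pictured as a per-frame pose copy from a tracker onto the rendered sphere. A minimal sketch; the tracker query and scene structure are hypothetical stand-ins, not the authors' implementation:

```python
import numpy as np

def read_tracker_pose():
    """Hypothetical stand-in for querying the motion-tracked acrylic sphere.
    Returns position (meters, 3-vector) and orientation (unit quaternion wxyz)."""
    return np.zeros(3), np.array([1.0, 0.0, 0.0, 0.0])

def update_virtual_sphere(virtual_sphere, calibration_offset):
    """Copy the physical sphere's pose onto the virtual sphere each frame,
    so the rendered visualization stays registered with the object in hand."""
    position, orientation = read_tracker_pose()
    virtual_sphere["position"] = position + calibration_offset
    virtual_sphere["orientation"] = orientation
```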
  2. In virtual reality (VR), established perception–action relationships break down because of conflicting and ambiguous sensorimotor inputs, inducing underestimation of walking velocity. Here, we explore the effects of realigning perceptual sensory experiences with physical movements via augmented feedback on the estimation of virtual speed. We hypothesized that providing feedback about speed would lead to concurrent perceptual improvements and that these alterations would persist once the speedometer was removed. Ten young adults used immersive VR to view a virtual hallway translating at a series of fixed speeds. Participants were tasked with matching their walking speed on a self-paced treadmill to the optic flow in the environment. During augmented feedback trials, information about walking-speed accuracy was provided via a real-time speedometer. We measured the resulting walking velocity errors as well as kinematic gait parameters. We found that the concordance between the virtual environment and gait speeds was higher when augmented feedback was provided during the trial. Furthermore, we observed retention effects beyond the intervention period: errors in speed perception remained smaller and the concordance between perceived and actual speeds remained stronger. Together, these results highlight a potential role for augmented feedback in guiding gait strategies that deviate from predefined internal models of locomotion.
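A minimal sketch of the feedback signal such a real-time speedometer could present, assuming the treadmill speed and the optic-flow (hallway) speed are both available each frame; the names are illustrative, not the study's implementation:

```python
def speed_feedback(walking_speed_mps, optic_flow_speed_mps, tolerance=0.05):
    """Return a signed error and a coarse cue for an on-screen speedometer.

    walking_speed_mps:    participant's current treadmill speed (m/s)
    optic_flow_speed_mps: translation speed of the virtual hallway (m/s)
    tolerance:            dead band within which speeds count as matched
    """
    error = walking_speed_mps - optic_flow_speed_mps
    if abs(error) <= tolerance:
        cue = "matched"
    elif error > 0:
        cue = "slow down"
    else:
        cue = "speed up"
    return error, cue

# Example: walking 1.15 m/s against a 1.30 m/s hallway -> (-0.15, "speed up").
print(speed_feedback(1.15, 1.30))
```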
  3. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real time with 3.3% mean error and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings pushing research towards more realistic physicality in future VR/AR.
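The decoding step can be pictured as a small regression network mapping a window of multi-channel forearm EMG to per-finger forces. A minimal PyTorch-style sketch under assumed dimensions (8 EMG channels, 200-sample windows, 5 fingers); the paper's actual architecture is not reproduced here:

```python
import torch
import torch.nn as nn

CHANNELS, WINDOW, FINGERS = 8, 200, 5  # assumed dimensions, not from the paper

class EMGForceDecoder(nn.Module):
    """Toy regressor: windowed EMG -> finger-wise force estimates."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # pool features over the time window
            nn.Flatten(),
            nn.Linear(32, FINGERS),    # one force value per finger
        )

    def forward(self, emg):            # emg: (batch, CHANNELS, WINDOW)
        return self.net(emg)

# Usage: decode forces for a batch of 4 EMG windows -> tensor of shape (4, 5).
model = EMGForceDecoder()
forces = model(torch.randn(4, CHANNELS, WINDOW))
```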
  4. Computer-mediated collaboration has long been a core research interest in CSCW and HCI. As online social spaces continue to evolve towards more immersive and higher-fidelity experiences, more research is needed to investigate how emerging novel technology may foster and support new and more nuanced forms and experiences of collaboration in virtual environments. Drawing on 30 interviews, this paper focuses on what people collaborate on and how they collaborate in social Virtual Reality (VR). We broaden current studies on computer-mediated collaboration by highlighting three things that shape how collaboration is experienced and conceptualized in emerging virtual environments: embodiment for co-presence and communication, the replication of offline collaborative activities, and the seamless interplay of work, play, and mundane experiences in everyday life. We also propose design implications that could further support everyday collaborative activities in social VR.
  5. The goal of this research is to provide much-needed empirical data on how the fidelity of popular hand-gesture-tracked pointing metaphors, versus commodity controller-based input, affects the efficiency and speed-accuracy tradeoff of users' spatial selection in personal-space interactions in VR. We conduct two experiments in which participants select spherical targets arranged in a circle in personal space, i.e., the near field within their maximum arm's reach, in VR. Both experiments required participants to select the targets with either a VR controller or with their dominant hand's index finger, which was tracked with one of two popular contemporary tracking methods. In the first experiment, the targets are arranged in a flat circle in accordance with the ISO 9241-9 Fitts' law standard, and the simulation selects random combinations of 3 target amplitudes and 3 target widths. Targets were centered around the users' eye level, and the arrangement was placed at either 60%, 75%, or 90% of the depth of the users' maximum arm's reach. In experiment 2, the targets varied randomly in depth from one depth plane to another within the same configuration of 13 targets within a trial set, resembling button selection in hierarchical menus at differing depth planes in the near field. The study was conducted using the HTC Vive head-mounted display with one of three input conditions: a VR controller (HTC Vive), low-fidelity virtual pointing (Leap Motion), or high-fidelity virtual pointing (a tracked VR glove). Our results revealed that low-fidelity pointing performed worse than both high-fidelity pointing and the VR controller. Overall, target selection performance was worse in depth planes closer to the maximum arm's reach than at middle and nearer distances.
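For reference, the ISO 9241-9 analysis that the first experiment follows scores each amplitude/width combination by its Shannon index of difficulty and derives throughput from movement time. A minimal sketch of the standard formulation (not code from the study):

```python
import math

def index_of_difficulty(amplitude, width):
    """Shannon formulation used by ISO 9241-9: ID = log2(A / W + 1), in bits."""
    return math.log2(amplitude / width + 1)

def throughput(amplitude, width, movement_time_s):
    """Throughput in bits/s for one amplitude-width condition."""
    return index_of_difficulty(amplitude, width) / movement_time_s

# Example: a 0.30 m reach to a 0.05 m target completed in 0.9 s.
print(index_of_difficulty(0.30, 0.05))  # ~2.81 bits
print(throughput(0.30, 0.05, 0.9))      # ~3.12 bits/s
```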