Motion tracking interfaces are intuitive for free-form teleoperation tasks. However, efficient manipulation control can be difficult with such interfaces because of issues such as interference from unintended motions and the limited precision of human motion control. This limited control efficiency reduces the operator's performance and increases their workload and frustration during robot teleoperation. To improve efficiency, we propose separating the controlled degrees of freedom (DoFs) and adjusting the motion scaling ratio of a motion tracking interface. The interface uses the motion tracking of handheld controllers from a Virtual Reality system. We separated translational and rotational control into: 1) two controllers held in the dominant and non-dominant hands and 2) hand pose tracking and trackpad inputs on a single controller. We scaled the control mapping ratio based on 1) environmental constraints and 2) the teleoperator's control speed. We then conducted a user study to investigate how effectively the proposed methods increase efficiency. Our results show that separating position and orientation control across two controllers, as well as the environment-based scaling method, outperforms the alternatives.
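The speed-based scaling idea can be illustrated with a minimal sketch: the ratio between hand motion and commanded robot motion grows with the operator's hand speed, damping slow, careful motions for precision and amplifying fast ones for coverage. All thresholds, gains, and function names below are illustrative assumptions, not the paper's actual mapping.

```python
import numpy as np

def scaled_translation(delta_hand, speed,
                       k_min=0.5, k_max=2.0, v_low=0.05, v_high=0.5):
    """Map a controller displacement (m) to a robot end-effector displacement.

    The scaling ratio interpolates linearly from k_min at hand speeds below
    v_low (m/s) up to k_max at speeds above v_high. All constants here are
    hypothetical placeholders.
    """
    t = np.clip((speed - v_low) / (v_high - v_low), 0.0, 1.0)
    k = k_min + t * (k_max - k_min)   # ratio grows with control speed
    return k * np.asarray(delta_hand)

# Slow motion is attenuated; fast motion is amplified.
slow = scaled_translation([0.01, 0.0, 0.0], speed=0.05)   # ratio 0.5
fast = scaled_translation([0.01, 0.0, 0.0], speed=0.5)    # ratio 2.0
```

An environment-based variant would instead choose `k` from workspace constraints (e.g., a smaller ratio near obstacles), but the structure of the mapping is the same.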
Virtual Reality for Evaluating Prosthetic Hand Control Strategies: A Preliminary Report
Improving prosthetic hand functionality is critical for reducing abandonment rates and improving amputees' quality of life. Techniques such as joint force estimation and gesture recognition using myoelectric signals could enable more realistic control of the prosthetic hand. To accelerate the translation of these advanced control strategies from lab to clinic, we created a virtual prosthetic control environment that enables rich user interactions and dexterity evaluation. The virtual environment consists of two parts: a Unity scene for rendering and user interaction, and a Python back-end that supports accurate physics simulation and communication with control algorithms. By using the built-in tracking capabilities of a virtual reality headset, the user can visualize and manipulate a virtual hand without an additional motion tracking setup. In the virtual environment, we demonstrate actuation of the prosthetic hand through decoded EMG signal streaming, hand tracking, and the use of a VR controller. By providing a flexible platform for investigating different control modalities, we believe our virtual environment will enable faster experimentation and further progress toward clinical translation.
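A common way to connect a Python back-end to a Unity scene, as in the architecture described above, is to stream decoded joint commands over a local socket. The sketch below shows one such link with a loopback receiver standing in for the Unity listener; the message schema, port handling, and joint names are illustrative assumptions, not the paper's actual interface.

```python
import json
import socket

def send_hand_pose(sock, addr, joint_angles):
    """Serialize a dict of joint name -> angle (degrees) as JSON and send
    one UDP datagram to the rendering front-end."""
    sock.sendto(json.dumps({"joints": joint_angles}).encode("utf-8"), addr)

# Loopback demo: this receiver plays the role of the Unity scene.
listener = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
listener.bind(("127.0.0.1", 0))          # let the OS pick a free port
addr = listener.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_hand_pose(sender, addr, {"index_mcp": 45.0, "thumb_cmc": 20.0})

pose = json.loads(listener.recv(1024).decode("utf-8"))
```

In a real deployment the angles would come from the EMG decoder each frame, and the Unity side would apply them to the virtual hand's joint transforms.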
- PAR ID: 10319954
- Journal Name: 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC)
- Sponsoring Org: National Science Foundation
More Like this
Numerous applications of Virtual Reality (VR) and Augmented Reality (AR) continue to emerge. However, many of the current mechanisms for providing input in those environments still require the user to perform actions (e.g., pressing buttons, tilting a stick) that are neither natural nor intuitive. It would be desirable to enable users of 3D virtual environments to interact with them through natural hand gestures. The implementation of a glove capable of tracking the movement and configuration of a user's hand has been pursued by multiple groups in the past. One of the most recent approaches tracks the motion of the hand and fingers using miniature sensor modules with magnetic and inertial sensors. Unfortunately, the limited quality of the signals from those sensors and frequent deviations from the assumptions made in the design of their operation have prevented the implementation of a tracking glove that achieves high performance and large-scale acceptance. This paper describes our development of a proof-of-concept glove that incorporates motion sensors and a signal processing algorithm designed to maintain high tracking performance even in locations that are challenging for these sensors (e.g., where the geomagnetic field is distorted by nearby ferromagnetic objects). We describe the integration of the required components, the rationale and outline of the tracking algorithms, and the virtual reality environment in which the tracking results drive the movements of a hand model. We also describe the protocol that will be used to evaluate the performance of the glove.
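One standard way to fuse magnetic and inertial measurements, as in the problem setting above, is a complementary filter that blends gyroscope integration with an absolute reference, distrusting the magnetometer when the measured field magnitude deviates from the expected geomagnetic strength (a simple proxy for nearby ferromagnetic distortion). The gains and thresholds below are illustrative assumptions, not the authors' algorithm.

```python
def update_heading(heading, gyro_rate, mag_heading, dt, mag_norm,
                   expected_norm=50.0, alpha=0.98):
    """One complementary-filter step for a heading angle in degrees.

    heading     : previous estimate (deg)
    gyro_rate   : angular rate from the gyroscope (deg/s)
    mag_heading : absolute heading derived from the magnetometer (deg)
    mag_norm    : measured field magnitude (uT); expected_norm ~50 uT
    """
    # If the field magnitude looks distorted, skip the magnetic correction
    # for this step and rely on gyro integration alone.
    if abs(mag_norm - expected_norm) > 10.0:
        alpha = 1.0
    predicted = heading + gyro_rate * dt          # dead-reckoning prediction
    return alpha * predicted + (1.0 - alpha) * mag_heading

clean = update_heading(0.0, 10.0, 5.0, 0.1, mag_norm=50.0)      # blended
distorted = update_heading(0.0, 10.0, 5.0, 0.1, mag_norm=80.0)  # gyro only
```

A production filter would track full 3D orientation (e.g., as a quaternion) and adapt the gain continuously, but the gating idea is the same.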
-
A primary goal of the Virtual Reality (VR) community is to build fully immersive, presence-inducing environments with seamless and natural interactions. To reach this goal, researchers are investigating how best to use our hands directly to interact with a virtual environment through hand tracking. Most studies in this field require participants to perform repetitive tasks. In this article, we investigate whether the results of such studies translate into a real application and game-like experience. We designed a virtual escape room in which participants interact with various objects to gather clues and complete puzzles. In a between-subjects study, we examine the effects of two input modalities (controllers vs. hand tracking) and two grasping visualizations (continuously tracked hands vs. virtual hands that disappear when grasping) on ownership, realism, efficiency, enjoyment, and presence. Our results show that ownership, realism, enjoyment, and presence increased with hand tracking compared to controllers. Visualizing the tracked hands during grasps led to higher ratings on one of our ownership questions and one of our enjoyment questions compared to having the virtual hands disappear during grasps, as is common in many applications. We also confirm some of the main results of two studies with a repetitive design in a more realistic gaming scenario that may be closer to a typical user experience.
-
In this work, we investigate the influence of different visualizations on a manipulation task in virtual reality (VR). Without the haptic feedback of the real world, grasping in VR may result in intersections with virtual objects. Because people are highly sensitive to perceived collisions, it may look more appealing to avoid intersections and visualize non-colliding hand motions. However, correcting the position of the hand or fingers introduces a visual-proprioceptive discrepancy and must be used with caution. Furthermore, the lack of haptic feedback in the virtual world may result in slower actions, as users may not know exactly when a grasp has occurred. This reduced performance could be remedied with adequate visual feedback. In this study, we analyze the performance, level of ownership, and user preference of eight visual feedback techniques for virtual grasping. Three techniques show the tracked hand (with or without grasping feedback), even if it intersects with the grasped object. Another three display a hand without intersections with the object, called the outer hand, simulating the look of a real-world interaction. One visualization is a compromise between the two groups, showing both a primary outer hand and a secondary tracked hand. In the last visualization, the hand disappears during the grasping activity. In an experiment, users performed a pick-and-place task with each feedback technique. We used high-fidelity marker-based hand tracking to control the virtual hands in real time. We found that the tracked-hand visualizations resulted in better performance; however, the outer-hand visualizations were preferred. We also found indications that ownership is higher with the outer-hand visualizations.