Title: Virtual Reality for Evaluating Prosthetic Hand Control Strategies: A Preliminary Report
Improving prosthetic hand functionality is critical to reducing abandonment rates and improving amputees' quality of life. Techniques such as joint force estimation and gesture recognition using myoelectric signals could enable more realistic control of the prosthetic hand. To accelerate the translation of these advanced control strategies from lab to clinic, we created a virtual prosthetic control environment that enables rich user interactions and dexterity evaluation. The virtual environment consists of two parts: a Unity scene for rendering and user interaction, and a Python back-end that supports accurate physics simulation and communication with control algorithms. By utilizing the built-in tracking capabilities of a virtual reality headset, the user can visualize and manipulate a virtual hand without additional motion tracking setups. In the virtual environment, we demonstrate actuation of the prosthetic hand through decoded EMG signal streaming, hand tracking, and the use of a VR controller. By providing a flexible platform to investigate different control modalities, we believe that our virtual environment will allow for faster experimentation and further progress toward clinical translation.
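As a rough illustration of the Unity/Python split described above, the sketch below shows how a Python back-end might stream decoded per-joint commands to a Unity listener over UDP. The port, packet format, joint names, and placeholder decoder are our own assumptions for illustration, not the interface reported in the paper.

    # Minimal sketch: stream decoded EMG joint commands from a Python back-end
    # to a Unity scene over UDP (names, port, and format are hypothetical).
    import json
    import socket
    import time

    UNITY_ADDRESS = ("127.0.0.1", 9000)  # assumed address of the Unity listener

    def decode_emg(emg_window):
        """Placeholder decoder; returns per-joint flexion values in [0, 1]."""
        # A real decoder (e.g., joint force estimation or gesture recognition)
        # would map the EMG window to joint commands here.
        return {"thumb": 0.2, "index": 0.8, "middle": 0.8, "ring": 0.1, "pinky": 0.1}

    def stream_commands(emg_windows, rate_hz=60):
        """Send one JSON command packet per EMG window at a fixed rate."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        for window in emg_windows:
            packet = json.dumps(decode_emg(window)).encode("utf-8")
            sock.sendto(packet, UNITY_ADDRESS)
            time.sleep(1.0 / rate_hz)

On the Unity side, a corresponding UDP listener script would parse each packet and drive the virtual hand's joint targets.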
Award ID(s):
1847319 2106747
PAR ID:
10319954
Author(s) / Creator(s):
Date Published:
Journal Name:
2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC)
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Motion tracking interfaces are intuitive for free-form teleoperation tasks. However, efficient manipulation control can be difficult with such interfaces because of issues such as interference from unintended motions and the limited precision of human motion control. This limited control efficiency reduces the operator's performance and increases their workload and frustration during robot teleoperation. To improve efficiency, we proposed separating the controlled degrees of freedom (DoFs) and adjusting the motion scaling ratio of a motion tracking interface. The motion tracking of handheld controllers from a Virtual Reality system was used for the interface. We separated translational and rotational control into: 1) two controllers held in the dominant and non-dominant hands and 2) hand pose tracking and trackpad inputs of a single controller. We scaled the control mapping ratio based on 1) the environmental constraints and 2) the teleoperator's control speed. We further conducted a user study to investigate the effectiveness of the proposed methods in increasing efficiency. Our results show that separating position and orientation control across two controllers and the environment-based scaling method perform better than their alternatives.
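The speed-based scaling idea in the entry above can be illustrated with a short, hypothetical Python sketch: the mapping ratio between the operator's hand motion and the commanded robot motion shrinks when the hand moves slowly (fine manipulation) and grows when it moves quickly (gross repositioning). The thresholds, gains, and function names below are assumptions, not the authors' implementation.

    # Hypothetical speed-based motion scaling for a teleoperation interface.
    import numpy as np

    def speed_based_scale(hand_velocity, v_fine=0.05, v_coarse=0.5,
                          scale_min=0.2, scale_max=1.0):
        """Map measured hand speed (m/s) to a motion scaling ratio."""
        speed = float(np.linalg.norm(hand_velocity))
        # Interpolate linearly between the fine and coarse regimes,
        # clamping outside the [v_fine, v_coarse] band.
        t = np.clip((speed - v_fine) / (v_coarse - v_fine), 0.0, 1.0)
        return scale_min + t * (scale_max - scale_min)

    def scaled_translation(delta_hand_position, hand_velocity):
        """Scale an incremental hand translation before sending it to the robot."""
        return speed_based_scale(hand_velocity) * np.asarray(delta_hand_position)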
  2. Electromyogram (EMG)-controlled prosthetic hands have advanced significantly during the past two decades. However, most of the currently available prosthetic hands fail to replicate human hand functionality and controllability. To measure how well a prosthetic hand emulates the human hand, it is important to evaluate its functional characteristics. Moreover, incorporating feedback from end users during clinical testing is crucial for the precise assessment of a prosthetic hand. The work reported in this manuscript presents the functional characteristics of an EMG-CoNtrolled PRosthetIC Hand called ENRICH. ENRICH is a real-time EMG-controlled prosthetic hand that can grasp objects in 250.8 ± 1.1 ms, fulfilling the neuromuscular constraint of a human hand. ENRICH is evaluated in comparison to 26 laboratory prototypes and 10 commercial variants of prosthetic hands. The hand was evaluated in terms of size, weight, operation time, weight lifting capacity, finger joint range of motion, control strategy, degrees of freedom, grasp force, and clinical testing. The box and block test and the pick and place test showed ENRICH's functionality and controllability. The functional evaluation reveals that ENRICH has the potential to restore functionality to hand amputees, improving their quality of life.
  3. Numerous applications of Virtual Reality (VR) and Augmented Reality (AR) continue to emerge. However, many of the current mechanisms for providing input in those environments still require the user to perform actions (e.g., press a number of buttons, tilt a stick) that are not natural or intuitive. It would be desirable to enable users of 3D virtual environments to use natural hand gestures to interact with the environments. The implementation of a glove capable of tracking the movement and configuration of a user's hand has been pursued by multiple groups in the past. One of the most recent approaches consists of tracking the motion of the hand and fingers using miniature sensor modules with magnetic and inertial sensors. Unfortunately, the limited quality of the signals from those sensors and the frequent deviation from the assumptions made in the design of their operation have prevented the implementation of a tracking glove able to achieve high performance and large-scale acceptance. This paper describes our development of a proof-of-concept glove that incorporates motion sensors and a signal processing algorithm designed to maintain high tracking performance even in locations that are challenging for these sensors (e.g., where the geomagnetic field is distorted by nearby ferromagnetic objects). We describe the integration of the required components, the rationale and outline of the tracking algorithms, and the virtual reality environment in which the tracking results drive the movements of a hand model. We also describe the protocol that will be used to evaluate the performance of the glove.
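One common way to cope with the kind of magnetic distortion mentioned above is to gate or downweight the magnetometer whenever the measured field magnitude departs from the expected geomagnetic strength, relying on gyro integration with accelerometer tilt correction in the meantime. The sketch below illustrates that general idea only; it is not the paper's algorithm, and all names and thresholds are assumptions.

    # Hypothetical magnetometer gating for orientation tracking under distortion.
    import numpy as np

    EXPECTED_FIELD_UT = 50.0   # nominal geomagnetic field magnitude (microtesla)
    FIELD_TOLERANCE_UT = 10.0  # accept magnetometer only within this band

    def magnetometer_trustworthy(mag_sample_ut):
        """Return True if the measured field magnitude looks undistorted."""
        magnitude = float(np.linalg.norm(mag_sample_ut))
        return abs(magnitude - EXPECTED_FIELD_UT) < FIELD_TOLERANCE_UT

    def heading_correction_weight(mag_sample_ut, base_weight=0.02):
        """Blend weight for the magnetometer heading correction in a fusion filter."""
        return base_weight if magnetometer_trustworthy(mag_sample_ut) else 0.0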
  4. A primary goal of the Virtual Reality (VR) community is to build fully immersive and presence-inducing environments with seamless and natural interactions. To reach this goal, researchers are investigating how to best use our hands directly to interact with a virtual environment using hand tracking. Most studies in this field require participants to perform repetitive tasks. In this article, we investigate whether the results of such studies translate into a real application and game-like experience. We designed a virtual escape room in which participants interact with various objects to gather clues and complete puzzles. In a between-subjects study, we examine the effects of two input modalities (controllers vs. hand tracking) and two grasping visualizations (continuously tracked hands vs. virtual hands that disappear when grasping) on ownership, realism, efficiency, enjoyment, and presence. Our results show that ownership, realism, enjoyment, and presence increased when using hand tracking compared to controllers. Visualizing the tracked hands during grasps led to higher ratings on one of our ownership questions and one of our enjoyment questions compared to having the virtual hands disappear during grasps, as is common in many applications. We also confirm some of the main results of two studies that used a repetitive design, in a more realistic gaming scenario that might be closer to a typical user experience.