
Title: Virtual Reality for Evaluating Prosthetic Hand Control Strategies: A Preliminary Report
Improving prosthetic hand functionality is critical to reducing abandonment rates and improving the amputee's quality of life. Techniques such as joint force estimation and gesture recognition using myoelectric signals could enable more realistic control of the prosthetic hand. To accelerate the translation of these advanced control strategies from lab to clinic, we created a virtual prosthetic control environment that enables rich user interactions and dexterity evaluation. The virtual environment has two parts: a Unity scene for rendering and user interaction, and a Python back-end that supports accurate physics simulation and communication with control algorithms. By utilizing the built-in tracking capabilities of a virtual reality headset, the user can visualize and manipulate a virtual hand without additional motion-tracking setups. In the virtual environment, we demonstrate actuation of the prosthetic hand through decoded EMG signal streaming, hand tracking, and the use of a VR controller. By providing a flexible platform to investigate different control modalities, we believe that our virtual environment will allow for faster experimentation and further progress in clinical translation.
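The abstract describes a Python back-end that communicates decoded EMG commands to the Unity front-end. As a minimal sketch of how such streaming could work, the snippet below sends per-joint flexion values as JSON datagrams over UDP. The wire protocol, joint names, and port are assumptions for illustration; the paper does not specify them.

```python
import json
import socket

# Hypothetical message format: per-joint flexion values in [0, 1].
JOINTS = ["thumb", "index", "middle", "ring", "pinky"]

def encode_hand_command(flexions):
    """Pack decoded per-joint flexion values into a JSON datagram."""
    if len(flexions) != len(JOINTS):
        raise ValueError("expected one flexion value per joint")
    # Clamp to [0, 1] so a noisy decoder cannot command impossible poses.
    clipped = [min(1.0, max(0.0, f)) for f in flexions]
    return json.dumps(dict(zip(JOINTS, clipped))).encode("utf-8")

def send_hand_command(sock, flexions, addr=("127.0.0.1", 9000)):
    """Stream one decoded EMG frame to the rendering front-end over UDP."""
    sock.sendto(encode_hand_command(flexions), addr)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # One frame of decoded output; in practice this would run at the
    # decoder's update rate.
    send_hand_command(sock, [0.2, 0.8, 0.8, 0.5, 0.1])
```

A Unity-side listener would deserialize each datagram and drive the corresponding finger joints of the virtual hand each frame.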
Award ID(s): 1847319, 2106747
Journal Name: 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC)
Sponsoring Org: National Science Foundation
More Like this
  1. Motion tracking interfaces are intuitive for free-form teleoperation tasks. However, efficient manipulation control can be difficult with such interfaces because of issues like the interference of unintended motions and the limited precision of human motion control. The limitation in control efficiency reduces the operator's performance and increases their workload and frustration during robot teleoperation. To improve the efficiency, we proposed separating controlled degrees of freedom (DoFs) and adjusting the motion scaling ratio of a motion tracking interface. The motion tracking of handheld controllers from a Virtual Reality system was used for the interface. We separated the translation and rotational control into: 1) two controllers held in the dominant and non-dominant hands and 2) hand pose tracking and trackpad inputs of a controller. We scaled the control mapping ratio based on 1) the environmental constraints and 2) the teleoperator's control speed. We further conducted a user study to investigate the effectiveness of the proposed methods in increasing efficiency. Our results show that the separation of position and orientation control into two controllers and the environment-based scaling methods perform better than their alternatives. 
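The first related work adjusts the motion-scaling ratio based on the teleoperator's control speed. A plausible sketch of such a schedule is below: slow, deliberate motion gets a small ratio for precision, fast motion passes through near 1:1, with linear interpolation in between. The thresholds and scale values are placeholders, not the paper's actual parameters.

```python
def speed_based_scale(hand_speed, slow=0.02, fast=0.25,
                      fine_scale=0.3, coarse_scale=1.0):
    """Map measured controller speed (m/s) to a motion-scaling ratio.

    Hypothetical linear schedule: speeds at or below `slow` map to
    `fine_scale` (precision mode); speeds at or above `fast` map to
    `coarse_scale` (near 1:1); in between, interpolate linearly.
    """
    if hand_speed <= slow:
        return fine_scale
    if hand_speed >= fast:
        return coarse_scale
    t = (hand_speed - slow) / (fast - slow)
    return fine_scale + t * (coarse_scale - fine_scale)
```

Each frame, the robot end-effector displacement would be the tracked controller displacement multiplied by this ratio.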
  2. A primary goal of the Virtual Reality (VR) community is to build fully immersive and presence-inducing environments with seamless and natural interactions. To reach this goal, researchers are investigating how to best directly use our hands to interact with a virtual environment using hand tracking. Most studies in this field require participants to perform repetitive tasks. In this article, we investigate if results of such studies translate into a real application and game-like experience. We designed a virtual escape room in which participants interact with various objects to gather clues and complete puzzles. In a between-subjects study, we examine the effects of two input modalities (controllers vs. hand tracking) and two grasping visualizations (continuously tracked hands vs. virtual hands that disappear when grasping) on ownership, realism, efficiency, enjoyment, and presence. Our results show that ownership, realism, enjoyment, and presence increased when using hand tracking compared to controllers. Visualizing the tracked hands during grasps leads to higher ratings in one of our ownership questions and one of our enjoyment questions compared to having the virtual hands disappear during grasps as is common in many applications. We also confirm some of the main results of two studies that have a repetitive design in a more realistic gaming scenario that might be closer to a typical user experience.
  3. Problem-solving focuses on defining and analyzing problems, then finding viable solutions through an iterative process that requires brainstorming and an understanding of what is known and what is unknown in the problem space. With rapid changes in the economic landscape of the United States, new types of jobs emerge as new industries are created. Employers report that problem-solving is the most important skill they look for in job applicants. However, there are major concerns about the lack of problem-solving skills in engineering students, which calls for an approach to measure and enhance these skills. In this research, we propose to understand and improve problem-solving skills in engineering education by integrating eye-tracking sensing with virtual reality (VR) manufacturing. First, we simulate a manufacturing system in a VR game environment that we call a VR learning factory. The VR learning factory is built in the Unity game engine with the HTC Vive VR system for navigation and motion tracking. The headset is custom-fitted with Tobii eye-tracking technology, allowing the system to identify the coordinates and objects that a user is looking at at any given time during the simulation. In the environment, engineering students see through the headset a virtual manufacturing environment composed of a series of workstations and can interact with workpieces in the virtual environment. For example, a student can pick up virtual plastic bricks and assemble them using the wireless controller in hand. Second, engineering students are asked to design and assemble car toys that satisfy predefined customer requirements while minimizing the total cost of production. Third, data-driven models are developed to analyze the eye-movement patterns of engineering students.
For instance, problem-solving skills are measured by the extent to which the eye-movement patterns of engineering students are similar to those of a subject matter expert (SME), an ideal person who sets the expert criterion for the car toy assembly process. Benchmark experiments are conducted with a comprehensive set of performance metrics such as cycle time, the number of station switches, weight, price, and quality of car toys. Experimental results show that eye-tracking modeling is efficient and effective for measuring the problem-solving skills of engineering students. The proposed VR learning factory was integrated into undergraduate manufacturing courses to enhance student learning and problem-solving skills.
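The abstract above measures problem-solving skill as the similarity between a student's eye-movement pattern and an expert's. One common way to compare scan paths, shown as a hedged sketch below, is to reduce each recording to a sequence of area-of-interest (AOI) labels and score edit-distance similarity; the paper's actual data-driven models are not specified, so this is only an illustrative stand-in.

```python
def levenshtein(a, b):
    """Edit distance between two AOI (area-of-interest) label sequences."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        cur = [i]
        for j, y in enumerate(b, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (x != y)))  # substitution
        prev = cur
    return prev[-1]

def gaze_similarity(student_seq, expert_seq):
    """Similarity in [0, 1]: 1.0 means identical scan paths."""
    if not student_seq and not expert_seq:
        return 1.0
    d = levenshtein(student_seq, expert_seq)
    return 1.0 - d / max(len(student_seq), len(expert_seq))
```

With fixations binned into workstation AOIs, `gaze_similarity(student, sme)` would give a single score per trial that can be tracked across the benchmark metrics listed above.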
  4. In this work, we investigate the influence of different visualizations on a manipulation task in virtual reality (VR). Without the haptic feedback of the real world, grasping in VR might result in intersections with virtual objects. As people are highly sensitive when it comes to perceiving collisions, it might look more appealing to avoid intersections and visualize non-colliding hand motions. However, correcting the position of the hand or fingers results in a visual-proprioceptive discrepancy and must be used with caution. Furthermore, the lack of haptic feedback in the virtual world might result in slower actions, as a user might not know exactly when a grasp has occurred. This reduced performance could be remediated with adequate visual feedback. In this study, we analyze the performance, level of ownership, and user preference of eight different visual feedback techniques for virtual grasping. Three techniques show the tracked hand (with or without grasping feedback), even if it intersects with the grasped object. Another three techniques display a hand without intersections with the object, called the outer hand, simulating the look of a real-world interaction. One visualization is a compromise between the two groups, showing both a primary outer hand and a secondary tracked hand. Finally, in the last visualization the hand disappears during the grasping activity. In an experiment, users perform a pick-and-place task for each feedback technique. We use high-fidelity marker-based hand tracking to control the virtual hands in real time. We found that the tracked hand visualizations result in better performance; however, the outer hand visualizations were preferred. We also found indications that ownership is higher with the outer hand visualizations.
  5. Although beginning to emerge, multiarticulate upper limb prostheses for children remain sparse despite the continued advancement of mechatronic technologies that have benefited adults with upper limb amputations. Upper limb prosthesis research is primarily focused on adults, even though rates of pediatric prosthetic abandonment far surpass those seen in adults. The implicit goal of a prosthesis is to provide effective functionality while promoting healthy social interaction. Yet most current pediatric devices offer a single degree of freedom open/close grasping function, a stark departure from the multiple grasp configurations provided in advanced adult devices. Although comparable child-sized devices are on the clinical horizon, understanding how to effectively translate these technologies to the pediatric population is vital. This includes exploring grasping movements that may provide the most functional benefits and techniques to control the newly available dexterity. Currently, no dexterous pediatric research platforms exist that offer open access to hardware and programming to facilitate the investigation and provision of multi-grasp function. Our objective was to deliver a child-sized multi-grasp prosthesis that may serve as a robust research platform. In anticipation of an open-source release, we performed a comprehensive set of benchtop and functional tests with common household objects to quantify the performance of our device. This work discusses and evaluates our pediatric-sized multiarticulate prosthetic hand that provides 6 degrees of actuation, weighs 177 g, and was designed specifically for ease of implementation in a research or clinical-research setting. Through the benchtop and validated functional tests, the pediatric hand produced grasping forces ranging from 0.424 to 7.216 N and was found to be comparable to the functional capabilities of similar adult devices.
As mechatronic technologies advance and multiarticulate prostheses continue to evolve, translating many of these emerging technologies may help provide children with more useful and functional prosthesis options. Effective translation will inevitably require a solid scientific foundation to inform how best to prescribe advanced prosthetic devices and control systems for children. This work begins addressing these current gaps by providing a much-needed research platform with supporting data to facilitate its use in laboratory and clinical research settings. 