This content will become publicly available on June 3, 2025

Title: A Multimodal Assistive-Robotic-Arm Control System to Increase Independence After Tetraplegia
Following tetraplegia, independence for completing essential daily tasks, such as opening doors and eating, significantly declines. Assistive robotic manipulators (ARMs) could restore independence, but input devices for these manipulators typically require functional use of the hands. We created and validated a hands-free multimodal input system for controlling an ARM in virtual reality using combinations of a gyroscope, eye-tracking, and heterologous surface electromyography (sEMG). These input modalities are mapped to ARM functions based on the user's preferences and to maximize the utility of their residual volitional capabilities following tetraplegia. The two participants in this study with tetraplegia preferred the control mapping with sEMG button functions and disliked winking commands. Non-disabled participants were more varied in their preferences and performance, further suggesting that customizability is an advantageous component of the control system. Replacing buttons from a traditional handheld controller with sEMG did not substantively reduce performance. The system provided all participants with adequate control to complete functional tasks in virtual reality, such as opening door handles, turning stove dials, eating, and drinking, all of which enable independence and improved quality of life for these individuals.
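The paper's implementation is not reproduced here; as a minimal sketch only, the Python snippet below illustrates the idea of a user-customizable mapping from hands-free input events (sEMG "button" contractions, eye-tracking selections) to ARM commands. All event names and functions are hypothetical.

```python
# Hypothetical sketch (not the authors' code): a user-customizable mapping from
# hands-free input events to assistive-robotic-arm (ARM) commands.
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class ControlMapping:
    """Maps symbolic input events to ARM command callbacks chosen by the user."""
    bindings: Dict[str, Callable[[], None]]

    def handle(self, event: str) -> None:
        action = self.bindings.get(event)
        if action is not None:
            action()


def open_gripper() -> None:
    print("ARM: open gripper")


def close_gripper() -> None:
    print("ARM: close gripper")


def switch_mode() -> None:
    print("ARM: switch between translation and rotation control")


# Example preference: sEMG contractions replace handheld-controller buttons,
# and an eye-tracking dwell selection switches control modes.
preferred_mapping = ControlMapping(bindings={
    "semg_left_contraction": open_gripper,
    "semg_right_contraction": close_gripper,
    "gaze_dwell_menu": switch_mode,
})

preferred_mapping.handle("semg_left_contraction")  # prints "ARM: open gripper"
```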
Award ID(s):
1901492
PAR ID:
10516547
Author(s) / Creator(s):
Publisher / Repository:
IEEE
Date Published:
Journal Name:
IEEE Transactions on Neural Systems and Rehabilitation Engineering
Volume:
32
ISSN:
1534-4320
Page Range / eLocation ID:
2124 to 2133
Subject(s) / Keyword(s):
Assistive robotic technology; electromyography; spinal cord injury; usability study
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Myoelectric control of prostheses is a long-established technique, using surface electromyography (sEMG) to detect user intention and perform subsequent mechanical actions. Most machine learning models utilized in control systems are trained using isolated movements that do not reflect the natural movements occurring during daily activities. Moreover, movements are often affected by arm postures, the duration of activities, and personal habits. It is crucial to have a control system for multi-degree-of-freedom (DoF) prosthetic arms that is trained using sEMG data collected from activities of daily living (ADL) tasks. This work focuses on two major functional wrist movements, pronation-supination and dart-throwing movement (DTM), and introduces a new wrist control system that directly maps sEMG signals to the joint velocities of the multi-DoF wrist. Additionally, a specific training strategy (Quick training) is proposed that enables the controller to be applied to new subjects and to handle situations where sensors may displace during daily living, muscles can become fatigued, or sensors can become contaminated (e.g., due to sweat). The prosthetic wrist controller is designed based on data from 24 participants and its performance is evaluated using the root mean square error (RMSE) and Pearson correlation. The results depend on the characteristics of the tasks. For example, tasks with dart-throwing motion show smaller RMSE values (Hammer: 6.68 deg/s and Cup: 7.92 deg/s) compared to tasks with pronation-supination (Bulb: 43.98 deg/s and Screw: 53.64 deg/s). The proposed control technique utilizing Quick training demonstrates a decrease in the average RMSE by 35% and an increase in the average Pearson correlation by 40% across all four ADL tasks.
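The abstract reports controller accuracy using RMSE (in deg/s) and Pearson correlation. As an illustration only, the Python sketch below shows how these two metrics could be computed for predicted versus reference joint-velocity traces; the velocity data here are synthetic, not the paper's.

```python
# Illustrative only: computing the two metrics named in the abstract (RMSE and
# Pearson correlation) for predicted vs. reference wrist joint velocities.
import numpy as np


def rmse(predicted: np.ndarray, reference: np.ndarray) -> float:
    """Root mean square error between two velocity traces."""
    return float(np.sqrt(np.mean((predicted - reference) ** 2)))


def pearson_r(predicted: np.ndarray, reference: np.ndarray) -> float:
    """Pearson correlation coefficient between two velocity traces."""
    return float(np.corrcoef(predicted, reference)[0, 1])


# Hypothetical velocity traces in deg/s (the paper's data are not reproduced here).
rng = np.random.default_rng(0)
reference = 30.0 * np.sin(np.linspace(0, 4 * np.pi, 500))  # e.g. pronation-supination
predicted = reference + rng.normal(scale=8.0, size=reference.shape)

print(f"RMSE: {rmse(predicted, reference):.2f} deg/s")
print(f"Pearson r: {pearson_r(predicted, reference):.2f}")
```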
  2. Abstract Recent immersive mixed reality (MR) and virtual reality (VR) displays enable users to use their hands to interact with both veridical and virtual environments simultaneously. Therefore, it becomes important to understand the performance of human hand-reaching movement in MR. Studies have shown that different virtual environment visualization modalities can affect point-to-point reaching performance using a stylus, but it is not yet known if these effects translate to direct human-hand interactions in mixed reality. This paper focuses on evaluating human point-to-point motor performance in MR and VR for both finger-pointing and cup-placement tasks. Six performance measures relevant to haptic interface design were measured for both tasks under several different visualization conditions (“MR with indicator,” “MR without indicator,” and “VR”) to determine what factors contribute to hand-reaching performance. A key finding was evidence of a trade-off between reaching “motion confidence” measures (indicated by throughput, number of corrective movements, and peak velocity) and “accuracy” measures (indicated by end-point error and initial movement error). Specifically, we observed that participants tended to be more confident in the “MR without Indicator” condition for finger-pointing tasks. These results contribute critical knowledge to inform the design of VR/MR interfaces based on the application's user performance requirements. 
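Among the reaching measures named above, throughput is commonly computed in the Fitts'-law style used in ISO 9241-9 pointing studies. The sketch below shows that standard formulation purely as an assumption; the paper's exact measure definitions are not reproduced here, and the reach values are made up.

```python
# Illustrative only: a standard Fitts'-law style throughput computation for a
# point-to-point reach; the paper may define its measures differently.
import math


def index_of_difficulty(distance: float, target_width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / target_width + 1.0)


def throughput(distance: float, target_width: float, movement_time_s: float) -> float:
    """Throughput in bits per second for a single reach."""
    return index_of_difficulty(distance, target_width) / movement_time_s


# Hypothetical reach: 0.30 m to a 0.03 m target, completed in 0.9 s.
print(f"{throughput(0.30, 0.03, 0.9):.2f} bits/s")
```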
  3. A primary goal of the Virtual Reality (VR) community is to build fully immersive and presence-inducing environments with seamless and natural interactions. To reach this goal, researchers are investigating how best to use our hands directly to interact with a virtual environment via hand tracking. Most studies in this field require participants to perform repetitive tasks. In this article, we investigate whether the results of such studies translate to a real application and game-like experience. We designed a virtual escape room in which participants interact with various objects to gather clues and complete puzzles. In a between-subjects study, we examine the effects of two input modalities (controllers vs. hand tracking) and two grasping visualizations (continuously tracked hands vs. virtual hands that disappear when grasping) on ownership, realism, efficiency, enjoyment, and presence. Our results show that ownership, realism, enjoyment, and presence increased when using hand tracking compared to controllers. Visualizing the tracked hands during grasps led to higher ratings on one of our ownership questions and one of our enjoyment questions compared to having the virtual hands disappear during grasps, as is common in many applications. We also confirm, in a more realistic gaming scenario that is likely closer to a typical user experience, some of the main results of two studies that used a repetitive design.
  4. Abstract In recent years, commercially available dexterous upper limb prostheses for children have begun to emerge. These devices derive control signals from surface electromyography (sEMG), a measure of the electrical activity of the affected muscles, to drive a variety of grasping motions. However, the ability of children with congenital upper limb deficiency to actuate their affected muscles for naturalistic prosthetic control is not well understood, as compared to adults or children with acquired hand loss. To address this gap, we collected sEMG data from 9 congenital one-handed participants, ages 8–20 years, as they envisioned and attempted to perform 10 different movements with their missing hands. Seven sEMG electrodes were adhered circumferentially around each participant's affected and unaffected limbs, and participants mirrored the attempted missing-hand motions with their intact side. To analyze the collected sEMG data, we used time- and frequency-domain analyses. We found that for the majority of participants, attempted hand movements produced detectable and consistent muscle activity, and the capacity to achieve this was not dissimilar across the affected and unaffected sides. These data suggest that children with congenital hand absence retain a degree of control over their affected muscles, which has important implications for translating and refining advanced prosthetic control technologies for children.
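The abstract specifies time- and frequency-domain analyses without naming the exact features; as an assumption, the sketch below computes three features commonly used for sEMG (mean absolute value, root mean square, and median frequency) on a synthetic signal window.

```python
# Illustrative only: common time- and frequency-domain sEMG features; the
# abstract names the analysis domains but not the specific features used.
import numpy as np


def mean_absolute_value(window: np.ndarray) -> float:
    return float(np.mean(np.abs(window)))


def root_mean_square(window: np.ndarray) -> float:
    return float(np.sqrt(np.mean(window ** 2)))


def median_frequency(window: np.ndarray, fs: float) -> float:
    """Frequency below which half of the window's spectral power lies."""
    power = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(window.size, d=1.0 / fs)
    cumulative = np.cumsum(power)
    return float(freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)])


# Hypothetical 200 ms window of sEMG sampled at 1 kHz.
fs = 1000.0
rng = np.random.default_rng(1)
window = rng.normal(scale=0.1, size=int(0.2 * fs))

print(mean_absolute_value(window), root_mean_square(window), median_frequency(window, fs))
```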
  5. Abstract Objective. Brain–machine interfaces (BMIs) have shown promise in extracting upper extremity movement intention from the thoughts of nonhuman primates and people with tetraplegia. Attempts to restore a user's own hand and arm function have employed functional electrical stimulation (FES), but most work has restored discrete grasps. Little is known about how well FES can control continuous finger movements. Here, we use a low-power brain-controlled functional electrical stimulation (BCFES) system to restore continuous volitional control of finger positions to a monkey with a temporarily paralyzed hand. Approach. We delivered a nerve block to the median, radial, and ulnar nerves just proximal to the elbow to simulate finger paralysis, then used a closed-loop BMI to predict finger movements the monkey was attempting to make in two tasks. The BCFES task was one-dimensional, with all fingers moving together, and we used the BMI's predictions to control FES of the monkey's finger muscles. The virtual two-finger task was two-dimensional, with the index finger moving simultaneously with, and independently from, the middle, ring, and small fingers, and we used the BMI's predictions to control movements of virtual fingers, with no FES. Main results. In the BCFES task, the monkey's success rate improved to 83% (1.5 s median acquisition time) when using the BCFES system during temporary paralysis, from 8.8% (9.5 s median acquisition time, equal to the trial timeout) when attempting to use his temporarily paralyzed hand. In one monkey performing the virtual two-finger task with no FES, we found that BMI performance (task success rate and completion time) could be completely recovered following temporary paralysis by executing recalibrated feedback-intention training one time. Significance. These results suggest that BCFES can restore continuous finger function during temporary paralysis using existing low-power technologies, and that brain control may not be the limiting factor in a BCFES neuroprosthesis.
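As an illustration of the trial metrics reported above (success rate and median target-acquisition time under a fixed trial timeout), the sketch below computes them for hypothetical acquisition times; the 9.5 s timeout is taken from the abstract, everything else is made up.

```python
# Illustrative only: summarizing trials by success rate and median acquisition
# time, treating trials that hit the timeout as failures.
import numpy as np

TIMEOUT_S = 9.5  # trial timeout mentioned in the abstract


def summarize_trials(acquisition_times_s: np.ndarray) -> tuple:
    """Return (success rate, median acquisition time in seconds)."""
    successes = acquisition_times_s < TIMEOUT_S
    success_rate = float(np.mean(successes))
    median_time = float(np.median(acquisition_times_s))
    return success_rate, median_time


# Hypothetical acquisition times for a block of trials (seconds).
times = np.array([1.2, 1.6, 1.4, 9.5, 2.0, 1.3, 1.5, 9.5, 1.7, 1.1])
rate, median_time = summarize_trials(times)
print(f"Success rate: {rate:.0%}, median acquisition time: {median_time:.1f} s")
```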