
Title: Concurrent Decoding of Finger Kinematic and Kinetic Variables based on Motor Unit Discharges
A reliable and functional neural interface is necessary to control individual finger movements of assistive robotic hands. Non-invasive surface electromyogram (sEMG) signals can be used to predict fingertip forces and joint kinematics continuously. However, concurrent prediction of kinematic and kinetic variables in a continuous manner remains a challenge. The purpose of this study was to develop a neural decoding algorithm capable of concurrently predicting fingertip forces and dynamic finger movements. High-density electromyogram (HD-EMG) signals were collected during finger flexion tasks performed with either the index or middle finger: isometric, dynamic, and combined tasks. Based on the data from the first two tasks, motor unit (MU) firing activities associated with individual fingers and tasks were derived using a blind source separation method. MUs assigned to the same task and finger were pooled together to form MU pools. Twenty MUs were then refined using EMG data of a combined trial. The refined MUs were applied to a testing dataset of the combined task and divided into five groups based on the similarity of their firing patterns, and the populational discharge frequency was determined for each group. Using the summed firing frequencies of the five MU groups in a multivariate linear regression model, fingertip forces and joint angles were derived concurrently. The decoding performance was compared with the conventional EMG-amplitude-based approach. For both joint angles and fingertip forces, the MU-based approach outperformed the EMG-amplitude approach, with a smaller prediction error (Force: 5.36±0.47 vs 6.89±0.39 %MVC, Joint Angle: 5.0±0.27° vs 12.76±0.40°) and a higher correlation (Force: 0.87±0.05 vs 0.73±0.1, Joint Angle: 0.92±0.05 vs 0.45±0.05) between the predicted and recorded motor output. The outcomes provide a functional and accurate neural interface for continuous control of assistive robotic hands.
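The decoding chain described above — binary MU spike trains pooled into groups, a populational discharge frequency per group, and a multivariate linear regression mapping the five group rates to force and joint angle — can be sketched roughly as follows. The sampling rate, smoothing window, and all variable names are illustrative assumptions, and synthetic data stands in for real HD-EMG decomposition results:

```python
import numpy as np

rng = np.random.default_rng(0)

def population_rate(spike_trains, fs=2048, win=0.4):
    """Populational discharge frequency (Hz) of one MU group: the binary
    spike indicators of all MUs in the group are summed, then smoothed
    with a moving window of `win` seconds (fs and win are assumed values)."""
    summed = spike_trains.sum(axis=0).astype(float)
    k = max(1, int(win * fs))
    return np.convolve(summed, np.ones(k), mode="same") / win

def fit_decoder(rates, targets):
    """Ordinary least squares for targets ≈ [rates | 1] @ W; the target
    columns could be fingertip force (%MVC) and a joint angle (deg)."""
    X = np.column_stack([rates, np.ones(len(rates))])
    W, *_ = np.linalg.lstsq(X, targets, rcond=None)
    return W

def predict(rates, W):
    return np.column_stack([rates, np.ones(len(rates))]) @ W

# Synthetic check: five group rates driving two outputs linearly.
rates = rng.random((500, 5)) * 20          # Hz, one column per MU group
true_W = rng.random((6, 2))                # 5 slopes + intercept, 2 outputs
targets = np.column_stack([rates, np.ones(500)]) @ true_W
W_hat = fit_decoder(rates, targets)
pred = predict(rates, W_hat)
```

On noiseless synthetic data the least-squares fit recovers the generating weights exactly; with real motor output the same regression yields the concurrent force and angle estimates the abstract reports.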
Award ID(s):
1847319 2330862
Author(s) / Creator(s):
Date Published:
Journal Name:
2022 IEEE 3rd International Conference on Human-Machine Systems (ICHMS)
Page Range / eLocation ID:
1 to 4
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Objective: Robust neural decoding of intended motor output is crucial for intuitive control of assistive devices, such as robotic hands, to perform daily tasks. Few existing neural decoders can predict kinetic and kinematic variables simultaneously. The current study developed a continuous neural decoding approach that can concurrently predict fingertip forces and joint angles of multiple fingers. Methods: We obtained motoneuron firing activities by decomposing high-density electromyogram (HD-EMG) signals of the extrinsic finger muscles. The identified motoneurons were first grouped and then refined for each finger (index or middle) and task (finger force or dynamic movement) combination. The refined motoneuron groups (separation matrices) were then applied directly, in real time, to new EMG data involving both finger force and dynamic movement tasks produced by both fingers. EMG-amplitude-based prediction was performed as a comparison. Results: The newly developed decoding approach outperformed the EMG-amplitude method for both finger force and joint angle estimation, with a lower prediction error (Force: 3.47±0.43 vs 6.64±0.69 %MVC, Joint Angle: 5.40±0.50° vs 12.8±0.65°) and a higher correlation (Force: 0.75±0.02 vs 0.66±0.05, Joint Angle: 0.94±0.01 vs 0.5±0.05) between the estimated and recorded motor output. The performance was also consistent for both fingers. Conclusion: The developed neural decoding algorithm allowed us to accurately and concurrently predict finger forces and joint angles of multiple fingers in real time. Significance: Our approach can enable intuitive interactions with assistive robotic hands, and allow the performance of dexterous hand skills involving both force control and dynamic movement control tasks.
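The real-time step above — applying previously refined separation matrices directly to new EMG — can be illustrated with a toy instantaneous mixture. A real convolutive EMG model would use a larger extension factor and filters learned by decomposition; the mixing matrix, threshold, and extension factor here are assumed simplifications:

```python
import numpy as np

rng = np.random.default_rng(1)

def extend(emg, r):
    """Stack r delayed copies of every channel, the standard trick that
    turns the convolutive EMG mixture into an instantaneous one."""
    n_ch, n = emg.shape
    ext = np.zeros((n_ch * r, n))
    for d in range(r):
        ext[d * n_ch:(d + 1) * n_ch, d:] = emg[:, :n - d]
    return ext

def apply_separation(emg_new, B, r=1, thresh=0.5):
    """Project new EMG through a previously learned separation matrix B
    (one row per MU filter) and call a discharge wherever the squared,
    per-source-normalized output crosses `thresh`. Because B is reused
    rather than re-estimated, this step is cheap enough for real time."""
    s = B @ extend(emg_new, r)
    s2 = s ** 2
    s2 /= s2.max(axis=1, keepdims=True) + 1e-12
    return s2 > thresh

# Toy instantaneous mixture (r=1): two MUs observed on four channels.
spikes = np.zeros((2, 200))
spikes[0, [20, 80, 150]] = 1.0
spikes[1, [50, 120]] = 1.0
A = rng.random((4, 2)) + 0.5               # mixing (MU action potentials)
B = np.linalg.pinv(A)                      # previously "learned" separation
fired = apply_separation(A @ spikes, B, r=1)
```

In this noiseless toy case the recovered binary firing matrix matches the planted spike trains exactly.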
  2.
    A loss of individuated finger movement affects critical aspects of daily activities. There is a need to develop neural-machine interface techniques that can continuously decode single finger movements. In this preliminary study, we evaluated a novel decoding method that used finger-specific motoneuron firing frequency to estimate joint kinematics and fingertip forces. High-density electromyogram (EMG) signals were obtained while the index or middle finger produced either dynamic flexion movements or isometric flexion forces. A source separation method was used to extract motor unit (MU) firing activities from a single trial. A separate validation trial was used to retain only the MUs associated with a particular finger. The finger-specific MU firing activities were then used to estimate individual finger joint angles and isometric forces in a third trial using a regression method. Our results showed that the MU-firing-based approach led to smaller prediction errors for both joint angles and forces compared with the conventional EMG-amplitude-based method. The outcomes can help develop intuitive neural-machine interface techniques that allow continuous single-finger control of robotic hands. In addition, the previously obtained MU separation information was applied directly to new data; it is therefore possible to enable online extraction of MU firing activities for real-time neural-machine interactions.
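The validation step — retaining only the MUs associated with a particular finger — might look like the following sketch, where each MU is kept for the finger whose recorded output its smoothed firing rate best correlates with. The correlation threshold, window length, and toy signals are assumptions, not values from the study:

```python
import numpy as np

def smooth_rate(spikes, fs, win=0.4):
    """Moving-average firing rate (Hz) of one MU's binary spike train."""
    k = max(1, int(win * fs))
    return np.convolve(spikes.astype(float), np.ones(k), mode="same") / win

def assign_mus(spike_trains, finger_outputs, fs, min_r=0.5):
    """Keep an MU only for the finger whose recorded output (force or
    joint angle) its smoothed rate correlates with best, and only if
    that correlation clears `min_r` (the threshold value is a guess).
    Returns {finger index: [MU indices]}."""
    pools = {f: [] for f in range(len(finger_outputs))}
    for m, spikes in enumerate(spike_trains):
        rate = smooth_rate(spikes, fs)
        rs = [np.corrcoef(rate, out)[0, 1] for out in finger_outputs]
        best = int(np.argmax(rs))
        if rs[best] >= min_r:
            pools[best].append(m)
    return pools

# Toy validation trial: two fingers tracing phase-shifted sinusoids,
# each driven by one MU that fires near that finger's output peaks.
fs = 100
t = np.arange(1000) / fs
out0 = (np.sin(2 * np.pi * 0.5 * t) + 1) / 2
out1 = (np.cos(2 * np.pi * 0.5 * t) + 1) / 2
idx = np.arange(1000)
mu0 = ((out0 > 0.6) & (idx % 5 == 0)).astype(float)
mu1 = ((out1 > 0.6) & (idx % 5 == 0)).astype(float)
pools = assign_mus(np.vstack([mu0, mu1]), np.vstack([out0, out1]), fs)
```

Each toy MU lands in the pool of the finger it was built to track, which is the behavior the validation trial is meant to enforce.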
  3.
    Objective: A reliable neural-machine interface offers the possibility of controlling advanced robotic hands with high dexterity. The objective of this study was to develop a decoding method to estimate flexion and extension forces of individual fingers concurrently. Methods: First, motor unit (MU) firing information was identified through surface electromyogram (EMG) decomposition, and the MUs were further categorized into different pools for the flexion and extension of individual fingers via a refinement procedure. The MU firing rate at the populational level was calculated, and the individual finger forces were then estimated via a bivariate linear regression model (neural-drive method). The conventional EMG-amplitude-based method was used as a comparison. Results: Our results showed that the neural-drive method had significantly better performance (lower estimation error and higher correlation) than the conventional method. Conclusion: Our approach provides a reliable neural decoding method for dexterous finger movements. Significance: Further exploration of our method can potentially provide a robust neural-machine interface for intuitive control of robotic hands.
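The conventional EMG-amplitude method used as the comparison in these studies is typically a moving RMS envelope fed into the same regression. A minimal sketch of the envelope feature, with an assumed window length and synthetic amplitude-modulated noise standing in for real EMG:

```python
import numpy as np

rng = np.random.default_rng(2)

def rms_envelope(emg, fs, win=0.1):
    """Conventional amplitude feature: moving root-mean-square of each
    EMG channel over a `win`-second window (window length is assumed)."""
    k = max(1, int(win * fs))
    kernel = np.ones(k) / k
    sq = np.atleast_2d(emg) ** 2
    return np.sqrt(np.array([np.convolve(ch, kernel, mode="same")
                             for ch in sq]))

# Amplitude-modulated white noise: the envelope should track the ramp.
fs = 1000
mod = np.linspace(0.0, 1.0, 4 * fs)        # 4 s linear amplitude ramp
emg = mod * rng.standard_normal(4 * fs)
env = rms_envelope(emg, fs)[0]
r = np.corrcoef(env, mod)[0, 1]
```

The envelope follows the modulation closely; its limitation, per the abstracts above, is that amplitude mixes contributions from neighboring muscles, which the MU-based decoding avoids.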
  4. Background: Myoelectric-based decoding has gained popularity in upper-limb neural-machine interfaces. Motor unit (MU) firings decomposed from surface electromyographic (EMG) signals can represent motor intent, but EMG properties can change across arm configurations due to electrode shift and differing neuromuscular states. This study investigated whether isometric fingertip force estimation using MU firings is robust to forearm rotations from a neutral to either a fully pronated or supinated posture. Methods: We extracted MU information from high-density EMG of the extensor digitorum communis in two ways: (1) decomposed EMG in all three postures (MU-AllPost); and (2) decomposed EMG in the neutral posture (MU-Neu), with the extracted MUs (separation matrix) applied to the other postures. Populational MU firing frequency was used to estimate forces, scaled to each subject's maximum voluntary contraction (MVC), via regression analysis. The results were compared with the conventional EMG-amplitude method. Results: We found largely similar root-mean-square errors (RMSE) for the two MU methods, indicating that MU decomposition was robust to postural differences. The MU methods demonstrated lower RMSE in the ring (EMG = 6.23, MU-AllPost = 5.72, MU-Neu = 5.64 %MVC) and pinky (EMG = 6.12, MU-AllPost = 4.95, MU-Neu = 5.36 %MVC) fingers, with mixed results in the middle finger (EMG = 5.47, MU-AllPost = 5.52, MU-Neu = 6.19 %MVC). Conclusion: Our results suggest that MU firings can be extracted reliably with little influence from forearm posture, highlighting their potential as an alternative decoding scheme for robust and continuous control of assistive devices.
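The RMSE values quoted in these abstracts are expressed in %MVC, alongside Pearson correlation between estimated and recorded output. A minimal sketch of both metrics (the 50 N MVC and 2 N constant bias in the demo are made-up numbers chosen so the error is exactly 4 %MVC):

```python
import numpy as np

def rmse_pct_mvc(pred, actual, mvc):
    """Root-mean-square error of a force estimate, expressed in %MVC."""
    err = np.asarray(pred) - np.asarray(actual)
    return float(np.sqrt(np.mean(err ** 2)) / mvc * 100.0)

def pearson_r(pred, actual):
    """Correlation between estimated and recorded motor output."""
    return float(np.corrcoef(pred, actual)[0, 1])

# A constant 2 N bias against a 50 N MVC gives exactly 4 %MVC RMSE,
# while leaving the correlation at 1 (the bias does not affect shape).
actual = np.linspace(0.0, 40.0, 100)
pred = actual + 2.0
```

Note that a pure offset inflates RMSE but not correlation, which is why the studies report both measures.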
  5.
    A reliable neural-machine interface is essential for humans to intuitively interact with advanced robotic hands in an unconstrained environment. Existing neural decoding approaches utilize either discrete hand-gesture-based pattern recognition or continuous force decoding of one finger at a time. We developed a neural decoding technique that allowed continuous and concurrent prediction of forces of different fingers based on spinal motoneuron firing information. High-density skin-surface electromyogram (HD-EMG) signals of the finger extensor muscle were recorded while human participants produced isometric flexion forces in a dexterous manner (i.e., varying forces using either a single finger or multiple fingers concurrently). Motoneuron firing information was extracted from the EMG signals using a blind source separation technique, and each identified neuron was further classified as associated with a given finger. The forces of individual fingers were then predicted concurrently using the firing frequency of each finger's corresponding motoneuron pool. Compared with conventional approaches, our technique led to better prediction performance, i.e., a higher correlation ([Formula: see text] versus [Formula: see text]), a lower prediction error ([Formula: see text]% MVC versus [Formula: see text]% MVC), and a higher accuracy in finger state (rest/active) prediction ([Formula: see text]% versus [Formula: see text]%). Our decoding method demonstrated the possibility of classifying motoneurons for different fingers, which significantly alleviated the cross-talk issue of EMG recordings from neighboring hand muscles and allowed the decoding of finger forces individually and concurrently. The outcomes offered a robust neural-machine interface that could allow users to intuitively control robotic hands in a dexterous manner.
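The finger rest/active state prediction mentioned in the last abstract could be sketched as a simple threshold on each finger pool's populational firing rate; the 2 Hz threshold and toy data below are assumptions for illustration, not the paper's method:

```python
import numpy as np

def finger_states(pool_rates, thresh_hz=2.0):
    """Call each finger 'active' whenever its MU-pool populational firing
    rate exceeds a small threshold (2 Hz is an assumed value)."""
    return np.asarray(pool_rates) > thresh_hz

def state_accuracy(pred_states, true_states):
    """Fraction of time samples where the rest/active call is correct."""
    return float(np.mean(pred_states == true_states))

# Two fingers, five samples; one deliberate mismatch -> 90% accuracy.
rates = np.array([[0.0, 0.0, 5.0, 6.0, 0.0],
                  [3.0, 4.0, 0.0, 0.0, 0.0]])
true = np.array([[True, False, True, True, False],
                 [True, True, False, False, False]])
pred = finger_states(rates)
```

Because each pool is finger-specific, this per-finger thresholding sidesteps the cross-talk that would confound an amplitude threshold on raw EMG from neighboring muscles.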