

Title: Concurrent Prediction of Finger Forces Based on Source Separation and Classification of Neuron Discharge Information
A reliable neural-machine interface is essential for humans to intuitively interact with advanced robotic hands in an unconstrained environment. Existing neural decoding approaches utilize either discrete hand gesture-based pattern recognition or continuous force decoding with one finger at a time. We developed a neural decoding technique that allowed continuous and concurrent prediction of the forces of different fingers based on spinal motoneuron firing information. High-density skin-surface electromyogram (HD-EMG) signals of the finger extensor muscles were recorded while human participants produced isometric flexion forces in a dexterous manner (i.e., producing varying forces using either a single finger or multiple fingers concurrently). Motoneuron firing information was extracted from the EMG signals using a blind source separation technique, and each identified neuron was further classified as associated with a given finger. The forces of the individual fingers were then predicted concurrently using the firing frequency of the motoneuron pool of each finger. Compared with conventional approaches, our technique led to better prediction performance, i.e. a higher correlation ([Formula: see text] versus [Formula: see text]), a lower prediction error ([Formula: see text]% MVC versus [Formula: see text]% MVC), and a higher accuracy in finger state (rest/active) prediction ([Formula: see text]% versus [Formula: see text]%). Our decoding method demonstrated the possibility of classifying motoneurons for different fingers, which significantly alleviated the cross-talk issue of EMG recordings from neighboring hand muscles and allowed the decoding of finger forces individually and concurrently. The outcomes offer a robust neural-machine interface that could allow users to intuitively control robotic hands in a dexterous manner.
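As a rough illustration of the decoding pipeline described above, the sketch below assumes that blind source separation and per-finger neuron classification have already produced binary spike trains grouped by finger. The function names, the 200 ms smoothing window, and the use of scikit-learn's LinearRegression are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def pool_firing_rate(spike_trains, fs, win_s=0.2):
    """Mean smoothed firing rate (Hz) of one finger's motoneuron pool.

    spike_trains: (n_neurons, n_samples) binary array from EMG decomposition.
    fs: sampling rate of the spike trains in Hz.
    """
    pooled = spike_trains.sum(axis=0).astype(float)      # spikes summed across neurons
    kernel = np.ones(int(win_s * fs)) / win_s            # moving window -> spikes per second
    return np.convolve(pooled, kernel, mode="same") / spike_trains.shape[0]

def fit_force_decoders(spike_trains_by_finger, force_by_finger, fs):
    """Fit one linear force decoder per finger from its pool firing rate."""
    decoders = {}
    for finger, trains in spike_trains_by_finger.items():
        rate = pool_firing_rate(trains, fs).reshape(-1, 1)
        decoders[finger] = LinearRegression().fit(rate, force_by_finger[finger])
    return decoders
```

Because each finger has its own motoneuron pool and decoder, the fitted models can be evaluated concurrently on new data to produce simultaneous force estimates for all fingers.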
Award ID(s): 1847319
NSF-PAR ID: 10220192
Journal Name: International Journal of Neural Systems
ISSN: 0129-0657
Page Range / eLocation ID: 2150010
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Objective: Robust neural decoding of intended motor output is crucial to enable intuitive control of assistive devices, such as robotic hands, to perform daily tasks. Few existing neural decoders can predict kinetic and kinematic variables simultaneously. The current study developed a continuous neural decoding approach that can concurrently predict fingertip forces and joint angles of multiple fingers. Methods: We obtained motoneuron firing activities by decomposing high-density electromyogram (HD-EMG) signals of the extrinsic finger muscles. The identified motoneurons were first grouped and then refined specific to each finger (index or middle) and task (finger force and dynamic movement) combination. The refined motoneuron groups (separation matrices) were then applied directly to new EMG data in real time, involving both finger force and dynamic movement tasks produced by both fingers. EMG-amplitude-based prediction was also performed for comparison. Results: We found that the newly developed decoding approach outperformed the EMG-amplitude method for both finger force and joint angle estimation, with a lower prediction error (Force: 3.47±0.43 vs 6.64±0.69% MVC, Joint Angle: 5.40±0.50° vs 12.8±0.65°) and a higher correlation (Force: 0.75±0.02 vs 0.66±0.05, Joint Angle: 0.94±0.01 vs 0.5±0.05) between the estimated and recorded motor output. The performance was also consistent for both fingers. Conclusion: The developed neural decoding algorithm allowed us to accurately and concurrently predict finger forces and joint angles of multiple fingers in real time. Significance: Our approach can enable intuitive interactions with assistive robotic hands, and allow the performance of dexterous hand skills involving both force control tasks and dynamic movement control tasks.
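A minimal sketch of the real-time step described in this abstract, assuming a separation matrix learned offline is applied to a window of extended (time-delayed) EMG to recover motor unit source activity. The variable names and the simple energy-threshold spike detection are assumptions made for illustration, not the authors' exact decomposition routine.

```python
import numpy as np

def extract_mu_firings(extended_emg, sep_matrix, thresholds):
    """Apply a previously learned separation matrix to new (extended) EMG.

    extended_emg: (n_extended_channels, n_samples) time-delayed EMG window
    sep_matrix:   (n_MUs, n_extended_channels) rows learned offline, one per motor unit
    thresholds:   (n_MUs,) per-unit detection thresholds
    Returns a (n_MUs, n_samples) boolean array of firing indicators.
    """
    sources = sep_matrix @ extended_emg     # estimated motor unit source signals
    energy = sources ** 2                   # emphasize discharge peaks
    return energy > thresholds[:, None]
```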
  2. A reliable and functional neural interface is necessary to control individual finger movements of assistive robotic hands. Non-invasive surface electromyogram (sEMG) can be used to predict fingertip forces and joint kinematics continuously. However, concurrent prediction of kinematic and dynamic variables in a continuous manner remains a challenge. The purpose of this study was to develop a neural decoding algorithm capable of concurrent prediction of fingertip forces and finger dynamic movements. High-density electromyogram (HD-EMG) signals were collected during finger flexion tasks using either the index or middle finger: isometric, dynamic, and combined tasks. Based on the data obtained from the first two tasks, motor unit (MU) firing activities associated with individual fingers and tasks were derived using a blind source separation method. MUs assigned to the same tasks and fingers were pooled together to form MU pools. Twenty MUs were then refined using EMG data of a combined trial. The refined MUs were applied to a testing dataset of the combined task and were divided into five groups based on the similarity of their firing patterns, and the populational discharge frequency was determined for each group. Using the summated firing frequencies obtained from the five groups of MUs in a multivariate linear regression model, fingertip forces and joint angles were derived concurrently. The decoding performance was compared to the conventional EMG-amplitude-based approach. For both joint angles and fingertip forces, the MU-based approach outperformed the EMG-amplitude approach with a smaller prediction error (Force: 5.36±0.47 vs 6.89±0.39 %MVC, Joint Angle: 5.0±0.27° vs 12.76±0.40°) and a higher correlation (Force: 0.87±0.05 vs 0.73±0.1, Joint Angle: 0.92±0.05 vs 0.45±0.05) between the predicted and recorded motor output. The outcomes provide a functional and accurate neural interface for continuous control of assistive robotic hands.
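The grouping and regression steps in this abstract could look roughly like the following, where MUs are clustered into five groups by the similarity of their smoothed firing patterns and the group-wise summed rates feed a multivariate linear regression. The correlation-distance clustering and the scikit-learn estimators are stand-ins chosen for illustration rather than the study's exact procedure.

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.linear_model import LinearRegression

def group_and_decode(smoothed_rates, targets, n_groups=5):
    """smoothed_rates: (n_MUs, n_samples) smoothed per-MU firing rates;
    targets: (n_samples, n_outputs) recorded forces and/or joint angles."""
    # Group MUs by similarity of their firing patterns (correlation distance).
    dist = 1.0 - np.corrcoef(smoothed_rates)
    labels = AgglomerativeClustering(n_clusters=n_groups, metric="precomputed",
                                     linkage="average").fit_predict(dist)
    # Summate firing within each group -> (n_samples, n_groups) predictors.
    X = np.vstack([smoothed_rates[labels == g].sum(axis=0)
                   for g in range(n_groups)]).T
    # Multivariate linear regression from group rates to motor output.
    return LinearRegression().fit(X, targets)
```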
  3. A loss of individuated finger movement affects critical aspects of daily activities. There is a need to develop neural-machine interface techniques that can continuously decode single finger movements. In this preliminary study, we evaluated a novel decoding method that used finger-specific motoneuron firing frequency to estimate joint kinematics and fingertip forces. High-density electromyogram (EMG) signals were obtained while the index or middle finger produced either dynamic flexion movements or isometric flexion forces. A source separation method was used to extract motor unit (MU) firing activities from a single trial. A separate validation trial was used to retain only the MUs associated with a particular finger. The finger-specific MU firing activities were then used to estimate individual finger joint angles and isometric forces in a third trial using a regression method. Our results showed that the MU-firing-based approach led to smaller prediction errors for both joint angles and forces compared with the conventional EMG-amplitude-based method. The outcomes can help develop intuitive neural-machine interface techniques that allow continuous single-finger-level control of robotic hands. In addition, the previously obtained MU separation information was applied directly to new data; it is therefore possible to extract MU firing activities online for real-time neural-machine interactions.
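A sketch of the finger-specific retention step mentioned above, assuming the MUs kept are those whose smoothed firing rate correlates with the target finger's recorded output during the validation trial. The 0.5 correlation threshold is an assumed value, not one reported in the abstract.

```python
import numpy as np

def retain_finger_specific_mus(smoothed_rates, finger_output, min_corr=0.5):
    """Keep only the MUs whose smoothed firing rate tracks the validation-trial
    output (force or joint angle) of the target finger.

    smoothed_rates: (n_MUs, n_samples); finger_output: (n_samples,)
    Returns the indices of the retained MUs.
    """
    keep = []
    for i, rate in enumerate(smoothed_rates):
        r = np.corrcoef(rate, finger_output)[0, 1]
        if np.isfinite(r) and r >= min_corr:
            keep.append(i)
    return keep
```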
  4. Objective: A reliable neural-machine interface offers the possibility of controlling advanced robotic hands with high dexterity. The objective of this study was to develop a decoding method to estimate flexion and extension forces of individual fingers concurrently. Methods: First, motor unit (MU) firing information was identified through surface electromyogram (EMG) decomposition, and the MUs were further categorized into different pools for the flexion and extension of individual fingers via a refinement procedure. The MU firing rate at the populational level was calculated, and the individual finger forces were then estimated via a bivariate linear regression model (neural-drive method). A conventional EMG-amplitude-based method was used as a comparison. Results: Our results showed that the neural-drive method had a significantly better performance (lower estimation error and higher correlation) compared with the conventional method. Conclusion: Our approach provides a reliable neural decoding method for dexterous finger movements. Significance: Further exploration of our method can potentially provide a robust neural-machine interface for intuitive control of robotic hands.
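The bivariate regression described in the Methods could be sketched as below, with the populational firing rates of a finger's flexion and extension MU pools as the two predictors. The function and variable names are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def fit_bivariate_neural_drive(flexor_rate, extensor_rate, finger_force):
    """flexor_rate, extensor_rate: (n_samples,) populational firing rates of one
    finger's flexion and extension MU pools; finger_force: recorded force (% MVC)."""
    X = np.column_stack([flexor_rate, extensor_rate])   # two neural-drive predictors
    return LinearRegression().fit(X, finger_force)
```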
  5. Neuromuscular injuries can impair hand function and profoundly impact quality of life, which has motivated the development of advanced assistive robotic hands. However, current neural decoder systems are limited in their ability to provide dexterous control of these robotic hands. In this study, we propose a novel method for concurrently predicting the extension and flexion forces of three individual fingers using high-density electromyogram (HD-EMG) signals. Our method employs two deep forest models, a flexor decoder and an extensor decoder, to extract relevant representations from the EMG amplitude features. The outputs of the two decoders are integrated through linear regression to predict the forces of the three fingers. The proposed method was evaluated on data from three subjects, and the results showed that it consistently outperforms the conventional EMG-amplitude-based approach in terms of prediction error and robustness across both target and non-target fingers. This work presents a promising neural decoding approach for intuitive and dexterous control of the fingertip forces of assistive robotic hands.
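A rough sketch of the two-decoder arrangement described in this abstract, using scikit-learn's RandomForestRegressor as a stand-in for the deep forest models and a linear regression to integrate the two decoders' outputs. Feature shapes, estimator settings, and function names are assumptions, not the study's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

def fit_two_decoder_pipeline(flexor_feats, extensor_feats, finger_forces):
    """flexor_feats, extensor_feats: (n_samples, n_features) EMG amplitude features
    from the flexor and extensor electrode grids; finger_forces: (n_samples, 3)."""
    # Stand-in decoders for the flexor and extensor deep forest models.
    flex_dec = RandomForestRegressor(n_estimators=200, random_state=0)
    ext_dec = RandomForestRegressor(n_estimators=200, random_state=0)
    flex_dec.fit(flexor_feats, finger_forces)
    ext_dec.fit(extensor_feats, finger_forces)
    # Integrate the two decoders' outputs with a linear regression.
    stacked = np.hstack([flex_dec.predict(flexor_feats),
                         ext_dec.predict(extensor_feats)])
    integrator = LinearRegression().fit(stacked, finger_forces)
    return flex_dec, ext_dec, integrator
```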