This content will become publicly available on June 7, 2024

Title: The impact of task context on predicting finger movements in a brain-machine interface
A key factor in the clinical translation of brain-machine interfaces (BMIs) for restoring hand motor function will be their robustness to changes in a task. With functional electrical stimulation (FES) for example, the patient’s own hand will be used to produce a wide range of forces in otherwise similar movements. To investigate the impact of task changes on BMI performance, we trained two rhesus macaques to control a virtual hand with their physical hand while we added springs to each finger group (index or middle-ring-small) or altered their wrist posture. Using simultaneously recorded intracortical neural activity, finger positions, and electromyography, we found that decoders trained in one context did not generalize well to other contexts, leading to significant increases in prediction error, especially for muscle activations. However, with respect to online BMI control of the virtual hand, changing either the decoder training task context or the hand’s physical context during online control had little effect on online performance. We explain this dichotomy by showing that the structure of neural population activity remained similar in new contexts, which could allow for fast adjustment online. Additionally, we found that neural activity trajectories shifted in proportion to the required muscle activation in new contexts. This shift possibly explains the biases in off-context kinematic predictions and suggests a feature that could help predict muscle activations of different magnitudes while producing similar kinematics.
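The cross-context evaluation described above (train a decoder in one context, test it in another) can be illustrated with a minimal sketch. The ridge decoder, the synthetic data, and the additive "spring" bias below are all illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_ridge(X, Y, lam=1.0):
    # Ridge-regression decoder: kinematics ≈ X @ W
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Synthetic "neural activity" (100 channels) and finger kinematics (2 DoF)
W_true = rng.normal(size=(100, 2))
X_a = rng.normal(size=(500, 100))            # context A (e.g. no spring)
Y_a = X_a @ W_true
X_b = rng.normal(size=(500, 100))
Y_b = X_b @ W_true + 0.5                     # context B: assumed additive shift

W = fit_ridge(X_a, Y_a)                      # train in context A only
mse_within = np.mean((X_a @ W - Y_a) ** 2)   # same-context error
mse_across = np.mean((X_b @ W - Y_b) ** 2)   # off-context error (biased)
print(mse_within, mse_across)
```

Because the decoder never sees context B's offset, its off-context predictions carry a systematic bias, mirroring the biased off-context kinematic predictions reported above.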
Award ID(s):
1926576
NSF-PAR ID:
10450462
Journal Name:
eLife
Volume:
12
ISSN:
2050-084X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Objective. While brain–machine interfaces (BMIs) are promising technologies that could provide direct pathways for controlling the external world and thus regaining motor capabilities, their effectiveness is hampered by decoding errors. Previous research has demonstrated the detection and correction of BMI outcome errors, which occur at the end of trials. Here we focus on continuous detection and correction of BMI execution errors, which occur during real-time movements. Approach. Two adult male rhesus macaques were implanted with Utah arrays in the motor cortex. The monkeys performed single or two-finger-group BMI tasks where a Kalman filter decoded binned spiking-band power into intended finger kinematics. Neural activity was analyzed to determine how it depends not only on the kinematics of the fingers, but also on the distance of each finger group to its target. We developed a method to detect erroneous movements, i.e. consistent movements away from the target, from the same neural activity used by the Kalman filter. Detected errors were corrected by a simple stopping strategy, and the effect on performance was evaluated. Main results. First, we show that including distance to target explains significantly more variance of the recorded neural activity. Then, for the first time, we demonstrate that neural activity in motor cortex can be used to detect execution errors during BMI-controlled movements. Keeping the false positive rate below 5%, it was possible to achieve a mean true positive rate of 28.1% online. Despite requiring 200 ms to detect and react to suspected errors, we were able to achieve a significant improvement in task performance via reduced orbiting time of one finger group. Significance. Neural activity recorded in motor cortex for BMI control can be used to detect and correct BMI errors and thus to improve performance. Further improvements may be obtained by enhancing classification and correction strategies.
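A Kalman filter of the kind described above recursively combines a kinematic state model with binned neural observations. This is a generic textbook sketch; the matrices and observations below are toy assumptions, not the fitted decoder:

```python
import numpy as np

# Toy Kalman filter decoding binned neural features into 1-D finger
# [position, velocity]; all matrices are illustrative, not fitted values.
A = np.array([[1.0, 0.05],      # state transition: pos += vel * dt
              [0.0, 1.0]])
W = np.diag([1e-4, 1e-2])       # process noise covariance
C = np.array([[0.0, 2.0],       # observation model: channel 1 tracks velocity,
              [1.0, 0.0]])      # channel 2 tracks position
Q = np.eye(2) * 0.1             # observation noise covariance

x = np.zeros(2)                 # state estimate [position, velocity]
P = np.eye(2)                   # state covariance
for y in np.array([[0.2, 0.0], [0.4, 0.01], [0.6, 0.03]]):  # binned features
    # Predict step
    x = A @ x
    P = A @ P @ A.T + W
    # Update step with neural observation y
    K = P @ C.T @ np.linalg.inv(C @ P @ C.T + Q)   # Kalman gain
    x = x + K @ (y - C @ x)
    P = (np.eye(2) - K @ C) @ P
print(x)  # decoded [position, velocity] after three bins
```

The error detector described above reads the same binned neural features that feed this filter, so no extra recording hardware is needed.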

     
  2. Abstract Objective. Brain–machine interfaces (BMIs) have shown promise in extracting upper extremity movement intention from the thoughts of nonhuman primates and people with tetraplegia. Attempts to restore a user’s own hand and arm function have employed functional electrical stimulation (FES), but most work has restored discrete grasps. Little is known about how well FES can control continuous finger movements. Here, we use a low-power brain-controlled functional electrical stimulation (BCFES) system to restore continuous volitional control of finger positions to a monkey with a temporarily paralyzed hand. Approach. We delivered a nerve block to the median, radial, and ulnar nerves just proximal to the elbow to simulate finger paralysis, then used a closed-loop BMI to predict finger movements the monkey was attempting to make in two tasks. The BCFES task was one-dimensional: all fingers moved together, and we used the BMI’s predictions to control FES of the monkey’s finger muscles. The virtual two-finger task was two-dimensional: the index finger moved simultaneously with, and independently from, the middle, ring, and small fingers, and we used the BMI’s predictions to control movements of virtual fingers, with no FES. Main results. In the BCFES task, the monkey improved his success rate to 83% (1.5 s median acquisition time) when using the BCFES system during temporary paralysis, up from 8.8% (9.5 s median acquisition time, equal to the trial timeout) when attempting to use his temporarily paralyzed hand. In one monkey performing the virtual two-finger task with no FES, we found that BMI performance (task success rate and completion time) could be completely recovered following temporary paralysis by executing recalibrated feedback-intention training one time. Significance. These results suggest that BCFES can restore continuous finger function during temporary paralysis using existing low-power technologies, and that brain control may not be the limiting factor in a BCFES neuroprosthesis.
  3. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR. 
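The neural-network force decoder described above can be caricatured with a tiny one-hidden-layer network trained by gradient descent. The architecture, synthetic "EMG" features, and target mapping below are all assumptions for illustration, not the interface's actual model or data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for forearm EMG features (8 channels) and
# finger-wise forces (5 fingers); purely illustrative, not real data.
X = rng.normal(size=(400, 8))
W_true = rng.normal(size=(8, 5))
Y = np.tanh(X @ W_true)                     # assumed nonlinear EMG-to-force map

# One-hidden-layer network trained with plain full-batch gradient descent.
W1 = rng.normal(size=(8, 16)) * 0.3
W2 = rng.normal(size=(16, 5)) * 0.3
lr = 0.1
for _ in range(2000):
    H = np.tanh(X @ W1)                     # hidden layer activations
    pred = H @ W2                           # finger-wise force estimates
    err = pred - Y
    # Backpropagate mean-squared-error gradients
    gW2 = H.T @ err / len(X)
    gH = err @ W2.T * (1 - H ** 2)          # tanh derivative
    gW1 = X.T @ gH / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2

mse = np.mean((np.tanh(X @ W1) @ W2 - Y) ** 2)
print(mse)
```

In the real interface, per-user calibration would amount to briefly fine-tuning such weights on a new user's EMG, which is why little calibration data is needed.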
  4. Separating neural signals from noise can improve brain-computer interface performance and stability. However, most algorithms for separating neural action potentials from noise are not suitable for use in real time and have shown mixed effects on decoding performance. With the goal of removing noise that impedes online decoding, we sought to automate the intuition of human spike-sorters to operate in real time with an easily tunable parameter governing the stringency with which spike waveforms are classified. We trained an artificial neural network with one hidden layer on neural waveforms that were hand-labeled as either spikes or noise. The network output was a likelihood metric for each waveform it classified, and we tuned the network’s stringency by varying the minimum likelihood value for a waveform to be considered a spike. Using the network’s labels to exclude noise waveforms, we decoded remembered target location during a memory-guided saccade task from electrode arrays implanted in prefrontal cortex of rhesus macaque monkeys. The network classified waveforms in real time, and its classifications were qualitatively similar to those of a human spike-sorter. Compared with decoding with threshold crossings, in most sessions we improved decoding performance by removing waveforms with low spike likelihood values. Furthermore, decoding with our network’s classifications became more beneficial as time since array implantation increased. Our classifier serves as a feasible preprocessing step, with little risk of harm, that could be applied to both off-line neural data analyses and online decoding. NEW & NOTEWORTHY Although there are many spike-sorting methods that isolate well-defined single units, these methods typically involve human intervention and have inconsistent effects on decoding. We used human classified neural waveforms as training data to create an artificial neural network that could be tuned to separate spikes from noise that impaired decoding. 
We found that this network operated in real time and was suitable for both off-line data processing and online decoding. 
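The key design idea above, a likelihood score with a tunable stringency threshold, can be sketched without the trained network. Here a template-correlation score passed through a sigmoid stands in for the network's likelihood output; the waveform model and score are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 32-sample waveforms: "spikes" share a template, "noise" does not.
t = np.linspace(0, 1, 32)
template = -np.exp(-((t - 0.3) ** 2) / 0.005)          # canonical spike shape
spikes = template + rng.normal(scale=0.3, size=(200, 32))
noise = rng.normal(scale=0.5, size=(200, 32))

def likelihood(w):
    # Template correlation mapped through a sigmoid to a [0, 1] spike score;
    # a stand-in for the trained network's likelihood output.
    score = w @ template / (np.linalg.norm(template) ** 2)
    return 1.0 / (1.0 + np.exp(-6.0 * (score - 0.5)))

results = {}
for thresh in (0.3, 0.7):                               # stringency parameter
    tpr = float(np.mean(likelihood(spikes) >= thresh))  # true positive rate
    fpr = float(np.mean(likelihood(noise) >= thresh))   # false positive rate
    results[thresh] = (tpr, fpr)
    print(f"threshold {thresh}: TPR {tpr:.2f}, FPR {fpr:.2f}")
```

Raising the minimum likelihood trades sensitivity for purity, which is the single easily tunable parameter the classifier exposes.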
  5. Physical interaction with tools is ubiquitous in functional activities of daily living. While tool use is considered a hallmark of human behavior, how humans control such physical interactions is still poorly understood. When humans perform a motor task, it is commonly suggested that the central nervous system coordinates the musculo-skeletal system to minimize muscle effort. In this paper, we tested if this notion holds true for motor tasks that involve physical interaction. Specifically, we investigated whether humans minimize muscle forces to control physical interaction with a circular kinematic constraint. Using a simplified arm model, we derived three predictions for how humans should behave if they were minimizing muscular effort to perform the task. First, we predicted that subjects would exert workless, radial forces on the constraint. Second, we predicted that the muscles would be deactivated when they could not contribute to work. Third, we predicted that when moving very slowly along the constraint, the pattern of muscle activity would not differ between clockwise (CW) and counterclockwise (CCW) motions. To test these predictions, we instructed human subjects to move a robot handle around a virtual, circular constraint at a constant tangential velocity. To reduce the effect of forces that might arise from incomplete compensation of neuro-musculo-skeletal dynamics, the target tangential speed was set to an extremely slow pace (~1 revolution every 13.3 seconds). Ultimately, the results of the human experiment did not support the predictions derived from our model of minimizing muscular effort. While subjects did exert workless forces, they did not deactivate muscles as predicted. Furthermore, muscle activation patterns differed between CW and CCW motions about the constraint. These findings demonstrate that minimizing muscle effort is not a significant factor in human performance of this constrained-motion task. 
Instead, the central nervous system likely prioritizes reducing other costs, such as computational effort, over muscle effort to control physical interactions. 
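The first prediction above follows directly from the geometry of circular motion: velocity is tangential, so a purely radial force satisfies F · v = 0 and does no work. A quick numeric check, with an arbitrary illustrative radius and the ~13.3 s revolution period quoted above:

```python
import numpy as np

# Circular motion: position r(t) on the constraint, velocity v(t) tangential.
R = 0.1                                  # illustrative radius (m)
omega = 2 * np.pi / 13.3                 # ~1 revolution every 13.3 s
t = np.linspace(0, 13.3, 1000)
pos = R * np.stack([np.cos(omega * t), np.sin(omega * t)], axis=1)
vel = R * omega * np.stack([-np.sin(omega * t), np.cos(omega * t)], axis=1)

# A purely radial force of arbitrary magnitude exerted on the constraint
F = 5.0 * pos / R                        # points outward along the radius
power = np.sum(F * vel, axis=1)          # instantaneous power F . v
print(np.max(np.abs(power)))             # ~0: radial forces are workless
```

Any radial force magnitude gives zero power, so the minimum-effort model only constrains the force direction, not its size.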