

Award ID contains: 1901236

Note: Clicking a Digital Object Identifier (DOI) link takes you to an external site maintained by the publisher. Some full-text articles may not be available free of charge during the publisher's embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Upper-limb amputees commonly cite difficulty of control as one of the main reasons they abandon their prostheses. Combining myoelectric control with autonomous sensor-based control could improve prosthesis control. However, the cognitive and physical impact of shared control and semi-autonomous systems on users has yet to be fully explored. In this study, we introduce a novel shared-control algorithm that blends proportional position control predicted from electromyography (EMG) with proportional position control predicted by an autonomous machine using infrared sensors embedded in the prosthetic hand's fingers to detect the distance to objects. The user's EMG control is damped in proportion to the machine's prediction of an object's position relative to a given finger. The shared-control algorithm was validated with three intact individuals completing a holding task in which they attempted to hold an object for as long as possible without dropping it. Shared control resulted in fewer object drops, 32% less cognitive demand, and 49% less physical effort (measured by EMG) relative to the participant's EMG control alone. These results indicate that shared control can reduce physiological burdens on the user while improving prosthesis control.
    Free, publicly-accessible full text available August 1, 2023
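The abstract above describes damping the user's EMG command in proportion to the machine's sensor-based prediction, but does not give the blending law. Below is a minimal sketch assuming a linear per-finger blend in which the autonomous controller's weight grows as the infrared sensor reports the object closer to the finger; the function name, the linear weighting, and the `d_max_mm` sensing range are all illustrative assumptions, not the published algorithm.

```python
def blend_finger_command(p_emg: float, p_auto: float,
                         distance_mm: float, d_max_mm: float = 40.0) -> float:
    """Blend a user's EMG position command with an autonomous command.

    p_emg       -- proportional position predicted from EMG (0..1)
    p_auto      -- proportional position predicted by the autonomous
                   machine from the finger's infrared distance sensor
    distance_mm -- sensed distance from fingertip to object
    d_max_mm    -- illustrative maximum sensing range; beyond it the
                   user retains full control

    The autonomous weight w rises linearly from 0 (object out of
    range) to 1 (object touching the finger), damping the EMG
    command accordingly.
    """
    w = min(1.0, max(0.0, 1.0 - distance_mm / d_max_mm))
    return (1.0 - w) * p_emg + w * p_auto
```

With this sketch, a finger far from the object (`distance_mm >= d_max_mm`) follows the EMG command unchanged, while a finger in contact (`distance_mm == 0`) follows the autonomous command entirely.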
  2. Intuitive control of prostheses relies on training algorithms to correlate biological recordings with motor intent. The quality of the training dataset is critical to run-time performance, but it is difficult to label hand kinematics accurately after the hand has been amputated. We quantified the accuracy and precision of labeling hand kinematics for two different approaches: 1) assuming a participant is perfectly mimicking predetermined motions of a prosthesis (mimicked training), and 2) assuming a participant is perfectly mirroring their contralateral hand during identical bilateral movements (mirrored training). We compared these approaches in non-amputee individuals, using an infrared camera to track eight different joint angles of the hands in real time. Aggregate data revealed that mimicked training does not account for biomechanical coupling or temporal changes in hand posture. Mirrored training was significantly more accurate and precise at labeling hand kinematics. However, when training a modified Kalman filter to estimate motor intent, the mimicked and mirrored training approaches were not significantly different. The results suggest that the mirrored training approach creates a more faithful but more complex dataset. Advanced algorithms, more capable of learning the complex mirrored training dataset, may yield better run-time prosthetic control.
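The labeling mismatch described above can be illustrated with a toy simulation: mimicked training labels the data with the commanded prosthesis trajectory, while the hand actually lags behind it, whereas mirrored training labels the data with the tracked contralateral hand. The lag model, its time constant, and the step trajectory below are illustrative assumptions, not the study's data or methods.

```python
import numpy as np


def first_order_lag(command: np.ndarray, alpha: float = 0.2) -> np.ndarray:
    """Toy model of a hand lagging behind a commanded joint trajectory."""
    out = np.zeros_like(command)
    for t in range(1, len(command)):
        out[t] = out[t - 1] + alpha * (command[t] - out[t - 1])
    return out


def rmse(a: np.ndarray, b: np.ndarray) -> float:
    """Root-mean-square labeling error across time samples."""
    return float(np.sqrt(np.mean((a - b) ** 2)))


# One illustrative flexion-extension cycle of a single joint (0 = open, 1 = flexed).
command = np.concatenate([np.zeros(25), np.ones(50), np.zeros(25)])
actual = first_order_lag(command)  # what the hand really does

mimicked_labels = command  # assume the hand perfectly follows the prosthesis
mirrored_labels = actual   # assume the contralateral hand mirrors perfectly

mimicked_error = rmse(actual, mimicked_labels)  # nonzero: lag is unmodeled
mirrored_error = rmse(actual, mirrored_labels)  # zero in this idealized sketch
```

In this idealized sketch the mirrored labels are error-free because the toy "contralateral hand" mirrors exactly; in practice mirrored training still carries tracking noise and imperfect bilateral symmetry, which is part of why the abstract calls the resulting dataset more faithful but more complex.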