Title: Neuromorphic vision and tactile fusion for upper limb prosthesis control
A major issue with upper limb prostheses is the disconnect between the sensory information perceived by the user and the information available to the prosthesis. Advances in prosthetic technology have introduced tactile information for monitoring grasping activity, but visual information, a vital component of the human sensory system, is still not fully utilized as a form of feedback to the prosthesis. For able-bodied individuals, many of the decisions for grasping or manipulating an object, such as hand orientation and aperture, are made from visual information before contact with the object. We show that including neuromorphic visual information, combined with tactile feedback, improves the ability and efficiency of both able-bodied and amputee subjects to pick up and manipulate everyday objects. We found that combining visual and tactile information in a real-time closed-loop feedback strategy generally decreased the completion time of a task involving picking up and manipulating objects, compared to using a single feedback modality. While the full benefit of the combined feedback was partially obscured by experimental inaccuracies of the visual classification system, we demonstrate that this fusion of neuromorphic signals from visual and tactile sensors can provide valuable feedback to a prosthetic arm, enhancing real-time function and usability.
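As a rough illustration of the closed-loop strategy described above, the sketch below lets a visual grasp-type label set the hand's pre-shape before contact and lets the tactile spike rate servo the grip force after contact. The Hand class, the aperture mapping, and all gains and set-points are hypothetical placeholders, not the system reported in the paper.

    import numpy as np

    PRESHAPE_APERTURE = {"cylinder": 0.8, "pinch": 0.3, "tripod": 0.5}  # assumed mapping

    class Hand:
        # Stand-in actuator interface, not a real prosthesis API.
        def __init__(self):
            self.aperture, self.force = 1.0, 0.0
        def set_aperture(self, a):
            self.aperture = float(np.clip(a, 0.0, 1.0))
        def adjust_force(self, df):
            self.force = float(np.clip(self.force + df, 0.0, 1.0))

    def control_tick(hand, visual_label, tactile_rate, target_rate=20.0, gain=0.01):
        if tactile_rate < 1.0:
            # Pre-contact (few tactile spikes): trust vision, pre-shape the hand.
            hand.set_aperture(PRESHAPE_APERTURE.get(visual_label, 0.5))
        else:
            # In contact: servo grip force toward a target spike rate.
            hand.adjust_force(gain * (target_rate - tactile_rate))

    hand = Hand()
    control_tick(hand, "pinch", tactile_rate=0.0)  # approach: pre-shape for a pinch
    control_tick(hand, "pinch", tactile_rate=8.0)  # contact: tighten toward set-point
    print(hand.aperture, round(hand.force, 3))     # -> 0.3 0.12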
Award ID(s):
1849417
PAR ID:
10277421
Author(s) / Creator(s):
Date Published:
Journal Name:
IEEE Transactions on Biomedical Circuits and Systems
Page Range / eLocation ID:
981 to 984
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. The compliant nature of soft fingers allows humans to manipulate objects safely and dexterously in unstructured environments. A soft prosthetic finger with tactile sensing for texture discrimination, coupled with sensory stimulation, has the potential to create a more natural experience for an amputee. In this work, a pneumatically actuated soft biomimetic finger is integrated with a textile neuromorphic tactile sensor array for a texture discrimination task. The tactile sensor outputs were converted into neuromorphic spike trains that emulate the firing patterns of biological mechanoreceptors. Spike-based features from each taxel compressed the information and served as inputs to a support vector machine classifier that differentiated the textures. Our soft biomimetic finger with neuromorphic encoding achieved an average overall classification accuracy of 99.57% across 16 parameter combinations (4 flexion angles of the soft finger crossed with 4 palpation speeds) when tested on 13 standardized textured surfaces. To aid in the perception and manipulation of more natural objects, subjects were provided with transcutaneous electrical nerve stimulation conveying a subset of four textures with varied textural information. Three able-bodied subjects successfully distinguished two or three textures from the applied stimuli. This work paves the way for a more human-like prosthesis through a soft biomimetic finger with texture discrimination capabilities that uses neuromorphic techniques to provide sensory feedback; such texture feedback has the potential to enhance the user's experience when interacting with their surroundings.
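    A minimal sketch of the pipeline this abstract describes, under simplifying assumptions: taxel signals are spike-encoded with a send-on-delta rule (a stand-in for the paper's neuromorphic model), compressed to per-taxel firing rates, and classified with an SVM. The data here are synthetic toy traces, not the study's recordings.

        import numpy as np
        from sklearn.svm import SVC

        def to_spikes(signal, threshold=0.05):
            # Send-on-delta encoding: spike whenever the signal drifts
            # `threshold` away from its value at the previous spike.
            spikes, ref = np.zeros_like(signal, dtype=np.int8), signal[0]
            for t, v in enumerate(signal):
                if abs(v - ref) >= threshold:
                    spikes[t], ref = 1, v
            return spikes

        def spike_features(taxels):
            # Compress each taxel's spike train to a mean firing rate.
            return np.array([to_spikes(ch).mean() for ch in taxels])

        # Toy data: 13 trials per texture, 8 taxels, 500 samples each;
        # "rough" surfaces vibrate the taxels harder than "smooth" ones.
        rng = np.random.default_rng(0)
        smooth = rng.normal(scale=0.03, size=(13, 8, 500))
        rough = rng.normal(scale=0.10, size=(13, 8, 500))
        X = np.array([spike_features(trial) for trial in np.vstack([smooth, rough])])
        y = np.array([0] * 13 + [1] * 13)
        print(SVC(kernel="rbf").fit(X, y).score(X, y))  # training accuracy, toy data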
  2. The mechanoreceptors of the human tactile sensory system contribute to natural grasping manipulations in everyday life. In robot systems, however, attempts to emulate human dexterity are still limited by the quality of tactile sensory feedback. In this work, a soft optical lightguide is applied as an artificial afferent nerve fiber in a tactile sensory system. A skin-like soft silicone material is combined with a bristle friction model, and the resulting design is fast and easy to fabricate. Owing to this design, the soft sensor provides not only normal force information (up to 5 N) but also lateral force information generated by stick-slip processes. Its ability to measure normal forces and to detect stick-slip events is demonstrated through a static force test and a slip motion test. Finally, real-time control applications are investigated on a robotic gripper, where the sensor helps the gripper apply sufficient force to grasp objects without slipping.
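    A rough sketch of the slip-detection idea above: stick-slip shows up as high-frequency oscillation in the sensor's lateral-force channel, so a trailing-window variance test can flag it and trigger a grip adjustment. The window length, threshold, and toy force trace are illustrative assumptions, not the paper's parameters.

        import numpy as np

        def slip_mask(lateral_force, window=20, threshold=0.02):
            # Rolling standard deviation over a trailing window; high variance
            # marks the oscillation bursts characteristic of stick-slip.
            padded = np.pad(lateral_force, (window - 1, 0), mode="edge")
            windows = np.lib.stride_tricks.sliding_window_view(padded, window)
            return windows.std(axis=1) > threshold

        # Toy trace: a quiet hold, then stick-slip oscillation from t = 0.6 s.
        t = np.linspace(0.0, 1.0, 500)
        force = 0.5 + np.where(t > 0.6, 0.05 * np.sin(2 * np.pi * 80 * t), 0.0)
        mask = slip_mask(force)
        print("slip onset near sample", int(np.argmax(mask)))  # first flagged sample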
  3. We describe use of a bidirectional neuromyoelectric prosthetic hand that conveys biomimetic sensory feedback. Electromyographic recordings from residual arm muscles were decoded to provide independent and proportional control of a six-DOF prosthetic hand and wrist—the DEKA LUKE arm. Activation of contact sensors on the prosthesis resulted in intraneural microstimulation of residual sensory nerve fibers through chronically implanted Utah Slanted Electrode Arrays, thereby evoking tactile percepts on the phantom hand. With sensory feedback enabled, the participant exhibited greater precision in grip force and was better able to handle fragile objects. With active exploration, the participant was also able to distinguish between small and large objects and between soft and hard ones. When the sensory feedback was biomimetic—designed to mimic natural sensory signals—the participant was able to identify the objects significantly faster than with the use of traditional encoding algorithms that depended on only the present stimulus intensity. Thus, artificial touch can be sculpted by patterning the sensory feedback, and biologically inspired patterns elicit more interpretable and useful percepts. 
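    To make the biomimetic-versus-intensity-only contrast above concrete, the sketch below compares a traditional encoding, where stimulation rate tracks only the present force, with a biomimetic one that adds a rectified derivative term to over-weight contact transients, as natural afferents do. All gains and the 300 Hz ceiling are assumed values, not the study's stimulation parameters.

        import numpy as np

        def traditional_rate(force, gain=60.0):
            # Stimulation rate (Hz) proportional to present force only.
            return np.clip(gain * force, 0.0, 300.0)

        def biomimetic_rate(force, fs=1000.0, k_static=30.0, k_dynamic=3.0):
            # Force plus a rectified derivative term: bursts at contact onset
            # and release, mimicking natural mechanoreceptor firing.
            d = np.gradient(force) * fs
            return np.clip(k_static * force + k_dynamic * np.abs(d), 0.0, 300.0)

        # Toy grasp: ramp up, hold, release (sampled at fs = 1 kHz).
        force = np.concatenate([np.linspace(0, 1, 100), np.ones(300), np.linspace(1, 0, 100)])
        print(traditional_rate(force).max(), biomimetic_rate(force).max())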
  4. In this work, we investigate the influence that audio and visual feedback have on a manipulation task in virtual reality (VR). Without the tactile feedback of a controller, grasping virtual objects with one's hands can result in slower interactions because it may be unclear to the user that a grasp has occurred. Providing alternative feedback, such as visual or audio cues, may lead to faster and more precise interactions, but might also affect user preference and perceived ownership of the virtual hands. In this study, we test four feedback conditions for virtual grasping: three provide feedback when a grasp or release occurs (visual, audio, or both), and one provides no feedback for these events. We analyze the effect of each feedback condition on interaction performance, measure its effect on perceived ownership of the virtual hands, and gauge user preference. In an experiment, users performed a pick-and-place task under each feedback condition. We found that audio feedback for grasping is preferred over visual feedback even though it appears to decrease grasping performance, and that there was little to no difference in ownership between conditions.
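    A minimal sketch of how the four feedback conditions above could be dispatched on grasp and release events. The cue calls are print placeholders standing in for whatever visual and audio primitives the VR engine actually provides.

        from enum import Flag

        class Feedback(Flag):
            NONE = 0
            VISUAL = 1
            AUDIO = 2
            BOTH = VISUAL | AUDIO  # = 3

        def on_grasp_change(condition, grasped):
            event = "grasp" if grasped else "release"
            if Feedback.VISUAL in condition:
                print(f"[visual] flash hand outline on {event}")  # stand-in cue
            if Feedback.AUDIO in condition:
                print(f"[audio] play {event} click")              # stand-in cue

        for condition in (Feedback.NONE, Feedback.VISUAL, Feedback.AUDIO, Feedback.BOTH):
            on_grasp_change(condition, grasped=True)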
  5. This work provides an architecture that incorporates depth and tactile information to create rich and accurate 3D models useful for robotic manipulation tasks. This is accomplished through the use of a 3D convolutional neural network (CNN). Offline, the network is provided with both depth and tactile information and trained to predict the object’s geometry, thus filling in regions of occlusion. At runtime, the network is provided a partial view of an object. Tactile information is acquired to augment the captured depth information. The network can then reason about the object’s geometry by utilizing both the collected tactile and depth information. We demonstrate that even small amounts of additional tactile information can be incredibly helpful in reasoning about object geometry. This is particularly true when information from depth alone fails to produce an accurate geometric prediction. Our method is benchmarked against and outperforms other visual-tactile approaches to general geometric reasoning. We also provide experimental results comparing grasping success with our method. 
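    A minimal sketch of the architecture idea in this abstract: a small 3D CNN that takes a two-channel voxel grid (one channel for occupancy visible to the depth sensor, one for sparse tactile contacts) and predicts a per-voxel occupancy logit, so touch can fill in occluded regions. Layer sizes and the 32^3 grid are illustrative choices, not the paper's network.

        import torch
        import torch.nn as nn

        class VisuoTactileCompletion(nn.Module):
            # Input channels: 0 = depth-visible occupancy, 1 = tactile contacts.
            def __init__(self):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv3d(2, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.Conv3d(32, 1, kernel_size=1),  # per-voxel occupancy logit
                )

            def forward(self, grid):
                return self.net(grid)  # apply sigmoid for occupancy probability

        # One 32^3 example: front half seen by the camera, one touch behind it.
        grid = torch.zeros(1, 2, 32, 32, 32)
        grid[0, 0, :, :, :16] = 1.0   # depth-visible half of the object
        grid[0, 1, 16, 16, 28] = 1.0  # tactile probe behind the occlusion
        prob = torch.sigmoid(VisuoTactileCompletion()(grid))
        print(prob.shape)  # torch.Size([1, 1, 32, 32, 32])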