- Award ID(s): 1849417
- PAR ID: 10277421
- Date Published:
- Journal Name: IEEE Transactions on Biomedical Circuits and Systems
- Page Range / eLocation ID: 981 to 984
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- The compliant nature of soft fingers allows humans to manipulate objects safely and dexterously in unstructured environments. A soft prosthetic finger with tactile sensing for texture discrimination, coupled with sensory stimulation, has the potential to create a more natural experience for an amputee. In this work, a pneumatically actuated soft biomimetic finger is integrated with a textile neuromorphic tactile sensor array for a texture discrimination task. The tactile sensor outputs were converted into neuromorphic spike trains that emulate the firing patterns of biological mechanoreceptors. Spike-based features from each taxel compressed the information and served as inputs to a support vector machine classifier that differentiated the textures. The soft biomimetic finger with neuromorphic encoding achieved an average classification accuracy of 99.57% on 13 standardized textured surfaces across 16 independent parameter combinations (4 flexion angles of the soft finger by 4 palpation speeds). To aid the perception of more natural objects and their manipulation, subjects received transcutaneous electrical nerve stimulation conveying a subset of four textures with varied textural information. Three able-bodied subjects successfully distinguished two or three of the textures from the applied stimuli. This work paves the way for a more human-like prosthesis: a soft biomimetic finger with texture discrimination capabilities, neuromorphic encoding, and sensory feedback; texture feedback, in turn, has the potential to enhance the user's experience of interacting with their surroundings.
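The spike-encoding-plus-classification pipeline described above can be sketched in a few lines. The leaky integrate-and-fire encoder, the specific spike features, and the scikit-learn classifier below are illustrative assumptions, not the paper's exact neuron model or feature set:

```python
# Minimal sketch: analog taxel signal -> spike train -> spike features -> SVM.
# The LIF parameters and features are assumed for illustration only.
import numpy as np
from sklearn.svm import SVC

def lif_encode(signal, dt=1e-3, tau=0.02, gain=50.0, v_thresh=1.0):
    """Convert an analog taxel signal into spike times (seconds)."""
    v, spikes = 0.0, []
    for i, x in enumerate(signal):
        v += dt * (-v / tau + gain * x)   # leaky integration of input current
        if v >= v_thresh:                 # threshold crossing -> emit spike
            spikes.append(i * dt)
            v = 0.0                       # reset membrane potential
    return np.asarray(spikes)

def spike_features(spike_times, duration):
    """Compress a spike train into rate and inter-spike-interval statistics."""
    rate = len(spike_times) / duration
    isi = np.diff(spike_times)
    mean_isi = isi.mean() if len(isi) else duration
    std_isi = isi.std() if len(isi) > 1 else 0.0
    return [rate, mean_isi, std_isi]

# X: one feature vector per palpation trial (features concatenated across taxels)
# y: texture label per trial
# clf = SVC(kernel="rbf").fit(X_train, y_train)
# predictions = clf.predict(X_test)
```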
- The mechanoreceptors of the human tactile sensory system contribute to natural grasping and manipulation in everyday life. Robotic systems, however, are still limited by the tactile sensory feedback available when attempting to emulate human dexterity. In this work, a soft optical lightguide serves as an artificial afferent nerve fiber in a tactile sensory system. A skin-like soft silicone material is combined with a bristle friction model, and the design permits fast and easy fabrication. Owing to this novel design, the soft sensor provides not only normal force (up to 5 N) but also lateral force information generated by stick-slip processes. A static force test and a slip motion test demonstrate its ability to measure normal forces and detect stick-slip events. Finally, real-time control applications on a robotic gripper show how the sensor helps the gripper apply sufficient force to grasp objects without slipping.
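A hedged sketch of how such a sensor might drive slip-reactive grasping follows. The `sensor.read_forces` and `gripper.set_grip_force` calls, thresholds, and increments are hypothetical placeholders, not the paper's API:

```python
# Illustrative slip-reactive grip control loop. A sharp transient on the
# lateral (shear) channel is treated as a stick-slip event and the grip
# force is increased. All constants are assumptions for illustration.
SLIP_THRESHOLD = 0.15   # N per sample; lateral-force jump flagged as slip (assumed)
FORCE_STEP = 0.2        # N; grip-force increment on detected slip (assumed)
MAX_FORCE = 5.0         # N; the sensor's stated normal-force range

def control_step(prev_lateral, gripper, sensor):
    """One control iteration: read forces, tighten grip if slip is detected."""
    normal, lateral = sensor.read_forces()          # hypothetical sensor call
    if abs(lateral - prev_lateral) > SLIP_THRESHOLD:
        # Sudden lateral-force change suggests stick-slip: tighten the grip.
        target = min(gripper.grip_force + FORCE_STEP, MAX_FORCE)
        gripper.set_grip_force(target)              # hypothetical gripper call
    return lateral
```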
- We describe the use of a bidirectional neuromyoelectric prosthetic hand that conveys biomimetic sensory feedback. Electromyographic recordings from residual arm muscles were decoded to provide independent, proportional control of a six-DOF prosthetic hand and wrist (the DEKA LUKE arm). Activation of contact sensors on the prosthesis triggered intraneural microstimulation of residual sensory nerve fibers through chronically implanted Utah Slanted Electrode Arrays, thereby evoking tactile percepts on the phantom hand. With sensory feedback enabled, the participant exhibited greater precision in grip force and handled fragile objects more successfully. With active exploration, the participant was also able to distinguish small from large objects and soft from hard ones. When the sensory feedback was biomimetic, designed to mimic natural sensory signals, the participant identified the objects significantly faster than with traditional encoding algorithms that depended only on the present stimulus intensity. Artificial touch can thus be sculpted by patterning the sensory feedback, and biologically inspired patterns elicit more interpretable and useful percepts.
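The contrast between intensity-only and biomimetic encoding can be made concrete: natural afferents respond strongly to contact onset and offset, not just sustained force. The mapping and all constants in the sketch below are assumptions for illustration, not the study's actual stimulation algorithms:

```python
# Illustrative contrast: stimulation frequency driven by instantaneous force
# alone versus a biomimetic variant that adds onset/offset transients via
# the force derivative. Constants are assumed, not taken from the study.
import numpy as np

def intensity_encoding(force, f_min=10.0, f_max=200.0, force_max=5.0):
    """Stimulation frequency (Hz) tracks only the present contact force."""
    return f_min + (f_max - f_min) * np.clip(force / force_max, 0.0, 1.0)

def biomimetic_encoding(force, dt=1e-3, k_sustained=30.0, k_transient=400.0):
    """Emphasize contact onset/offset by weighting the force rate of change."""
    dforce = np.gradient(force, dt)                 # transient component
    rate = k_sustained * force + k_transient * np.abs(dforce)
    return np.clip(rate, 0.0, 300.0)                # cap at a safe frequency (assumed)
```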
- Sensory feedback is critical for fine motor control, learning, and adaptation. Robotic prosthetic limbs, however, currently lack the feedback segment of the communication loop between user and device. Sensory substitution feedback can close this gap, but the improvement sometimes persists only when users cannot see their prosthesis, suggesting the provided feedback is redundant with vision: given the choice, users rely on vision over artificial feedback. To effectively augment vision, sensory feedback must supply information that vision cannot provide or provides poorly. Although vision is known to estimate speed less precisely than position, no prior work has compared the precision of speed estimates for biomimetic arm movements. In this study, we investigated the uncertainty of visual speed estimates for different virtual arm movements. We found that uncertainty was greatest for visual estimates of joint speeds, compared with absolute rotational or linear endpoint speeds. Furthermore, this uncertainty increased when the speed of the joint's reference frame varied over time, potentially because of an overestimation of joint speed. Finally, we demonstrate a joint-based sensory substitution feedback paradigm that significantly reduces joint speed uncertainty when paired with vision. Ultimately, this work may lead to improved prosthesis control and capacity for motor learning.
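The joint-speed versus endpoint-speed distinction at the heart of the study can be illustrated with a planar two-link arm; the link lengths and the two-link simplification below are assumptions, not the study's virtual arm model:

```python
# Illustrative kinematics: the same pair of joint speeds maps to a linear
# endpoint speed through the arm's Jacobian, so the two quantities carry
# different information. Link lengths are assumed for illustration.
import numpy as np

L1, L2 = 0.30, 0.25  # upper-arm and forearm lengths in meters (assumed)

def endpoint_speed(theta1, theta2, omega1, omega2):
    """Map joint angles (rad) and joint speeds (rad/s) to endpoint speed (m/s)."""
    # Jacobian of the planar two-link forward kinematics.
    J = np.array([
        [-L1 * np.sin(theta1) - L2 * np.sin(theta1 + theta2), -L2 * np.sin(theta1 + theta2)],
        [ L1 * np.cos(theta1) + L2 * np.cos(theta1 + theta2),  L2 * np.cos(theta1 + theta2)],
    ])
    v = J @ np.array([omega1, omega2])   # linear endpoint velocity
    return np.linalg.norm(v)
```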
- In this work, we investigate the influence of audio and visual feedback on a manipulation task in virtual reality (VR). Without the tactile feedback of a controller, grasping virtual objects with one's bare hands can slow interactions because it may be unclear to the user that a grasp has occurred. Alternative feedback, such as visual or audio cues, may enable faster and more precise interactions, but might also affect user preference and the perceived ownership of the virtual hands. In this study, we test four feedback conditions for virtual grasping: three provide feedback when a grasp or release occurs (visual, audio, or both), and one provides no such feedback. We analyze each condition's effect on interaction performance, measure its effect on the perceived ownership of the virtual hands, and gauge user preference. In an experiment in which users performed a pick-and-place task under each condition, we found that audio feedback for grasping was preferred over visual feedback even though it appeared to decrease grasping performance, and that there was little to no difference in ownership between conditions.