This content will become publicly available on July 8, 2026

Title: NURing: A Tendon-Driven Wearable Ring for On-Demand Kinesthetic Haptic Feedback
Generating salient and intuitively understood haptic feedback on the human finger through a non-intrusive wearable remains a challenge in haptic device development. Most existing solutions either restrict the hand and finger’s natural range of motion or impede sensory perception, quickly becoming intrusive during dexterous manipulation tasks. Here, we introduce NURing (Non-intrUsive Ring), a tendon-actuated haptic device that provides kinesthetic feedback by deflecting the finger. The NURing is easily donned and doffed, enabling on-demand kinesthetic feedback while leaving the hand and fingers free for dexterous tasks. We demonstrate that the device delivers perceptually salient feedback and evaluate its performance through a series of uniaxial motion guidance tasks. The lightweight NURing device, weighing approximately 220 g, can generate guidance cues at up to 1 Hz, enabling participants to identify target directions in under 3 s with a 1.5° steady-state error, corresponding to a fingertip deviation of less than 11 mm. Additionally, it can guide users along complex, smooth trajectories with an average trajectory error of 7°. These findings highlight the effectiveness of fingertip deflection as a kinesthetic feedback modality, enabling precise guidance for real-world applications such as sightless touchscreen navigation, assistive technology, and both industrial and consumer augmented/virtual reality systems.
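As a rough illustration of how an angular steady-state error relates to linear motion at the fingertip, the sketch below applies the standard arc-length relation (deviation ≈ lever arm × angle in radians). The lever-arm values and function name are hypothetical, chosen only for illustration; the reported bound of less than 11 mm depends on the specific geometry of the NURing setup, which the abstract does not give.

```python
import math

def fingertip_deviation_mm(angle_error_deg: float, lever_arm_mm: float) -> float:
    """Approximate linear fingertip deviation for a small angular error.

    Models the deflected finger as a rigid link of length lever_arm_mm
    rotating about its proximal joint, so the tip traces an arc of
    length lever_arm_mm * angle (angle in radians).
    """
    return lever_arm_mm * math.radians(angle_error_deg)

if __name__ == "__main__":
    # Lever arms below are illustrative assumptions, not values from the paper.
    for lever_arm in (90.0, 150.0):
        dev = fingertip_deviation_mm(1.5, lever_arm)
        print(f"1.5 deg error over a {lever_arm:.0f} mm lever arm -> ~{dev:.1f} mm at the tip")
```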
Award ID(s):
2106191
PAR ID:
10652000
Author(s) / Creator(s):
Publisher / Repository:
IEEE
Date Published:
Page Range / eLocation ID:
255 to 266
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. People use their hands for intricate tasks like playing musical instruments, employing myriad touch sensations to inform motor control. In contrast, current prosthetic hands lack comprehensive haptic feedback and exhibit rudimentary multitasking functionality. Limited research has explored the potential of upper limb amputees to feel, perceive, and respond to multiple channels of simultaneously activated haptic feedback to concurrently control the individual fingers of dexterous prosthetic hands. This study introduces a novel control architecture for three amputees and nine additional subjects to concurrently control individual fingers of an artificial hand using two channels of context-specific haptic feedback. Artificial neural networks (ANNs) recognize subjects’ electromyogram (EMG) patterns governing the artificial hand controller. ANNs also classify the directions objects slip across tactile sensors on the robotic fingertips, which are encoded via the vibration frequency of wearable vibrotactile actuators. Subjects implement control strategies with each finger simultaneously to prevent or permit slip as desired, achieving a 94.49% ± 8.79% overall success rate. Although no statistically significant difference exists between amputees’ and non-amputees’ success rates, amputees require more time to respond to simultaneous haptic feedback signals, suggesting a higher cognitive load. Nevertheless, amputees can accurately interpret multiple channels of nuanced haptic feedback to concurrently control individual robotic fingers, addressing the challenge of multitasking with dexterous prosthetic hands. 
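To make the frequency-coding idea in the study above concrete, here is a minimal, hypothetical sketch of how a classified slip direction could be mapped to a vibrotactile drive waveform. The direction labels, frequencies, sample rate, and function names are illustrative assumptions and are not taken from the study.

```python
import numpy as np

# Hypothetical mapping from classified slip direction to actuator frequency (Hz);
# the actual encoding used in the study is not specified in the abstract.
SLIP_DIRECTION_TO_FREQ_HZ = {
    "distal": 80.0,     # object slipping out of the grasp
    "proximal": 160.0,  # object slipping toward the palm
}

def vibrotactile_drive(direction: str, duration_s: float = 0.2,
                       sample_rate_hz: float = 2000.0) -> np.ndarray:
    """Return a sinusoidal drive waveform whose frequency encodes slip direction."""
    freq = SLIP_DIRECTION_TO_FREQ_HZ[direction]
    t = np.arange(0.0, duration_s, 1.0 / sample_rate_hz)
    return np.sin(2.0 * np.pi * freq * t)

if __name__ == "__main__":
    wave = vibrotactile_drive("distal")
    print(f"{wave.size} samples encoding a distal slip at "
          f"{SLIP_DIRECTION_TO_FREQ_HZ['distal']:.0f} Hz")
```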
  2. Handheld kinesthetic haptic interfaces can provide greater mobility and richer tactile information compared to traditional grounded devices. In this paper, we introduce a new handheld haptic interface that takes input using bidirectional coupled finger flexion. We present the design motivation and details of the device and experimentally evaluate its performance in terms of transparency and rendering bandwidth using a handheld prototype. In addition, we assess the device's functional performance through a user study comparing the proposed device to a commonly used grounded input device in a set of targeting and tracking tasks.
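As background for the transparency and rendering-bandwidth evaluation mentioned above, a kinesthetic device of this kind is typically driven by an impedance-style control law. The sketch below renders a generic virtual wall against measured finger flexion; the stiffness value, wall position, and function names are assumptions for illustration, not details from the paper.

```python
def render_virtual_wall(flexion_rad: float,
                        wall_rad: float,
                        stiffness_nm_per_rad: float = 0.5) -> float:
    """Command a restoring torque only when flexion penetrates past the wall."""
    penetration = flexion_rad - wall_rad
    return -stiffness_nm_per_rad * penetration if penetration > 0.0 else 0.0

if __name__ == "__main__":
    # Illustrative readings; a real device would poll its sensor in a
    # high-rate control loop (commonly on the order of 1 kHz).
    for flexion in (0.10, 0.30, 0.50):
        torque = render_virtual_wall(flexion, wall_rad=0.25)
        print(f"flexion {flexion:.2f} rad -> commanded torque {torque:+.3f} N·m")
```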
  3. This article describes the development and evaluation of our passively actuated closed-loop articulated wearable (CLAW) that uses a common slider to passively drive its exo-fingers for use in physical training of people with limited hand mobility. Our design approach utilizes physiological tasks for dimensional synthesis and yields a variety of design candidates that fulfill the desired fingertip precision grasping trajectory. Once it is ensured that the synthesized fingertip motion is close to the physiological fingertip grasping trajectories, performance assessment criteria related to user–device interference and natural joint angle movement are taken into account. After the most preferred design for each finger is chosen, minor modifications are made related to substituting the backbone chain with the wearer’s limb to provide the skeletal structure for the customized passive device. Subsequently, we evaluate it for natural joint motion based on a novel design candidate assessment method. A hand prototype is printed, and its preliminary performance regarding natural joint motion, wearability, and scalability is assessed. The pilot experimental test on a range of healthy subjects with different hand/finger sizes shows that the CLAW hand is easy to operate and guides the user’s fingers without causing any discomfort. It also ensures both precision and power grasping in a natural manner. This study establishes the importance of incorporating novel design candidate assessment techniques, based on human finger kinematic models, at the conceptual design level, which can assist in finding design candidates for natural joint motion coordination.
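One way to picture the dimensional-synthesis step described above is as ranking candidate mechanisms by how closely their fingertip paths track a physiological grasping trajectory. The sketch below is only a schematic of that idea; the RMS metric, toy data, and names are assumptions, not the authors' actual assessment method.

```python
import numpy as np

def trajectory_error_mm(candidate_xy: np.ndarray, physiological_xy: np.ndarray) -> float:
    """RMS distance between corresponding fingertip path points (N x 2 arrays, mm)."""
    diff = candidate_xy - physiological_xy
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

def rank_candidates(candidates: dict, physiological_xy: np.ndarray) -> list:
    """Order synthesized design candidates by fingertip-trajectory error."""
    scores = {name: trajectory_error_mm(xy, physiological_xy)
              for name, xy in candidates.items()}
    return sorted(scores.items(), key=lambda kv: kv[1])

if __name__ == "__main__":
    # Toy trajectories; real data would come from motion capture and the
    # forward kinematics of each synthesized linkage.
    target = np.array([[0, 0], [10, 5], [18, 12], [24, 20]], dtype=float)
    candidates = {
        "candidate_A": target + np.array([1.0, -0.5]),
        "candidate_B": target + np.array([4.0, 3.0]),
    }
    for name, err in rank_candidates(candidates, target):
        print(f"{name}: RMS fingertip error {err:.2f} mm")
```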
  4. Robot teleoperation is an emerging field of study with wide applications in exploration, manufacturing, and healthcare, because it allows users to perform complex remote tasks while remaining distanced and safe. Haptic feedback offers an immersive user experience and expands the range of tasks that can be accomplished through teleoperation. In this paper, we present a novel wearable haptic feedback device for a teleoperation system that applies kinesthetic force feedback to the fingers of a user. The proposed device, called a ‘haptic muscle’, is a soft pneumatic actuator constructed from a fabric-silicone composite in a toroidal structure. We explore the requirements of the ideal haptic feedback mechanism, construct several haptic muscles using different materials, and experimentally determine their dynamic pressure response as well as sensitivity (their ability to communicate small changes in haptic feedback). Finally, we integrate the haptic muscles into a data glove and a teleoperation system and perform several user tests. Our results show that most users could detect force changes as low as 3% of the working range of the haptic muscles. We also find that the haptic feedback leads users to apply up to 52% less force when handling soft and fragile objects through the teleoperation system.
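The 3% sensitivity figure above expresses the smallest detectable force step as a fraction of the actuator's working range; the short sketch below shows that bookkeeping with made-up numbers. The 10 N range and 0.3 N step are hypothetical values chosen only to illustrate the calculation.

```python
def detection_threshold_fraction(smallest_detected_change_n: float,
                                 working_range_n: float) -> float:
    """Smallest reliably detected force change as a fraction of the working range."""
    return smallest_detected_change_n / working_range_n

if __name__ == "__main__":
    # Hypothetical working range and detected step, for illustration only.
    frac = detection_threshold_fraction(smallest_detected_change_n=0.3,
                                        working_range_n=10.0)
    print(f"detectable change: {frac:.0%} of the working range")
```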
  5. Objective: Robust neural decoding of intended motor output is crucial to enable intuitive control of assistive devices, such as robotic hands, to perform daily tasks. Few existing neural decoders can predict kinetic and kinematic variables simultaneously. The current study developed a continuous neural decoding approach that can concurrently predict fingertip forces and joint angles of multiple fingers. Methods: We obtained motoneuron firing activities by decomposing high-density electromyogram (HD EMG) signals of the extrinsic finger muscles. The identified motoneurons were first grouped and then refined specifically for each finger (index or middle) and task (finger force or dynamic movement) combination. The refined motoneuron groups (separation matrix) were then applied directly, in real time, to new EMG data involving both finger force and dynamic movement tasks produced by both fingers. EMG-amplitude-based prediction was also performed as a comparison. Results: We found that the newly developed decoding approach outperformed the EMG-amplitude method for both finger force and joint angle estimations with a lower prediction error (Force: 3.47±0.43 vs 6.64±0.69% MVC, Joint Angle: 5.40±0.50° vs 12.8±0.65°) and a higher correlation (Force: 0.75±0.02 vs 0.66±0.05, Joint Angle: 0.94±0.01 vs 0.5±0.05) between the estimated and recorded motor output. The performance was also consistent for both fingers. Conclusion: The developed neural decoding algorithm allowed us to accurately and concurrently predict finger forces and joint angles of multiple fingers in real time. Significance: Our approach can enable intuitive interactions with assistive robotic hands, and allow the performance of dexterous hand skills involving both force control tasks and dynamic movement control tasks.
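To sketch the decoding idea above in code: decomposition of the HD EMG yields a separation matrix that can be reapplied to new EMG to recover motoneuron discharges, whose smoothed firing rates are then mapped to force or joint angle. The pipeline below is only a schematic under that assumption; the random matrices, spike threshold, boxcar smoothing, and linear readout are placeholders and do not reproduce the grouping and refinement steps used in the study.

```python
import numpy as np

def decode_motor_output(emg: np.ndarray,
                        separation_matrix: np.ndarray,
                        readout_weights: np.ndarray,
                        smooth_win: int = 200,
                        spike_threshold: float = 0.6) -> np.ndarray:
    """Schematic decoder: separate sources, detect discharges, smooth, read out.

    emg:               channels x samples high-density EMG
    separation_matrix: sources x channels projection from a prior decomposition
    readout_weights:   per-source weights mapping firing rates to force or angle
    """
    sources = separation_matrix @ emg                      # recover source activity
    spikes = (sources > spike_threshold).astype(float)     # crude discharge detection
    kernel = np.ones(smooth_win) / smooth_win              # boxcar smoothing window
    rates = np.vstack([np.convolve(s, kernel, mode="same") for s in spikes])
    return readout_weights @ rates                         # predicted force or angle trace

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    emg = rng.standard_normal((64, 2000))      # toy 64-channel recording
    sep = 0.1 * rng.standard_normal((5, 64))   # placeholder separation matrix
    weights = rng.random(5)                    # placeholder readout weights
    print(decode_motor_output(emg, sep, weights).shape)  # -> (2000,)
```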