Title: Wrapped Haptic Display for Communicating Physical Robot Learning
Physical interaction between humans and robots can help robots learn to perform complex tasks. The robot arm gains information by observing how the human kinesthetically guides it throughout the task. While prior works focus on how the robot learns, it is equally important that this learning is transparent to the human teacher. Visual displays that show the robot’s uncertainty can potentially communicate this information; however, we hypothesize that visual feedback mechanisms miss out on the physical connection between the human and robot. In this work we present a soft haptic display that wraps around and conforms to the surface of a robot arm, adding a haptic signal at an existing point of contact without significantly affecting the interaction. We demonstrate how soft actuation creates a salient haptic signal while still allowing flexibility in device mounting. Using a psychophysics experiment, we show that users can accurately distinguish inflation levels of the wrapped display with an average Weber fraction of 11.4%. When we place the wrapped display around the arm of a robotic manipulator, users are able to interpret and leverage the haptic signal in sample robot learning tasks, improving identification of areas where the robot needs more training and enabling the user to provide better demonstrations. See videos of our device and user studies here:
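To make the reported psychophysics result concrete, here is a minimal sketch of how a Weber fraction is computed from just-noticeable-difference (JND) data. The inflation levels and JND values below are illustrative placeholders, not data or analysis code from the paper.

```python
import numpy as np

# Illustrative sketch only: these reference inflation levels and JND
# values are made up, not the authors' measurements.

# Reference inflation levels (arbitrary pressure units) and the
# just-noticeable difference (JND) measured at each level.
reference_levels = np.array([10.0, 20.0, 30.0, 40.0])
jnd = np.array([1.1, 2.3, 3.4, 4.6])

# Weber's law: the JND grows roughly in proportion to stimulus
# intensity, so the Weber fraction is the ratio JND / intensity.
weber_fractions = jnd / reference_levels
print(f"Mean Weber fraction: {weber_fractions.mean():.1%}")  # ~11%
```

A lower Weber fraction means finer differences can be resolved; the paper's 11.4% average indicates users could reliably tell apart inflation levels separated by roughly a tenth of the reference level.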
Award ID(s):
2129155 2129201
Author(s) / Creator(s):
Date Published:
Journal Name:
2022 IEEE 5th International Conference on Soft Robotics (RoboSoft)
Page Range / eLocation ID:
823 to 830
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Effective physical human-robot interaction (pHRI) depends on how humans communicate their movement intentions to others. While it is speculated that small interaction forces carry significant information about the specific movement intention in physical human-human interaction (pHHI), the mechanism by which humans infer intention from such small forces is largely unknown. The hypothesis in this work is that sensitivity to a small interaction force applied at the hand is affected by the movement of the arm, which in turn is affected by arm stiffness. To test this, a haptic robot applied endpoint interaction forces to the arm of seated human participants, who were asked to identify which of four directions the force was applied in, without visual feedback. Both the interaction force level and the arm muscle contraction level were varied. The results imply that participants' ability to identify and respond to the correct direction of small interaction forces was lower when their arm movement was more closely aligned with the force direction. In addition, sensitivity to the direction of the small interaction force was high when arm stiffness was low. It is also speculated that humans lower their arm stiffness to become more sensitive to smaller interaction forces. These results will help develop human-like pHRI systems for various applications.
  2. Robot teleoperation is an emerging field of study with wide applications in exploration, manufacturing, and healthcare, because it allows users to perform complex remote tasks while remaining distanced and safe. Haptic feedback offers an immersive user experience and expands the range of tasks that can be accomplished through teleoperation. In this paper, we present a novel wearable haptic feedback device for a teleoperation system that applies kinesthetic force feedback to the fingers of a user. The proposed device, called a ‘haptic muscle’, is a soft pneumatic actuator constructed from a fabric-silicone composite in a toroidal structure. We explore the requirements of the ideal haptic feedback mechanism, construct several haptic muscles using different materials, and experimentally determine their dynamic pressure response as well as their sensitivity (their ability to communicate small changes in haptic feedback). Finally, we integrate the haptic muscles into a data glove and a teleoperation system and perform several user tests. Our results show that most users could detect force changes as low as 3% of the working range of the haptic muscles. We also find that the haptic feedback causes users to apply up to 52% less force on an object while handling soft and fragile objects with a teleoperation system.
  3. An important problem in designing human-robot systems is the integration of human intent and performance in the robotic control loop, especially in complex tasks. Bimanual coordination is a complex human behavior that is critical in many fine motor tasks, including robot-assisted surgery. To fully leverage the capabilities of the robot as an intelligent and assistive agent, online recognition of bimanual coordination could be important. Robotic assistance for a suturing task, for example, will be fundamentally different during phases when the suture is wrapped around the instrument (i.e., making a C-loop) than when the ends of the suture are pulled apart. In this study, we develop an online recognition method of bimanual coordination modes (i.e., the directions and symmetries of right and left hand movements) using geometric descriptors of hand motion. We (1) develop this framework based on ideal trajectories obtained during virtual 2D bimanual path following tasks performed by human subjects operating Geomagic Touch haptic devices, (2) test the offline recognition accuracy of bimanual direction and symmetry from human subject movement trials, and (3) evaluate how the framework can be used to characterize 3D trajectories of the da Vinci Surgical System’s surgeon-side manipulators during bimanual surgical training tasks. In the human subject trials, our geometric bimanual movement classification accuracy was 92.3% for movement direction (i.e., hands moving together, parallel, or away) and 86.0% for symmetry (e.g., mirror or point symmetry). We also show that this approach can be used for online classification of different bimanual coordination modes during needle transfer, C-loop making, and suture pulling gestures on the da Vinci system, with results matching the expected modes. Finally, we discuss how these online estimates are sensitive to task environment factors and surgeon expertise, and thus inspire future work that could leverage adaptive control strategies to enhance user skill during robot-assisted surgery.
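As a rough illustration of how geometric tests on hand velocities can separate such coordination modes, here is a minimal sketch that classifies direction (together, parallel, away) and symmetry (mirror, point) from two velocity vectors. The function name, tolerance, and the reflection/opposition tests are assumptions for illustration, not the descriptors used in the paper.

```python
import numpy as np

def classify_bimanual_mode(v_left, v_right, axis, tol=1e-2):
    """Classify a coarse bimanual coordination mode from hand velocities.

    Illustrative sketch, not the paper's method. v_left and v_right are
    2D hand velocity vectors; axis is the unit vector pointing from the
    left hand toward the right hand.
    """
    v_left, v_right, axis = (np.asarray(v, dtype=float)
                             for v in (v_left, v_right, axis))

    # Signed speed of each hand along the inter-hand axis.
    p_left = float(np.dot(v_left, axis))
    p_right = float(np.dot(v_right, axis))

    # Hands approaching, separating, or translating in parallel.
    if p_left > tol and p_right < -tol:
        direction = "together"
    elif p_left < -tol and p_right > tol:
        direction = "away"
    else:
        direction = "parallel"

    # Point symmetry: the velocities are exact opposites.
    # Mirror symmetry: axis components are opposite while perpendicular
    # components are equal (a reflection across the midline).
    perp_left = v_left - p_left * axis
    perp_right = v_right - p_right * axis
    if np.allclose(v_left, -v_right, atol=tol):
        symmetry = "point"
    elif abs(p_left + p_right) < tol and np.allclose(perp_left, perp_right, atol=tol):
        symmetry = "mirror"
    else:
        symmetry = "asymmetric"

    return direction, symmetry

# Hands approaching while both drift upward: together + mirror symmetry.
print(classify_bimanual_mode([0.1, 0.2], [-0.1, 0.2], [1.0, 0.0]))
```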
  4. While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real-time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision our findings to push forward research towards more realistic physicality in future VR/AR. 
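For a sense of what a lightweight EMG-to-force decoder of this kind might look like, here is a minimal PyTorch sketch mapping windows of multi-channel forearm EMG to per-finger force estimates. The channel count, window length, and layer sizes are illustrative guesses, not the architecture from the paper.

```python
import torch
import torch.nn as nn

# Illustrative assumptions, not values from the paper:
N_EMG_CHANNELS = 8   # assumed forearm electrode count
WINDOW = 50          # assumed samples per sliding window
N_FINGERS = 5

class ForceDecoder(nn.Module):
    """Toy regressor from an EMG window to finger-wise forces."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                        # (batch, channels * window)
            nn.Linear(N_EMG_CHANNELS * WINDOW, 128),
            nn.ReLU(),
            nn.Linear(128, 64),
            nn.ReLU(),
            nn.Linear(64, N_FINGERS),            # one force estimate per finger
        )

    def forward(self, emg_window):
        # emg_window: (batch, channels, window) of rectified, normalized EMG.
        return self.net(emg_window)

batch = torch.randn(32, N_EMG_CHANNELS, WINDOW)
forces = ForceDecoder()(batch)                   # (32, N_FINGERS)
```

Trained with a regression loss (e.g., MSE) against measured fingertip forces, such a model's accuracy can be reported as mean error relative to the force range, the framing behind the 3.3% figure above.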
  5. Over the past few decades, there have been many studies of human-human physical interaction to better understand why humans physically interact so effectively and how dyads outperform individuals in certain motor tasks. Because of the different methodologies and experimental setups in these studies, however, it is difficult to draw general conclusions as to the reasons for this improved performance. In this study, we propose an open-source experimental framework for the systematic study of the effect of human-human interaction, as mediated by robots, at the ankle joint. We also propose a new framework to study various interactive behaviors (i.e., collaborative, cooperative, and competitive tasks) that can be emulated using a virtual spring connecting human pairs. To validate the proposed experimental framework, we perform a transparency analysis, which is closely related to haptic rendering performance. We compare muscle EMG and ankle motion data while subjects are barefoot, attached to the unpowered robot, and attached to the powered robot implementing transparency control. We also validate the performance in rendering virtual springs covering a range of stiffness values (5-50 Nm/rad) while the subjects track several desired trajectories (sine waves at frequencies between 0.1 and 1.1 Hz). Finally, we study the performance of the system in human-human interaction under nine different interactive conditions, demonstrating the feasibility of the system for studying human-human interaction under different interactive behaviors.
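To make the virtual-spring coupling concrete, here is a minimal sketch of the torque such a coupling would render between two ankle robots. The function name, variable names, and sign convention are illustrative assumptions, not code from the framework.

```python
# Illustrative sketch of a virtual spring coupling two ankle robots.

def coupling_torque(theta_a, theta_b, k_spring):
    """Torque (Nm) rendered on participant A's ankle robot.

    theta_a, theta_b: ankle angles of the two participants (rad).
    k_spring: virtual spring stiffness (Nm/rad), e.g. 5-50 as in the study.
    Participant B's robot renders the equal-and-opposite torque.
    """
    return k_spring * (theta_b - theta_a)

# Example: with k = 20 Nm/rad and a 0.1 rad angle difference, each robot
# renders a 2 Nm torque pulling the two ankles toward each other.
print(coupling_torque(0.30, 0.40, 20.0))  # -> 2.0
```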