
Title: Wrapped Haptic Display for Communicating Physical Robot Learning
Physical interaction between humans and robots can help robots learn to perform complex tasks. The robot arm gains information by observing how the human kinesthetically guides it throughout the task. While prior works focus on how the robot learns, it is equally important that this learning is transparent to the human teacher. Visual displays that show the robot’s uncertainty can potentially communicate this information; however, we hypothesize that visual feedback mechanisms miss out on the physical connection between the human and robot. In this work we present a soft haptic display that wraps around and conforms to the surface of a robot arm, adding a haptic signal at an existing point of contact without significantly affecting the interaction. We demonstrate how soft actuation creates a salient haptic signal while still allowing flexibility in device mounting. Using a psychophysics experiment, we show that users can accurately distinguish inflation levels of the wrapped display with an average Weber fraction of 11.4%. When we place the wrapped display around the arm of a robotic manipulator, users are able to interpret and leverage the haptic signal in sample robot learning tasks, improving identification of areas where the robot needs more training and enabling the user to provide better demonstrations. See videos of our device and user studies here: https://youtu.be/tX-2Tqeb9Nw
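The Weber fraction mentioned above is the just-noticeable difference in a stimulus divided by its reference intensity, k = ΔI / I. As a minimal sketch of what the reported 11.4% average implies (in Python; the reference pressure below is hypothetical, since the abstract reports only the fraction itself):

    # Weber fraction: k = delta_I / I, where delta_I is the smallest change
    # in stimulus intensity a user can reliably detect at intensity I.
    weber_fraction = 0.114          # average value reported in the paper
    reference_pressure_kpa = 10.0   # hypothetical inflation level, illustration only
    jnd_kpa = weber_fraction * reference_pressure_kpa
    print(f"Around {reference_pressure_kpa} kPa, inflation levels must differ by "
          f"about {jnd_kpa:.2f} kPa to be reliably distinguished.")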
Authors:
Award ID(s):
2129155 2129201
Publication Date:
NSF-PAR ID:
10340739
Journal Name:
2022 IEEE 5th International Conference on Soft Robotics (RoboSoft)
Page Range or eLocation-ID:
823 to 830
Sponsoring Org:
National Science Foundation
More Like this
  1. Effective physical human-robot interaction (pHRI) depends on how humans communicate their movement intentions to others. While it is speculated that the small interaction forces of physical human-human interaction (pHHI) carry significant information about specific movement intentions, the mechanism by which humans infer intention from such small forces is largely unknown. The hypothesis in this work is that sensitivity to a small interaction force applied at the hand is affected by the movement of the arm, which is in turn affected by arm stiffness. To test this, a haptic robot applied endpoint interaction forces to the arm of seated human participants, who were asked to determine one of four directions of the applied force without visual feedback. Different levels of interaction force and arm muscle contraction were applied. The results imply that participants’ ability to identify and respond to the correct direction of small interaction forces was lower when their arm movement was more closely aligned with the force direction. In addition, sensitivity to the direction of the small interaction force was high when arm stiffness was low. It is also speculated that humans lower their arm stiffness to become more sensitive to smaller interaction forces. These results will help develop human-like pHRI systems for various applications.
  2. An important problem in designing human-robot systems is the integration of human intent and performance into the robotic control loop, especially in complex tasks. Bimanual coordination is a complex human behavior that is critical in many fine motor tasks, including robot-assisted surgery. To fully leverage the capabilities of the robot as an intelligent and assistive agent, online recognition of bimanual coordination could be important. Robotic assistance for a suturing task, for example, will be fundamentally different during phases when the suture is wrapped around the instrument (i.e., making a C-loop) than when the ends of the suture are pulled apart. In this study, we develop an online recognition method for bimanual coordination modes (i.e., the directions and symmetries of right- and left-hand movements) using geometric descriptors of hand motion. We (1) develop this framework based on ideal trajectories obtained during virtual 2D bimanual path-following tasks performed by human subjects operating Geomagic Touch haptic devices, (2) test the offline recognition accuracy of bimanual direction and symmetry from human subject movement trials, and (3) evaluate how the framework can be used to characterize 3D trajectories of the da Vinci Surgical System’s surgeon-side manipulators during bimanual surgical training tasks. In the human subject trials, our geometric bimanual movement classification accuracy was 92.3% for movement direction (i.e., hands moving together, in parallel, or apart) and 86.0% for symmetry (e.g., mirror or point symmetry). We also show that this approach can be used for online classification of different bimanual coordination modes during needle transfer, C-loop, and suture-pulling gestures on the da Vinci system, with results matching the expected modes. Finally, we discuss how these online estimates are sensitive to task environment factors and surgeon expertise, and thus inspire future work that could leverage adaptive control strategies to enhance user skill during robot-assisted surgery.
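     As a rough illustration of the geometric-descriptor idea in the study above, the Python sketch below classifies bimanual movement direction from hand positions and velocities. The descriptor (signed rate of change of the inter-hand distance) and the threshold are assumptions chosen for illustration, not the authors’ implementation:

        import numpy as np

        def classify_direction(p_left, p_right, v_left, v_right, tol=1e-3):
            """Classify bimanual direction as 'together', 'away', or 'parallel'."""
            # Unit vector along the line joining the two hands.
            axis = p_right - p_left
            axis = axis / (np.linalg.norm(axis) + 1e-9)
            # Signed rate of change of the inter-hand distance.
            separation_rate = (v_right - v_left) @ axis
            if separation_rate > tol:
                return "away"       # hands separating
            if separation_rate < -tol:
                return "together"   # hands approaching
            return "parallel"       # inter-hand distance roughly constant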
  3. Over the past few decades, there have been many studies of human-human physical interaction aimed at understanding why humans physically interact so effectively and how dyads outperform individuals in certain motor tasks. Because of the different methodologies and experimental setups in these studies, however, it is difficult to draw general conclusions about the reasons for this improved performance. In this study, we propose an open-source experimental framework for the systematic study of the effect of human-human interaction, as mediated by robots, at the ankle joint. We also propose a new framework for studying various interactive behaviors (i.e., collaborative, cooperative, and competitive tasks) that can be emulated using a virtual spring connecting human pairs. To validate the proposed experimental framework, we perform a transparency analysis, which is closely related to haptic rendering performance. We compare muscle EMG and ankle motion data while subjects are barefoot, attached to the unpowered robot, and attached to the powered robot implementing transparency control. We also validate the performance in rendering virtual springs covering a range of stiffness values (5-50 Nm/rad) while subjects track several desired trajectories (sine waves at frequencies between 0.1 and 1.1 Hz). We then study the performance of the system in human-human interaction under nine different interactive conditions. Finally, we demonstrate the feasibility of the system for studying human-human interaction under different interactive behaviors.
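     The virtual-spring coupling described above can be summarized in a few lines. This Python sketch assumes a simple torsional spring between the two users’ ankle angles, leaving out the transparency control and other details of the actual framework:

        def coupling_torques(theta_a, theta_b, k_spring=25.0):
            """Equal-and-opposite torques (Nm) from a virtual torsional spring.

            k_spring (Nm/rad) is chosen from within the 5-50 Nm/rad range
            rendered in the study; the variable names here are illustrative.
            """
            tau_on_a = k_spring * (theta_b - theta_a)  # pulls A's ankle toward B's
            tau_on_b = -tau_on_a                       # equal and opposite reaction
            return tau_on_a, tau_on_b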
  4. Principles from human-human physical interaction may be necessary to design more intuitive and seamless robotic devices to aid human movement. Previous studies have shown that light touch can aid balance and that haptic communication can improve performance of physical tasks, but the effects of touch between two humans on walking balance have not been previously characterized. This study examines physical interaction between two people when one aids the other in a beam-walking task. Twelve pairs of healthy young adults held a force sensor with one hand while one person walked on a narrow balance beam (2 cm wide × 3.7 m long) and the other walked overground by their side. We compare balance performance during partnered vs. solo beam-walking to examine the effects of haptic interaction, and we compare hand interaction mechanics during partnered beam-walking vs. overground walking to examine how the interaction aided balance. While holding the hand of a partner, participants were able to walk farther on the beam without falling, reduce lateral sway, and decrease angular momentum in the frontal plane. We measured small hand force magnitudes (means of 2.2 N laterally and 3.4 N vertically) that created opposing torque components about the beam axis, and we calculated the interaction torque, the overlapping opposing torque that does not contribute to motion of the beam-walker’s body. We found higher interaction torque magnitudes during partnered beam-walking vs. partnered overground walking, and a correlation between interaction torque magnitude and reductions in lateral sway. To gain insight into feasible controller designs that emulate human-human physical interaction for aiding walking balance, we modeled the relationship between each torque component and the motion of the beam-walker’s body as a mass-spring-damper system. Our model results show opposite types of mechanical elements (active vs. passive) for the two torque components. Our results demonstrate that hand interactions aid balance during partnered beam-walking by creating opposing torques that primarily serve haptic communication, and our model of the torques suggests control parameters for implementing human-human balance aid in human-robot interactions.
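     A minimal sketch of the mass-spring-damper identification described above, fitting tau = m*a + b*v + k*x by least squares in Python (the array names and the use of ordinary least squares are assumptions for illustration, not the authors’ code):

        import numpy as np

        def fit_mass_spring_damper(x, v, a, tau):
            """Fit tau = m*a + b*v + k*x and return (m, b, k).

            x, v, a: sway position, velocity, acceleration of the beam-walker.
            tau: one measured hand-torque component about the beam axis.
            """
            A = np.column_stack([a, v, x])
            (m, b, k), *_ = np.linalg.lstsq(A, tau, rcond=None)
            return m, b, k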
  5. A prerequisite for social coordination is bidirectional communication between teammates, each playing two roles simultaneously: as receptive listeners and as expressive speakers. For robots working with humans in complex situations with multiple goals that differ in importance, failure to fulfill the expectations of either role could undermine group performance due to misalignment of values between humans and robots. Specifically, a robot needs to serve as an effective listener, inferring human users’ intents from instructions and feedback, and as an expressive speaker, explaining its decision processes to users. Here, we investigate how to foster effective bidirectional human-robot communication in the context of value alignment, in which collaborative robots and users form an aligned understanding of the importance of possible task goals. We propose an explainable artificial intelligence (XAI) system in which a group of robots predicts users’ values by taking in situ feedback into consideration while communicating their decision processes to users through explanations. To learn from human feedback, our XAI system integrates a cooperative communication model for inferring human values associated with multiple desirable goals. To be interpretable to humans, the system simulates human mental dynamics and predicts optimal explanations using graphical models. We conducted psychological experiments to examine the core components of the proposed computational framework. Our results show that real-time human-robot mutual understanding in complex cooperative tasks is achievable with a learning model based on bidirectional communication. We believe this interaction framework can shed light on bidirectional value alignment in communicative XAI systems and, more broadly, in future human-machine teaming systems.
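     The value-inference step above can be pictured as a Bayesian update over hypotheses about which goals the user weights most. This Python sketch is a deliberately simplified stand-in for the paper’s cooperative communication model:

        import numpy as np

        def update_value_belief(prior, feedback_likelihoods):
            """One Bayesian update of a belief over candidate goal weightings.

            prior: probabilities over hypotheses about the user's values.
            feedback_likelihoods: probability of the observed human feedback
            under each hypothesis (both inputs are hypothetical here).
            """
            posterior = prior * feedback_likelihoods
            return posterior / posterior.sum()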