Regular user interface screens can display dense and detailed information to human users but miss out on providing somatosensory stimuli that take full advantage of human spatial cognition. Therefore, the development of new haptic displays can strengthen human-machine communication by augmenting visual communication with tactile stimulation needed to transform information from digital to spatial/physical environments. Shape-changing interfaces, such as pin arrays and robotic surfaces, are one method for providing this spatial dimension of feedback; however, these displays are often either limited in maximum extension or require bulky mechanical components. In this paper, we present a compact pneumatically actuated soft growing pin for inflatable haptic interfaces. Each pin consists of a rigid, air-tight chamber, an inflatable fabric pin, and a passive spring-actuated reel mechanism. The device behavior was experimentally characterized, showing extension to 18.5 cm with relatively low pressure input (1.75 psi, 12.01 kPa), and the behavior was compared to the mathematical model of soft growing robots. The results showed that the extension of the soft pin can be accurately modeled and controlled using pressure as input. Finally, we demonstrate the feasibility of implementing individually actuated soft growing pins to create an inflatable haptic surface.
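To make the pressure-to-extension relationship concrete, below is a minimal sketch of how such a mapping could be used for control. The threshold and gain constants, and the function names, are illustrative assumptions rather than the model identified in the paper; only the 18.5 cm maximum extension and the 1.75 psi input pressure come from the abstract.

```python
# Hypothetical pressure-to-extension mapping for a soft growing pin.
# The yield pressure and gain below are assumed values for illustration only;
# they are not the parameters reported in the paper.

PSI_TO_KPA = 6.894757  # standard psi -> kPa conversion


def extension_cm(pressure_kpa: float,
                 yield_pressure_kpa: float = 2.0,    # assumed pressure to start eversion
                 gain_cm_per_kpa: float = 1.85,      # assumed steady-state gain
                 max_extension_cm: float = 18.5):    # maximum extension reported in the paper
    """Estimate steady-state pin extension for a given input pressure."""
    if pressure_kpa <= yield_pressure_kpa:
        return 0.0
    extension = gain_cm_per_kpa * (pressure_kpa - yield_pressure_kpa)
    return min(extension, max_extension_cm)


if __name__ == "__main__":
    p_kpa = 1.75 * PSI_TO_KPA  # the input pressure reported in the abstract
    print(f"{p_kpa:.2f} kPa -> {extension_cm(p_kpa):.1f} cm extension")
```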
Wrapped Haptic Display for Communicating Physical Robot Learning
Physical interaction between humans and robots can help robots learn to perform complex tasks. The robot arm gains information by observing how the human kinesthetically guides it throughout the task. While prior works focus on how the robot learns, it is equally important that this learning is transparent to the human teacher. Visual displays that show the robot’s uncertainty can potentially communicate this information; however, we hypothesize that visual feedback mechanisms miss out on the physical connection between the human and robot. In this work we present a soft haptic display that wraps around and conforms to the surface of a robot arm, adding a haptic signal at an existing point of contact without significantly affecting the interaction. We demonstrate how soft actuation creates a salient haptic signal while still allowing flexibility in device mounting. Using a psychophysics experiment, we show that users can accurately distinguish inflation levels of the wrapped display with an average Weber fraction of 11.4%. When we place the wrapped display around the arm of a robotic manipulator, users are able to interpret and leverage the haptic signal in sample robot learning tasks, improving identification of areas where the robot needs more training and enabling the user to provide better demonstrations. See videos of our device and user studies here: https://youtu.be/tX-2Tqeb9Nw
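As a quick aid to interpreting the psychophysics result, the sketch below shows what an average Weber fraction of 11.4% implies: the just-noticeable change in inflation scales with the reference level. The reference pressures used here are arbitrary examples, not values from the study.

```python
# Illustration of the reported 11.4% Weber fraction for the wrapped display.
# The reference inflation pressures below are made-up examples.

WEBER_FRACTION = 0.114  # average value reported in the abstract


def just_noticeable_difference(reference_level: float) -> float:
    """Smallest change from reference_level that users can reliably distinguish."""
    return WEBER_FRACTION * reference_level


for level_kpa in (5.0, 10.0, 20.0):  # hypothetical inflation pressures
    jnd = just_noticeable_difference(level_kpa)
    print(f"reference {level_kpa:.1f} kPa -> JND ~ {jnd:.2f} kPa")
```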
- PAR ID: 10340739
- Journal Name: 2022 IEEE 5th International Conference on Soft Robotics (RoboSoft)
- Page Range / eLocation ID: 823 to 830
- Sponsoring Org: National Science Foundation
More Like this
- Effective physical human-robot interaction (pHRI) depends on how humans can communicate their intentions for movement with others. While it is speculated that small interaction forces contain significant information to convey the specific movement intention of physical human-human interaction (pHHI), the underlying mechanism by which humans infer intention from such small forces is largely unknown. The hypothesis in this work is that the sensitivity to a small interaction force applied at the hand is affected by the movement of the arm, which in turn is affected by arm stiffness. To test this, a haptic robot was used to apply endpoint interaction forces to the arm of seated human participants, who were asked to determine one of four directions of the applied robot interaction force without visual feedback. Varying levels of interaction force and arm muscle contraction were applied. The results imply that humans' ability to identify and respond to the correct direction of small interaction forces was lower when the human arm movement was more closely aligned with the force direction. In addition, sensitivity to the direction of the small interaction force was high when arm stiffness was low. It is also speculated that humans lower their arm stiffness to become more sensitive to smaller interaction forces. These results will help develop human-like pHRI systems for various applications.
- Robot teleoperation is an emerging field of study with wide applications in exploration, manufacturing, and healthcare, because it allows users to perform complex remote tasks while remaining distanced and safe. Haptic feedback offers an immersive user experience and expands the range of tasks that can be accomplished through teleoperation. In this paper, we present a novel wearable haptic feedback device for a teleoperation system that applies kinesthetic force feedback to the fingers of a user. The proposed device, called a ‘haptic muscle’, is a soft pneumatic actuator constructed from a fabric-silicone composite in a toroidal structure. We explore the requirements of the ideal haptic feedback mechanism, construct several haptic muscles using different materials, and experimentally determine their dynamic pressure response as well as their sensitivity (their ability to communicate small changes in haptic feedback). Finally, we integrate the haptic muscles into a data glove and a teleoperation system and perform several user tests. Our results show that most users could detect force changes as low as 3% of the working range of the haptic muscles; a rough illustration of what this resolution implies is sketched after this list. We also find that the haptic feedback causes users to apply up to 52% less force on an object while handling soft and fragile objects with a teleoperation system.
- For robots to seamlessly interact with humans, we first need to make sure that humans and robots understand one another. Diverse algorithms have been developed to enable robots to learn from humans (i.e., transferring information from humans to robots). In parallel, visual, haptic, and auditory communication interfaces have been designed to convey the robot’s internal state to the human (i.e., transferring information from robots to humans). Prior research often separates these two directions of information transfer, and focuses primarily on either learning algorithms or communication interfaces. By contrast, in this survey we take an interdisciplinary approach to identify common themes and emerging trends that close the loop between learning and communication. Specifically, we survey state-of-the-art methods and outcomes for communicating a robot’s learning back to the human teacher during human-robot interaction. This discussion connects human-in-the-loop learning methods and explainable robot learning with multimodal feedback systems and measures of human-robot interaction. We find that—when learning and communication are developed together—the resulting closed-loop system can lead to improved human teaching, increased human trust, and human-robot co-adaptation. The paper includes a perspective on several of the interdisciplinary research themes and open questions that could advance how future robots communicate their learning to everyday operators. Finally, we implement a selection of the reviewed methods in a case study where participants kinesthetically teach a robot arm. This case study documents and tests an integrated approach for learning in ways that can be communicated, conveying this learning across multimodal interfaces, and measuring the resulting changes in human and robot behavior.
- We present V.Ra, a visual and spatial programming system for robot-IoT task authoring. In V.Ra, programmable mobile robots serve as binding agents to link the stationary IoTs and perform collaborative tasks. We establish an ecosystem that coherently connects the three key elements of robot task planning (human-robot-IoT) with one single AR-SLAM device. Users can perform task authoring in an analogous manner with the Augmented Reality (AR) interface. Then, placing the device onto the mobile robot directly transfers the task plan in a what-you-do-is-what-robot-does (WYDWRD) manner. The mobile device mediates the interactions between the user, robot, and IoT-oriented tasks, and guides the path planning execution with the SLAM capability.
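As referenced in the teleoperation entry above, here is a back-of-the-envelope sketch of what a detection threshold of 3% of the working range implies for the haptic muscles. The 30 kPa working range is a hypothetical figure chosen only for illustration; the 3% threshold is the value reported in that abstract.

```python
# Rough illustration of the haptic-muscle sensitivity reported above.
# Only the 3% detection threshold comes from the abstract; the working range
# is a hypothetical example.

DETECTION_FRACTION = 0.03  # users detected changes of ~3% of the working range


def distinguishable_levels(working_range_kpa: float) -> int:
    """Approximate number of force levels users could tell apart across the range."""
    step_kpa = DETECTION_FRACTION * working_range_kpa
    return int(working_range_kpa // step_kpa)


print(distinguishable_levels(30.0))  # hypothetical 30 kPa working range -> ~33 levels
```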