Title: Embodied Communication: How Robots and People Communicate Through Physical Interaction
Early research on physical human–robot interaction (pHRI) has necessarily focused on device design—the creation of compliant and sensorized hardware, such as exoskeletons, prostheses, and robot arms, that enables people to safely come into contact with robotic systems and to communicate about their collaborative intent. As hardware capabilities have become sufficient for many applications, and as computing has become more powerful, algorithms that support fluent and expressive use of pHRI systems have begun to play a prominent role in determining the systems’ usefulness. In this review, we describe a selection of representative algorithmic approaches that regulate and interpret pHRI, tracing the progression from algorithms based on physical analogies, such as admittance control, to computational methods based on higher-level reasoning, which take advantage of multimodal communication channels. Existing algorithmic approaches largely enable task-specific pHRI, but they do not generalize to versatile human–robot collaboration. Throughout the review and in our discussion of next steps, we therefore argue that emergent embodied dialogue—bidirectional, multimodal communication that can be learned through continuous interaction—is one of the next frontiers of pHRI.
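The admittance control mentioned above maps measured interaction forces to compliant robot motion via a virtual mass–damper–spring law, M·a + B·v + K·x = F_ext. The following is a minimal sketch of a discrete-time admittance loop; the parameter values, time step, and function names are illustrative assumptions, not any specific system described in the review.

```python
import numpy as np

# Illustrative virtual dynamics (assumptions, not values from the review):
M, B, K = 2.0, 10.0, 50.0   # virtual mass (kg), damping (N·s/m), stiffness (N/m)
dt = 0.001                   # control period (s)

def admittance_step(x, v, f_ext):
    """One discrete step of the admittance law M*a + B*v + K*x = f_ext.

    Returns the updated (position, velocity) commands that a position-
    controlled robot would track, rendering compliant behavior."""
    a = (f_ext - B * v - K * x) / M   # virtual acceleration from measured force
    v_new = v + a * dt                # semi-implicit Euler: velocity first,
    x_new = x + v_new * dt            # then position, for better stability
    return x_new, v_new

# Usage: a person applies a steady 5 N push for 5 simulated seconds;
# the commanded position settles near the virtual equilibrium f/K = 0.1 m.
x, v = 0.0, 0.0
for _ in range(5000):
    x, v = admittance_step(x, v, f_ext=5.0)
```

The virtual gains trade off how "heavy" versus how "responsive" the robot feels to the person; the physical analogy is exactly what makes this class of controllers intuitive, and also what limits it to low-level, task-specific communication.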
Award ID(s): 1837515
NSF-PAR ID: 10471783
Publisher / Repository: Annual Reviews
Journal Name: Annual Review of Control, Robotics, and Autonomous Systems
Volume: 6
Issue: 1
ISSN: 2573-5144
Page Range / eLocation ID: 205 to 232
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. When a robot performs a task next to a human, physical interaction is inevitable: the human might push, pull, twist, or guide the robot. The state of the art treats these interactions as disturbances that the robot should reject or avoid. At best, these robots respond safely while the human interacts; but after the human lets go, these robots simply return to their original behavior. We recognize that physical human–robot interaction (pHRI) is often intentional: the human intervenes on purpose because the robot is not doing the task correctly. In this article, we argue that when pHRI is intentional it is also informative: the robot can leverage interactions to learn how it should complete the rest of its current task even after the person lets go. We formalize pHRI as a dynamical system, where the human has in mind an objective function they want the robot to optimize, but the robot does not get direct access to the parameters of this objective: they are internal to the human. Within our proposed framework, human interactions become observations about the true objective. We introduce approximations to learn from and respond to pHRI in real time. We recognize that not all human corrections are perfect: users often interact with the robot noisily, so we improve the efficiency of robot learning from pHRI by reducing unintended learning. Finally, we conduct simulations and user studies on a robotic manipulator to compare our proposed approach with the state of the art. Our results indicate that learning from pHRI leads to better task performance and improved human satisfaction.
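One common way to formalize "interactions as observations about the true objective" is to represent the objective as a weighted sum of trajectory features and to shift the weights toward the features that the human's correction exhibits. The sketch below illustrates that idea under stated assumptions: the toy feature function `phi`, the step size `alpha`, and the 1-D trajectories are all hypothetical, not the paper's exact formulation.

```python
import numpy as np

def phi(traj):
    """Toy feature vector for a 1-D trajectory: [mean height, path length]."""
    traj = np.asarray(traj, dtype=float)
    return np.array([traj.mean(), np.abs(np.diff(traj)).sum()])

def update_theta(theta, planned, corrected, alpha=0.5):
    """Shift objective weights toward the human's correction:
    theta <- theta + alpha * (phi(corrected) - phi(planned)).

    A physical correction that lowers the arm thus lowers the weight
    on height, informing how the robot finishes the task on its own."""
    return theta + alpha * (phi(corrected) - phi(planned))

theta = np.zeros(2)
planned   = [0.5, 0.5, 0.5, 0.5]   # robot's current plan: height held at 0.5
corrected = [0.5, 0.3, 0.2, 0.2]   # human pushed the arm lower mid-task
theta = update_theta(theta, planned, corrected)
# theta[0] becomes negative, encoding a learned preference for lower height.
```

Reducing "unintended learning" from noisy corrections can then be viewed as restricting which components of `theta` a given interaction is allowed to update.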
  2. Nonverbal interactions are a key component of human communication. As robots increasingly operate in close proximity to people, it is important that they follow the social rules governing the use of space. Prior research has conceptualized personal space as physical zones based on static distances. This work examined how preferred interaction distance changes across different interaction scenarios. We conducted a user study using three different robot heights, and also examined the difference in preferred interaction distance when a robot approaches a human and, conversely, when a human approaches a robot. Factors included in the quantitative analysis were the participant's gender, the robot's height, and the method of approach. Subjective measures included human comfort and perceived safety. The results show that robot height, participant gender, and method of approach were significant factors influencing measured proxemic zones and, accordingly, participant comfort. Subjective data showed that participants regarded robots in a more favorable light following their participation in this study. Furthermore, the NAO was perceived most positively by respondents according to various metrics, and the PR2 Tall most negatively.
  3. Abstract: Understanding the human motor control strategy during physical interaction tasks is crucial for developing future robots for physical human–robot interaction (pHRI). In physical human–human interaction (pHHI), small interaction forces are known to convey intent between partners for effective motor communication. The aim of this work is to investigate what affects a human's sensitivity to externally applied interaction forces. The hypothesis is that one way small interaction forces are sensed is through the movement of the arm and the resulting proprioceptive signals. A pHRI setup was used to apply small interaction forces to the hand of seated participants in one of four directions, and the blindfolded participants were asked to identify the direction of the push. The results show that participants' ability to correctly report the direction of the interaction force was lower with low interaction force as well as with high muscle contraction. Sensitivity to the interaction force direction increased with the radial displacement of the participant's hand from the initial position: the further they moved, the more correct their responses were. It was also observed that the estimated stiffness of the arm varies with the level of muscle contraction and the robot interaction force.
  4. Abstract

    Physical human–robot interactions (pHRI) often provide mechanical force and power to aid walking without requiring voluntary effort from the human. Alternatively, principles of physical human–human interactions (pHHI) can inspire pHRI that aids walking by engaging human sensorimotor processes. We hypothesize that low-force pHHI can intuitively induce a person to alter their walking through haptic communication. In our experiment, an expert partner dancer influenced novice participants to alter step frequency solely through hand interactions. Without prior instruction, training, or knowledge of the expert's goal, novices decreased step frequency 29% and increased step frequency 18% based on low forces (< 20 N) at the hand. Power transfer at the hands was 3–700× smaller than what is necessary to propel locomotion, suggesting that hand interactions did not mechanically constrain the novice's gait. Instead, the sign/direction of hand forces and power may communicate information about how to alter walking. Finally, the expert modulated her arm effective dynamics to match that of each novice, suggesting a bidirectional haptic communication strategy for pHRI that adapts to the human. Our results provide a framework for developing pHRI at the hand that may be applicable to assistive technology and physical rehabilitation, human–robot manufacturing, physical education, and recreation.
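The power-transfer comparison above rests on a simple quantity: instantaneous mechanical power is the dot product of the hand force and hand velocity vectors. The sketch below illustrates the order-of-magnitude argument; every number in it is an illustrative assumption, not data from the study.

```python
import numpy as np

def interaction_power(force, velocity):
    """Instantaneous mechanical power (W) transferred through the hand,
    computed as the dot product of force (N) and velocity (m/s)."""
    return float(np.dot(force, velocity))

# Illustrative values: a low hand force (< 20 N) and a slow hand motion.
f_hand = np.array([8.0, 2.0, 1.0])    # N
v_hand = np.array([0.05, 0.01, 0.0])  # m/s
p_hand = interaction_power(f_hand, v_hand)   # ≈ 0.42 W

# Rough order of magnitude for walking propulsion power (assumption).
p_locomotion = 150.0  # W
ratio = p_locomotion / p_hand
# p_hand is orders of magnitude below p_locomotion, consistent with the
# claim that hand forces inform gait rather than mechanically drive it.
```

Because power is signed, its direction (energy flowing into or out of the novice's hand) can itself carry information, which is the communication channel the abstract points to.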
  5. Abstract

    Effective interactions between humans and robots are vital to achieving shared tasks in collaborative processes. Robots can utilize diverse communication channels to interact with humans, such as hearing, speech, sight, touch, and learning. Our focus, amidst the various means of interaction between humans and robots, is on three emerging frontiers that significantly impact the future directions of human–robot interaction (HRI): (i) human–robot collaboration inspired by human–human collaboration, (ii) brain–computer interfaces, and (iii) emotionally intelligent perception. First, we explore advanced techniques for human–robot collaboration, covering a range of methods from compliance- and performance-based approaches to synergistic and learning-based strategies, including learning from demonstration, active learning, and learning from complex tasks. Then, we examine innovative uses of brain–computer interfaces for enhancing HRI, with a focus on applications in rehabilitation, communication, and brain state and emotion recognition. Finally, we investigate emotional intelligence in robotics, focusing on translating human emotions to robots via facial expressions, body gestures, and eye tracking for fluid, natural interactions. We detail and discuss recent developments in these emerging frontiers and their impact on HRI, highlighting contemporary trends and emerging advancements in the field. Ultimately, this paper underscores the necessity of a multimodal approach in developing systems capable of adaptive behavior and effective interaction between humans and robots, thus offering a thorough understanding of the diverse modalities essential for maximizing the potential of HRI.