Abstract Effective interactions between humans and robots are vital to achieving shared tasks in collaborative processes. Robots can utilize diverse communication channels to interact with humans, such as hearing, speech, sight, touch, and learning. Our focus, amidst the various means of interaction between humans and robots, is on three emerging frontiers that significantly impact the future directions of human–robot interaction (HRI): (i) human–robot collaboration inspired by human–human collaboration, (ii) brain–computer interfaces, and (iii) emotionally intelligent perception. First, we explore advanced techniques for human–robot collaboration, covering a range of methods from compliance- and performance-based approaches to synergistic and learning-based strategies, including learning from demonstration, active learning, and learning from complex tasks. Then, we examine innovative uses of brain–computer interfaces for enhancing HRI, with a focus on applications in rehabilitation, communication, brain-state recognition, and emotion recognition. Finally, we investigate emotional intelligence in robotics, focusing on translating human emotions to robots via facial expressions, body gestures, and eye tracking for fluid, natural interactions. Recent developments in these emerging frontiers and their impact on HRI are detailed and discussed, and we highlight contemporary trends and emerging advancements in the field. Ultimately, this paper underscores the necessity of a multimodal approach in developing systems capable of adaptive behavior and effective interaction between humans and robots, thus offering a thorough understanding of the diverse modalities essential for maximizing the potential of HRI.
Human mobile robot interaction in the retail environment
Abstract As technology advances, Human-Robot Interaction (HRI) is boosting overall system efficiency and productivity. However, allowing robots to operate in close proximity to humans inevitably places higher demands on precise human motion tracking and prediction. Datasets that contain both humans and robots operating in a shared space are receiving growing attention, as they may facilitate a variety of robotics and human-systems research. Datasets that track HRI during daily activities with rich information beyond video images remain rare. In this paper, we introduce a novel dataset that focuses on social navigation between humans and robots in a future-oriented Wholesale and Retail Trade (WRT) environment (https://uf-retail-cobot-dataset.github.io/). Eight participants performed tasks commonly undertaken by consumers and retail workers. More than 260 minutes of data were collected, including robot and human trajectories, human full-body motion capture, eye gaze directions, and other contextual information. Comprehensive descriptions of each category of data stream, as well as potential use cases, are included. Furthermore, analyses drawing on multiple data sources and future directions are discussed.
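One basic use of paired robot and human trajectories like those described above is proximity analysis, e.g. finding the closest approach between a shopper and the robot. The abstract does not specify the dataset's file format, so the sketch below simply assumes the trajectories have already been loaded as time-aligned NumPy arrays of 2-D positions; the function name and toy data are illustrative, not part of the dataset's API.

```python
import numpy as np

def min_separation(human_xy: np.ndarray, robot_xy: np.ndarray) -> float:
    """Minimum Euclidean distance between two time-synchronized
    2-D trajectories, each of shape (T, 2) in meters."""
    if human_xy.shape != robot_xy.shape:
        raise ValueError("trajectories must be time-aligned and equal length")
    dists = np.linalg.norm(human_xy - robot_xy, axis=1)
    return float(dists.min())

# Toy example: human walks along the x-axis; robot holds position at (2, 1).
human = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
robot = np.tile([2.0, 1.0], (4, 1))
print(min_separation(human, robot))  # closest approach at t=2 -> 1.0
```

The same per-timestep distance vector could feed social-navigation metrics such as time spent inside a personal-space radius.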
- Award ID(s): 2132936
- PAR ID: 10378885
- Publisher / Repository: Nature Publishing Group
- Date Published:
- Journal Name: Scientific Data
- Volume: 9
- Issue: 1
- ISSN: 2052-4463
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Recent work in Human-Robot Interaction (HRI) has shown that robots can leverage implicit communicative signals from users to understand how they are being perceived during interactions. For example, these signals can be gaze patterns, facial expressions, or body motions that reflect internal human states. To facilitate future research in this direction, we contribute the REACT database, a collection of two datasets of human-robot interactions that display users' natural reactions to robots during a collaborative game and a photography scenario. Further, we analyze the datasets to show that interaction history is an important factor that can influence human reactions to robots. As a result, we believe that future models for interpreting implicit feedback in HRI should explicitly account for this history. REACT opens the door to this possibility.
A wide range of studies in Human-Robot Interaction (HRI) has shown that robots can influence the social behavior of humans. This phenomenon is commonly explained by the Media Equation. Fundamental to this theory is the idea that when faced with technology (like robots), people perceive it as a social agent with thoughts and intentions similar to those of humans. This perception guides the interaction with the technology and its predicted impact. However, HRI studies have also reported examples in which the Media Equation is violated, that is, when people treat the influence of robots differently from the influence of humans. To address this gap, we propose a model of Robot Social Influence (RoSI) with two contributing factors. The first factor is a robot's violation of a person's expectations, whether the robot exceeds or fails to meet expectations. The second factor is a person's social belonging with the robot, whether the person belongs to the same group as the robot or a different group. These factors are primary predictors of robots' social influence and commonly mediate the influence of other factors. We review the HRI literature and show how RoSI can explain robots' social influence in concrete HRI scenarios.
Mistakes, failures, and transgressions committed by a robot are inevitable as robots become more involved in our society. When a wrong behavior occurs, it is important to understand what factors might affect how the robot is perceived by people. In this paper, we investigated how the type of transgressor (human or robot) and the type of backstory depicting the transgressor's mental capabilities (default, physio-emotional, socio-emotional, or cognitive) shaped participants' perceptions of the transgressor's morality. We performed an online, between-subjects study in which participants (N=720) were first introduced to the transgressor and its backstory, and then viewed a video of a real-life robot or human pushing down a human. Although participants attributed similarly high intent to both the robot and the human, the human was generally perceived to have higher morality than the robot. However, the backstory told about the transgressors' capabilities affected their perceived morality. We found that robots with emotional backstories (i.e., physio-emotional or socio-emotional) had higher perceived moral knowledge, emotional knowledge, and desire than other robots. We also found that humans with cognitive backstories were perceived as having less emotional and moral knowledge than other humans. Our findings have consequences for robot ethics and robot design for HRI.
Abstract This paper introduces an innovative and streamlined design of a robot, resembling a bicycle, created to effectively inspect a wide range of ferromagnetic structures, even those with intricate shapes. The key highlight of this robot lies in its mechanical simplicity coupled with remarkable agility. The locomotion strategy hinges on the arrangement of two magnetic wheels in a configuration akin to a bicycle, augmented by two independent steering actuators. This configuration grants the robot the exceptional ability to move in multiple directions. Moreover, the robot employs a reciprocating mechanism that allows it to alter its shape, thereby surmounting obstacles effortlessly. An inherent trait of the robot is its innate adaptability to uneven and intricate surfaces on steel structures, facilitated by a dynamic joint. To underscore its practicality, the robot's application is demonstrated through the utilization of an ultrasonic sensor for gauging steel thickness, coupled with a pragmatic deployment mechanism. By integrating a defect detection model based on deep learning, the robot showcases its proficiency in automatically identifying and pinpointing areas of rust on steel surfaces. The paper undertakes a thorough analysis, encompassing robot kinematics, adhesive force, potential sliding and turn‐over scenarios, and motor power requirements. These analyses collectively validate the stability and robustness of the proposed design. Notably, the theoretical calculations established in this study serve as a valuable blueprint for developing future robots tailored for climbing steel structures. To enhance its inspection capabilities, the robot is equipped with a camera that employs deep learning algorithms to detect rust visually. The paper substantiates its claims with empirical evidence, sharing results from extensive experiments and real‐world deployments on diverse steel bridges, situated in both Nevada and Georgia. 
These tests comprehensively affirm the robot's proficiency in adhering to surfaces, navigating challenging terrains, and executing thorough inspections. A comprehensive visual representation of the robot's trials and field deployments is presented in videos accessible at the following links: https://youtu.be/Qdh1oz_oxiQ and https://youtu.be/vFFq79O49dM.
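The stability analyses mentioned above (adhesive force, sliding, and turn-over) reduce, in the simplest static case, to two inequalities for a magnetic-wheeled robot on a vertical steel surface: friction from the magnetic pull must balance the robot's weight, and the magnetic moment about the lower wheel's contact line must balance the gravity moment from the offset center of mass. The sketch below is a minimal planar model with hypothetical parameter values, not the paper's full kinematic or dynamic analysis.

```python
G = 9.81  # gravitational acceleration, m/s^2

def vertical_climb_ok(mass_kg: float, mu: float, f_mag_n: float,
                      wheelbase_m: float, com_offset_m: float) -> dict:
    """Simplified static checks for a two-wheeled magnetic climbing robot
    on a vertical steel surface.

    no-slide:    mu * F_mag >= m * g      (friction balances weight)
    no-turnover: F_mag * L  >= m * g * h  (magnetic moment about the lower
                 wheel contact balances the gravity moment from a center
                 of mass offset h away from the wall)
    """
    weight = mass_kg * G
    return {
        "no_slide": mu * f_mag_n >= weight,
        "no_turnover": f_mag_n * wheelbase_m >= weight * com_offset_m,
    }

# Hypothetical numbers: 5 kg robot, friction coefficient 0.5, 150 N total
# magnetic pull, 0.25 m wheelbase, center of mass 0.06 m off the surface.
print(vertical_climb_ok(5.0, 0.5, 150.0, 0.25, 0.06))
# -> {'no_slide': True, 'no_turnover': True}
```

Inverting either inequality gives the minimum magnetic force a design needs, which is the kind of blueprint calculation the abstract refers to.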