

Title: A MultiModal Social Robot Toward Personalized Emotion Interaction
Human emotions are expressed through multiple modalities, including verbal and non-verbal information. Moreover, a user's affective state can serve as an indicator of engagement and interaction success, making it a suitable reward signal for the robot to optimize its behaviors through interaction. This study demonstrates a multimodal human-robot interaction (HRI) framework that uses reinforcement learning to improve the robot's interaction policy and personalize emotional interaction for a human user. The goal is to apply this framework in social scenarios so that robots can generate more natural and engaging interactions.
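As a hedged illustration of the framework described above (not the authors' implementation), the sketch below shows a tabular Q-learning loop in which an estimate of the user's affective state supplies the reward. The action set, the affect-fusion rule, and all names (estimate_affect, choose_action, update) are illustrative assumptions.

```python
# Minimal sketch: Q-learning over discrete robot behaviors with an
# affect-derived reward. All names and values are illustrative assumptions.
import random
from collections import defaultdict

ROBOT_ACTIONS = ["greet", "joke", "encourage", "stay_quiet"]  # hypothetical action set
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount factor, exploration rate

q_table = defaultdict(float)  # maps (state, action) -> estimated value

def estimate_affect(verbal_score, nonverbal_score):
    """Placeholder fusion of verbal and non-verbal cues into a valence in [-1, 1]."""
    return 0.5 * verbal_score + 0.5 * nonverbal_score  # assumed equal weighting

def choose_action(state):
    """Epsilon-greedy selection over the robot's behaviors."""
    if random.random() < EPSILON:
        return random.choice(ROBOT_ACTIONS)
    return max(ROBOT_ACTIONS, key=lambda a: q_table[(state, a)])

def update(state, action, reward, next_state):
    """Standard one-step Q-learning update; the reward is the estimated affect."""
    best_next = max(q_table[(next_state, a)] for a in ROBOT_ACTIONS)
    q_table[(state, action)] += ALPHA * (reward + GAMMA * best_next - q_table[(state, action)])
```

In such a loop, the reward could be the estimated valence observed after each robot behavior, so behaviors that improve the user's affective state are reinforced over repeated interactions.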
Award ID(s):
1846658
NSF-PAR ID:
10316814
Author(s) / Creator(s):
Date Published:
Journal Name:
Artificial Intelligence for Human-Robot Interaction (AI-HRI) Fall Symposium
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Augmented Reality (AR) technologies present an exciting new medium for human-robot interactions, enabling new opportunities for both implicit and explicit human-robot communication. For example, these technologies enable physically-limited robots to execute non-verbal interaction patterns such as deictic gestures despite lacking the physical morphology necessary to do so. However, a wealth of HRI research has demonstrated real benefits to physical embodiment (compared to, e.g., virtual robots on screens), suggesting that AR augmentation of virtual robot parts could face challenges. In this work, we present empirical evidence comparing the use of virtual (AR) and physical arms to perform deictic gestures that identify virtual or physical referents. Our subjective and objective results demonstrate the success of mixed-reality deictic gestures in overcoming these potential limitations, and their successful use regardless of differences in physicality between gesture and referent. These results help to motivate the further deployment of mixed-reality robotic systems and provide nuanced insight into the role of mixed-reality technologies in HRI contexts.
  2. Advances in robotics have contributed to the prevalence of human-robot collaboration (HRC). Working and interacting with collaborative robots in close proximity can be psychologically stressful. Therefore, it is important to understand the impacts of human-robot interaction (HRI) on mental stress to promote psychological well-being in the workplace. To this end, this study investigated how HRI presence, complexity, and modality affect psychological stress in humans and discussed possible HRI design criteria for HRC. An experimental setup was implemented in which human operators worked with a collaborative robot on a Lego assembly task, using different interaction paradigms involving pressing buttons, showing hand gestures, and giving verbal commands. The NASA Task Load Index, as a subjective measure, and the physiological galvanic skin conductance response, as an objective measure, were used to assess levels of mental stress. The results revealed that the introduction of interactions during HRC helped reduce mental stress and that complex interactions resulted in higher mental stress than simple interactions. Meanwhile, the use of certain interaction modalities, such as verbal commands or hand gestures, led to significantly higher mental stress than pressing buttons, while no significant difference in mental stress was found between showing hand gestures and giving verbal commands. (A hedged sketch of this kind of two-measure comparison appears after this list.)
  3. Users play an integral role in the performance of many robotic systems, and robotic systems must account for differences in users to improve collaborative performance. Much of the work on adapting to users has focused on designing teleoperation controllers that adjust to extrinsic user indicators such as force or intent, but do not adjust to intrinsic user qualities. In contrast, the Human-Robot Interaction community has extensively studied intrinsic user qualities, but its results may not rapidly be fed back into autonomy design. Here we provide foundational evidence for a new strategy that augments current shared control, and we provide a mechanism to directly feed results from the HRI community back into autonomy design. Our evidence is based on a study examining the impact of the user quality "locus of control" on telepresence robot performance. Our results support our hypothesis that key user qualities can be inferred from human-robot interactions (such as through path deviation or time to completion) and that switching or adaptive autonomies might improve shared-control performance. (A hedged sketch of this inference-and-switching strategy appears after this list.)
  4. In this work, we present Robots for Social Justice (R4SJ): a framework for an equitable engineering practice of Human-Robot Interaction, grounded in the Engineering for Social Justice (E4SJ) framework for Engineering Education and intended to complement existing frameworks for guiding equitable HRI research. To understand the new insights this framework could provide to the field of HRI, we analyze the past decade of papers published at the ACM/IEEE International Conference on Human-Robot Interaction, and examine how well current HRI research aligns with the principles espoused in the E4SJ framework. Based on the gaps identified through this analysis, we make five concrete recommendations, and highlight key questions that can guide the introspection for engineers, designers, and researchers. We believe these considerations are a necessary step not only to ensure that our engineering education efforts encourage students to engage in equitable and societally beneficial engineering practices (the purpose of E4SJ), but also to ensure that the technical advances we present at conferences like HRI promise true advances to society, and not just to fellow researchers and engineers. 
  5. This paper presents an intensive case study of 10 participants in the US and South Korea interacting with a robotic companion pet in their own homes over the course of several weeks. Participants were tracked every second of every day during that period. The fundamental goal was to determine whether there were significant differences in the types of interactions that occurred across those cultural settings, and how those differences affected modeling of the human-robot interactions. We collected a mix of quantitative and qualitative data through sensors onboard the robot, ecological momentary assessment (EMA), and participant interviews. Results showed that there were significant differences in how participants in Korea interacted with the robotic pet relative to participants in the US, which impacted machine learning and deep learning models of the interactions. Moreover, those differences were connected to differences in participant perceptions of the robot based on the qualitative interviews. The work here suggests that it may be necessary to develop culturally-specific models and/or sensor suites for human-robot interaction (HRI) in the future, and that simply adapting the same robot's behavior through cultural homophily may be insufficient. (A hedged sketch of the culture-specific vs. pooled modeling question appears after this list.)
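For item 2 above, the following is a minimal sketch, on invented data, of how one might compare a subjective workload measure (NASA Task Load Index) with an objective physiological measure (galvanic skin response) across two interaction modalities. The values and the paired t-test choice are illustrative assumptions, not the study's actual pipeline.

```python
# Hedged sketch (not the study's code): paired comparison of subjective and
# objective stress measures across two interaction modalities. Data are invented.
import numpy as np
from scipy import stats

# Hypothetical per-participant measures for two modalities.
tlx_buttons  = np.array([35, 40, 28, 45, 38])    # NASA-TLX scores (0-100), button presses
tlx_gestures = np.array([52, 60, 49, 58, 55])    # NASA-TLX scores, hand gestures

gsr_buttons  = np.array([0.8, 1.1, 0.7, 1.0, 0.9])   # skin conductance responses (microsiemens)
gsr_gestures = np.array([1.6, 1.9, 1.4, 1.8, 1.7])

# Paired t-tests: did the gesture modality raise stress on both measures?
t_tlx, p_tlx = stats.ttest_rel(tlx_gestures, tlx_buttons)
t_gsr, p_gsr = stats.ttest_rel(gsr_gestures, gsr_buttons)
print(f"NASA-TLX: t={t_tlx:.2f}, p={p_tlx:.3f}")
print(f"GSR:      t={t_gsr:.2f}, p={p_gsr:.3f}")
```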
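For item 3 above, here is a minimal sketch of the adaptation strategy it describes: infer a coarse user quality from interaction statistics such as path deviation and time to completion, then switch the shared-control blending accordingly. The thresholds, the autonomy levels, and the linear blending rule are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch: switching shared-control autonomy based on inferred user quality.
from dataclasses import dataclass

@dataclass
class InteractionStats:
    path_deviation_m: float       # mean deviation from a reference path (meters)
    time_to_completion_s: float   # task completion time (seconds)

def infer_autonomy_level(stats: InteractionStats) -> float:
    """Return the fraction of control given to autonomy (0 = fully manual)."""
    # Assumption: users who deviate more and take longer benefit from more assistance.
    if stats.path_deviation_m > 0.5 or stats.time_to_completion_s > 120:
        return 0.7   # high assistance
    if stats.path_deviation_m > 0.2:
        return 0.4   # moderate assistance
    return 0.1       # mostly manual

def blend_command(user_cmd: float, robot_cmd: float, autonomy: float) -> float:
    """Linear shared-control blending of user and autonomous commands."""
    return (1.0 - autonomy) * user_cmd + autonomy * robot_cmd
```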
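For item 5 above, a minimal sketch of the modeling question, using invented data and an off-the-shelf classifier: fit one pooled interaction model across both cultural settings versus separate per-culture models, and compare cross-validated accuracy. The feature dimensions, classifier choice, and all data are illustrative assumptions, not the study's pipeline.

```python
# Hedged sketch (invented data, not the study's models): pooled vs. per-culture
# classifiers for predicting an interaction outcome from sensor features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X_us, y_us = rng.normal(0.0, 1.0, (100, 5)), rng.integers(0, 2, 100)  # hypothetical US data
X_kr, y_kr = rng.normal(0.5, 1.0, (100, 5)), rng.integers(0, 2, 100)  # hypothetical KR data

# One model trained on pooled data vs. one model per cultural setting.
pooled  = cross_val_score(LogisticRegression(), np.vstack([X_us, X_kr]),
                          np.concatenate([y_us, y_kr]), cv=5).mean()
us_only = cross_val_score(LogisticRegression(), X_us, y_us, cv=5).mean()
kr_only = cross_val_score(LogisticRegression(), X_kr, y_kr, cv=5).mean()
print(f"pooled={pooled:.2f}, US-only={us_only:.2f}, KR-only={kr_only:.2f}")
```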