The human-robot interaction (HRI) field has recognized the importance of enabling robots to interact with teams. Human teams rely on effective communication for successful collaboration in time-sensitive environments, and robots can enhance team coordination through real-time assistance. Despite significant progress in human-robot teaming research, an essential gap remains in how robots can effectively communicate with action teams using multimodal interaction cues in time-sensitive environments. We address this knowledge gap with an experimental in-lab study investigating how multimodal robot communication in action teams affects workload and human perception of robots. We explore team collaboration in a medical training scenario in which a robotic crash cart (RCC) provides verbal and non-verbal cues to help users remember to perform iterative tasks and search for supplies. Our findings show that verbal cues for object search tasks and visual cues for task reminders reduce team workload and increase perceived ease of use and perceived usefulness more effectively than a robot with no feedback. Our work contributes to multimodal interaction research in the HRI field, highlighting the need for more human-robot teaming research to establish best practices for integrating collaborative robots into time-sensitive environments such as hospitals, search and rescue, and manufacturing.
                    
                            
                            A MultiModal Social Robot Toward Personalized Emotion Interaction
                        
                    
    
Human emotions are expressed through multiple modalities, including verbal and non-verbal information. Moreover, a user's affective state can indicate the level of engagement and the success of an interaction, making it a natural reward signal the robot can use to optimize its behavior through interaction. This study demonstrates a multimodal human-robot interaction (HRI) framework with reinforcement learning that improves the robot's interaction policy and personalizes emotional interaction for a human user. The goal is to apply this framework in social scenarios, enabling robots to generate more natural and engaging interactions.
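The core idea above — treating estimated user affect as the reward in a reinforcement-learning loop — can be sketched with a minimal tabular Q-learning example. This is a hypothetical illustration, not the paper's implementation: the action set, the state labels, and the simulated `observe_valence` function (standing in for a real multimodal affect estimator) are all assumptions.

```python
import random
from collections import defaultdict

# Hypothetical sketch: tabular Q-learning where the robot's reward is the
# user's estimated affective valence. All names are illustrative.
ACTIONS = ["greet", "joke", "ask_question", "stay_quiet"]
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

def observe_valence(emotion, action):
    """Stand-in for a real multimodal affect estimator (returns -1..1)."""
    preferred = {"happy": "joke", "sad": "ask_question", "neutral": "greet"}
    return 1.0 if preferred.get(emotion) == action else -0.1

def choose_action(q, state):
    """Epsilon-greedy action selection over the tabular Q-values."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q[(state, a)])

def update(q, state, action, reward, next_state):
    """Standard one-step Q-learning update."""
    best_next = max(q[(next_state, a)] for a in ACTIONS)
    q[(state, action)] += ALPHA * (reward + GAMMA * best_next - q[(state, action)])

random.seed(0)
q = defaultdict(float)
state = "neutral"
for _ in range(500):
    action = choose_action(q, state)
    reward = observe_valence(state, action)        # affect as reward
    next_state = random.choice(["happy", "sad", "neutral"])
    update(q, state, action, reward, next_state)
    state = next_state

# Greedy (personalized) behavior learned for a "happy" user:
print(max(ACTIONS, key=lambda a: q[("happy", a)]))
```

In a real system the simulated valence function would be replaced by an online emotion recognizer fusing speech, facial, and gestural channels, and the tabular policy by a function approximator.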
        
    
                            - Award ID(s):
- 1846658
- PAR ID:
- 10316814
- Date Published:
- Journal Name:
- Artificial Intelligence for Human-Robot Interaction (AI-HRI) Fall Symposium
- Format(s):
- Medium: X
- Sponsoring Org:
- National Science Foundation
More Like this
- 
Advances in robotics have contributed to the prevalence of human-robot collaboration (HRC). Working and interacting with collaborative robots in close proximity can be psychologically stressful, so it is important to understand the impacts of human-robot interaction (HRI) on mental stress to promote psychological well-being in the workplace. To this end, this study investigated how HRI presence, complexity, and modality affect psychological stress in humans and discussed possible HRI design criteria for HRC. In an experimental setup, human operators worked with a collaborative robot on a Lego assembly task using different interaction paradigms: pressing buttons, showing hand gestures, and giving verbal commands. The NASA Task Load Index (NASA-TLX), as a subjective measure, and the physiological galvanic skin response, as an objective measure, were used to assess levels of mental stress. The results revealed that introducing interactions during HRC helped reduce mental stress and that complex interactions resulted in higher mental stress than simple ones. The use of certain interaction modalities, such as verbal commands or hand gestures, led to significantly higher mental stress than pressing buttons, while no significant difference in mental stress was found between showing hand gestures and giving verbal commands.
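For readers unfamiliar with the subjective measure named above, the NASA-TLX weighted workload score combines six subscale ratings (0–100) with weights obtained from 15 pairwise comparisons between subscales, so the weights always sum to 15. A small sketch, with hypothetical participant numbers:

```python
# Illustrative computation of the NASA-TLX weighted workload score.
# Ratings are 0-100 per subscale; weights are each subscale's tally
# from the 15 pairwise comparisons (tallies sum to 15).
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def tlx_weighted(ratings, weights):
    """Weighted NASA-TLX score: sum(rating_i * weight_i) / 15."""
    assert sum(weights.values()) == 15, "pairwise tallies must sum to 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Hypothetical participant data:
ratings = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 30, "effort": 60, "frustration": 45}
weights = {"mental": 5, "physical": 0, "temporal": 3,
           "performance": 2, "effort": 4, "frustration": 1}

score = tlx_weighted(ratings, weights)
print(round(score, 2))  # → 57.33
```

The unweighted ("raw TLX") variant simply averages the six ratings; studies report either form, so the weighting procedure is worth stating explicitly.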
- 
The study examines the relationship between the Big Five personality traits (extraversion, agreeableness, conscientiousness, neuroticism, and openness) and robot likeability and successful HRI implementation in varying human-robot interaction (HRI) situations. Further, this research investigates the influence of human-like attributes in robots (i.e., robotic anthropomorphism) on the likeability of robots. The research found that robotic anthropomorphism positively influences the relationship between human personality variables (e.g., extraversion and agreeableness) and robot likeability when humans interact with social robots, and that it positively influences the relationship between extraversion and robot likeability during industrial robotic interactions. Extraversion, agreeableness, and neuroticism were found to play a significant role. This research bridges the gap by providing an in-depth understanding of the Big Five personality traits, robotic anthropomorphism, and robot likeability in social-collaborative robotics.
- 
Augmented Reality (AR) technologies present an exciting new medium for human-robot interactions, enabling new opportunities for both implicit and explicit human-robot communication. For example, these technologies enable physically limited robots to execute non-verbal interaction patterns such as deictic gestures despite lacking the physical morphology necessary to do so. However, a wealth of HRI research has demonstrated real benefits to physical embodiment (compared to, e.g., virtual robots on screens), suggesting that AR augmentation of virtual robot parts could face challenges. In this work, we present empirical evidence comparing the use of virtual (AR) and physical arms to perform deictic gestures that identify virtual or physical referents. Our subjective and objective results demonstrate the success of mixed-reality deictic gestures in overcoming these potential limitations, and their successful use regardless of differences in physicality between gesture and referent. These results help motivate the further deployment of mixed-reality robotic systems and provide nuanced insight into the role of mixed-reality technologies in HRI contexts.
- 
Effective human-robot interaction is increasingly vital across various domains, including assistive robotics, emotional communication, entertainment, and industrial automation. Visual feedback, a common feature of current interfaces, may not be suitable for all environments. Audio feedback serves as a critical supplementary communication layer in settings where visibility is low or where robotic operations generate extensive data. Sonification, which transforms a robot's trajectory, motion, and environmental signals into sound, enhances users' comprehension of robot behavior; this improved understanding fosters more effective, safe, and reliable human-robot interaction (HRI). The benefits of auditory data sonification are evident in real-world applications such as industrial assembly, robot-assisted rehabilitation, and interactive robotic exhibitions, where it promotes cooperation, boosts performance, and heightens engagement. Beyond conventional HRI environments, auditory data sonification shows substantial potential for managing complex robotic systems and intricate structures, such as hyper-redundant robots and robotic teams, which often challenge operators with complex joint monitoring, mathematical kinematic modeling, and visual behavior verification. This dissertation explores the sonification of motion in hyper-redundant robots and teams of industrial robots. It applies the Wave Space Sonification (WSS) framework developed by Hermann to the motion datasets of protein molecules modeled as hyper-redundant mechanisms with numerous rigid nano-linkages, developing a sonification methodology for the dihedral-angle folding trajectories of protein molecules. It further introduces a novel approach for the systematic sonification of robotic motion across varying configurations: by employing localized wave fields oriented within the robots' configuration space, this methodology generates auditory outputs with specific timbral qualities as robots move through predefined configurations or along certain trajectories. Additionally, the dissertation examines a team of wheeled industrial/service robots whose motion patterns are sonified using sinusoidal vibratory sounds, demonstrating the practical applications and benefits of this approach.
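The general parameter-mapping idea underlying motion sonification can be sketched briefly: map a joint-angle trajectory onto the instantaneous frequency of a sinusoidal tone. This is a generic illustration under assumed names and ranges, not Hermann's Wave Space Sonification; the sample rate, frequency range, and mapping are all choices made for the example.

```python
import math

# Hypothetical sketch of motion sonification: a robot joint-angle
# trajectory modulates the frequency of a sine tone.
SAMPLE_RATE = 8000           # audio sample rate in Hz (assumed)
F_MIN, F_MAX = 220.0, 880.0  # map the joint range onto two octaves (assumed)

def angle_to_freq(angle, lo=-math.pi, hi=math.pi):
    """Linearly map a joint angle in [lo, hi] to an audible frequency."""
    t = (angle - lo) / (hi - lo)
    return F_MIN + t * (F_MAX - F_MIN)

def sonify(trajectory, seconds_per_sample=0.05):
    """Render a joint trajectory as PCM samples of a frequency-varying sine.

    Phase is accumulated across segments so the tone stays continuous
    even as the frequency changes between trajectory samples.
    """
    samples, phase = [], 0.0
    n = int(SAMPLE_RATE * seconds_per_sample)  # audio samples per motion sample
    for angle in trajectory:
        freq = angle_to_freq(angle)
        for _ in range(n):
            phase += 2 * math.pi * freq / SAMPLE_RATE
            samples.append(math.sin(phase))
    return samples

# A slow sweep of one joint from -pi/2 to +pi/2 (21 motion samples):
traj = [-math.pi / 2 + i * math.pi / 20 for i in range(21)]
audio = sonify(traj)
print(len(audio), round(angle_to_freq(0.0), 1))  # → 8400 550.0
```

A rising pitch then directly conveys the joint moving through its range; for multi-joint or multi-robot systems, each degree of freedom could drive a separate partial, which is where timbre-based schemes like WSS become useful.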