Effective human-robot interaction is increasingly vital across various domains, including assistive robotics, emotional communication, entertainment, and industrial automation. Visual feedback, a common feature of current interfaces, may not be suitable for all environments. Audio feedback serves as a critical supplementary communication layer in settings where visibility is low or where robotic operations generate extensive data. Sonification, which transforms a robot's trajectory, motion, and environmental signals into sound, enhances users' comprehension of robot behavior. This improved understanding fosters more effective, safe, and reliable human-robot interaction (HRI). The benefits of sonification are evident in real-world applications such as industrial assembly, robot-assisted rehabilitation, and interactive robotic exhibitions, where it promotes cooperation, boosts performance, and heightens engagement. Beyond conventional HRI environments, sonification shows substantial potential for managing complex robotic systems and intricate structures, such as hyper-redundant robots and robotic teams. These systems often challenge operators with complex joint monitoring, mathematical kinematic modeling, and visual behavior verification. This dissertation explores the sonification of motion in hyper-redundant robots and teams of industrial robots. It delves into the Wave Space Sonification (WSS) framework developed by Hermann, applying it to the motion datasets of protein molecules modeled as hyper-redundant mechanisms with numerous rigid nano-linkages. This research leverages the WSS framework to develop a sonification methodology for the dihedral-angle folding trajectories of protein molecules. Furthermore, it introduces a novel approach for the systematic sonification of robotic motion across varying configurations. By employing localized wave fields oriented within the robots' configuration space, this methodology generates auditory outputs with specific timbral qualities as robots move through predefined configurations or along certain trajectories. Additionally, the dissertation examines a team of wheeled industrial/service robots whose motion patterns are sonified using sinusoidal vibratory sounds, demonstrating the practical applications and benefits of this approach.
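The wave-field idea described above can be sketched in a few lines: a Gaussian window localizes a plane wave in configuration space, and scanning the field along a motion trajectory yields the audio signal. This is a minimal illustration under stated assumptions, not Hermann's exact WSS formulation; the function and parameter names (`wss_sonify`, `cycles`, `sigma`) are hypothetical.

```python
import numpy as np

def wss_sonify(traj, center, direction, sigma=0.5, cycles=60.0, fs=44100, dur=2.0):
    """Scan a localized wave field along a configuration-space trajectory.

    traj:      (N, D) array of configurations (e.g. dihedral angles)
    center:    (D,) point where the wave field is localized
    direction: (D,) orientation of the plane wave inside the field
    Returns an audio-rate signal of length fs * dur.
    """
    traj = np.asarray(traj, dtype=float)
    t_src = np.linspace(0.0, 1.0, len(traj))
    t_audio = np.linspace(0.0, 1.0, int(fs * dur))
    # Resample the (slow) motion data up to audio rate, one dimension at a time.
    q = np.stack([np.interp(t_audio, t_src, traj[:, d])
                  for d in range(traj.shape[1])], axis=1)
    diff = q - np.asarray(center, dtype=float)
    # Gaussian window localizes the field around `center` ...
    envelope = np.exp(-np.sum(diff**2, axis=1) / (2.0 * sigma**2))
    # ... and a plane wave along `direction` gives the output its timbre.
    phase = 2.0 * np.pi * cycles * (diff @ np.asarray(direction, dtype=float))
    return envelope * np.sin(phase)
```

A trajectory passing through `center` then produces sound that swells as the configuration approaches the field and fades as it leaves, which is the qualitative behavior the abstract describes.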
RoboART: Artistic Robot Programming in Mixed Reality
Articulated robots are attracting the attention of artists worldwide. Due to their precise, tireless, and efficient nature, robots are now being deployed in different forms of creative expression, such as sculpting, choreography, immersive environments, and cinematography. While there is a growing interest among artists in robotics, programming such machines is a challenge for most professionals in the field, as robots require extensive coding experience and are primarily designed for industrial applications and environments. To enable artists to incorporate robots in their projects, we propose an end-user-friendly robot programming solution using an intuitive spatial computing environment designed for Microsoft HoloLens 2. In our application, the robot movements are synchronized with a hologram via network communication. Using natural hand gestures, users can manipulate, animate, and record the hologram similar to 3D animation software, including the advantages of mixed reality interaction. Our solution not only gives artists the ability to translate their creative ideas and movements to an industrial machine but also makes human-robot interaction safer, as robots can now be accurately and effectively operated from a distance. We consider this an important step in a more human-driven robotics community, allowing creators without robot programming experience to easily script and perform complex sequences of robotic movement in service of new arts applications. Making robots more collaborative and safer for humans to interact with dramatically increases their utility, exposure, and potential for social interaction, opens new markets, expands creative industries, and directly locates them in highly visible public spaces.
- Award ID(s):
- 2024561
- PAR ID:
- 10566467
- Publisher / Repository:
- IEEE
- Date Published:
- ISBN:
- 979-8-3503-7449-0
- Page Range / eLocation ID:
- 1192 to 1193
- Format(s):
- Medium: X
- Location:
- Orlando, FL, USA
- Sponsoring Org:
- National Science Foundation
More Like this
-
Despite the inherent need for enhancing human-robot interaction (HRI) by non-visually communicating robotic movements and intentions, the application of sonification (the translation of data into audible information) within the field of robotics remains underexplored. This paper investigates the problem of designing sonification algorithms that translate the motion of teams of industrial mobile robots to non-speech sounds. Our proposed solution leverages the wave space sonification (WSS) framework and utilizes localized wave fields with specific orientations within the system configuration space. This WSS-based algorithm generates sounds from the motion data of mobile robots so that the resulting audio exhibits a chosen timbre when the robots pass near designated configurations or move along desired directions. To demonstrate its versatility, the WSS-based sonification algorithm is applied to a team of OMRON LD series autonomous mobile robots, sonifying their motion patterns with pure tonal sounds.
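As a rough illustration of pure-tone team sonification, each robot can be assigned a fixed-frequency tone whose loudness rises as the robot nears a designated configuration, with the tones then mixed into one signal. This is a simplified stand-in for the WSS-based algorithm in the paper, not its actual implementation; the names (`team_tones`, `sigma`) and the Gaussian proximity mapping are assumptions made for the sketch.

```python
import numpy as np

def team_tones(positions, goals, freqs, sigma=0.4, fs=22050, dur=1.0):
    """Mix one pure tone per robot; each tone's amplitude grows as the
    robot approaches its designated configuration (hypothetical mapping)."""
    t = np.arange(int(fs * dur)) / fs
    mix = np.zeros_like(t)
    for p, g, f in zip(positions, goals, freqs):
        # Gaussian proximity: amplitude 1 at the goal, near 0 far away.
        dist_sq = np.sum((np.asarray(p, float) - np.asarray(g, float))**2)
        amp = np.exp(-dist_sq / (2.0 * sigma**2))
        mix += amp * np.sin(2.0 * np.pi * f * t)
    # Normalize by team size so the mix stays within [-1, 1].
    return mix / max(len(positions), 1)
```

Calling this once per control cycle with the robots' current positions would yield audio that foregrounds whichever robot is closest to its target configuration.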
-
Wilde N.; Alonso-Mora J.; Brown D.; Mattson C.; Sycara K. (Ed.) In this paper, we introduce an innovative approach to multi-human robot interaction, leveraging the capabilities of omnicopters. These agile aerial vehicles are poised to revolutionize haptic feedback by offering complex sensations with 6 degrees of freedom (6DoF) movements. Unlike traditional systems, our envisioned method enables haptic rendering without the need for tilt, offering a more intuitive and seamless interaction experience. Furthermore, we propose using omnicopter swarms in human-robot interaction; these omnicopters can collaboratively emulate and render intricate objects in real-time. This swarm-based rendering not only expands the realm of tangible human-robot interactions but also holds potential in diverse applications, from immersive virtual environments to tactile guidance in physical tasks. Our vision outlines a future where robots and humans interact in more tangible and sophisticated ways, pushing the boundaries of current haptic technology.
-
An important component for the effective collaboration of humans with robots is the compatibility of their movements, especially when humans physically collaborate with a robot partner. Following previous findings that humans interact more seamlessly with a robot that moves with humanlike or biological velocity profiles, this study examined whether humans can adapt to a robot that violates human signatures. The specific focus was on the role of extensive practice and real-time augmented feedback. Six groups of participants physically tracked a robot tracing an ellipse with profiles where velocity scaled with the curvature of the path in biological and nonbiological ways, while instructed to minimize the interaction force with the robot. Three of the six groups received real-time visual feedback about their force error. Results showed that with three daily practice sessions, when given feedback about their force errors, humans could decrease their interaction forces when the robot's trajectory violated human-like velocity patterns. Conversely, when augmented feedback was not provided, there were no improvements despite this extensive practice. The biological profile showed no improvements, even with feedback, indicating that the (non-zero) force had already reached a floor level. These findings highlight the importance of biological robot trajectories and augmented feedback in guiding humans to adapt to non-biological movements in physical human-robot interaction. These results have implications for various fields of robotics, such as surgical applications and collaborative robots for industry.
-
Physical human–robot collaboration is becoming more common, both in industrial and service robotics. Cooperative execution of a task requires intuitive and efficient interaction between both actors. For humans, this means being able to predict and adapt to robot movements. Given that natural human movement exhibits several robust features, we examined whether human–robot physical interaction is facilitated when these features are considered in robot control. The present study investigated how humans adapt to biological and nonbiological velocity patterns in robot movements. Participants held the end-effector of a robot that traced an elliptic path with either biological (two-thirds power law) or nonbiological velocity profiles. Participants were instructed to minimize the force applied on the robot end-effector. Results showed that the applied force was significantly lower when the robot moved with a biological velocity pattern. With extensive practice and enhanced feedback, participants were able to decrease their force when following a nonbiological velocity pattern, but never reached forces below those obtained with the two-thirds power law profile. These results suggest that some robust features observed in natural human movements are also a strong preference in guided movements. Therefore, such features should be considered in human–robot physical collaboration.
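The biological profile referred to in these abstracts is the two-thirds power law, under which tangential speed scales with path curvature as v ∝ κ^(−1/3), so motion slows in tightly curved segments and speeds up on flatter ones. A minimal sketch of such a profile for an ellipse follows; `ellipse_two_thirds_profile` and its parameters are illustrative names, not from the papers.

```python
import numpy as np

def ellipse_two_thirds_profile(a=2.0, b=1.0, n=1000, gain=1.0):
    """Speed along an ellipse (semi-axes a, b) under the two-thirds power law."""
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    # Curvature of the ellipse (a*cos(theta), b*sin(theta)).
    kappa = (a * b) / ((a * np.sin(theta))**2 + (b * np.cos(theta))**2)**1.5
    # Biological profile: tangential speed proportional to curvature^(-1/3).
    v = gain * kappa**(-1.0 / 3.0)
    return theta, v
```

With a > b, curvature peaks at the ends of the major axis, so the profile is slowest there and fastest along the flatter sides, matching the "velocity scaled with the curvature of the path" manipulation the experiments describe.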

