In Human–Robot Interaction, researchers typically rely on in-person studies to collect subjective perceptions of a robot. In addition, videos of interactions and interactive simulations (where participants control an avatar that interacts with a robot in a virtual world) have been used to collect human feedback quickly and at scale. How do human perceptions of robots compare across these methodologies? To investigate this question, we conducted a 2x2 between-subjects study (N=160) that evaluated the effect of the interaction environment (Real vs. Simulated environment) and of participants' interactivity during human-robot encounters (Interactive participation vs. Video observations) on perceptions of a robot (competence, discomfort, social presentation, and social information processing) for the task of navigating in concert with people. We also studied participants' workload across the experimental conditions. Our results revealed a significant difference in perceptions of the robot between the real and simulated environments. Furthermore, our results showed differences in human perceptions when people watched a video of an encounter versus taking part in the encounter. Finally, we found that simulated interactions and videos of simulated encounters resulted in a higher workload than real-world encounters and videos thereof. Our results suggest that findings from video and simulation methodologies may not always translate to real-world human–robot interactions. To help practitioners leverage the lessons from this study and future researchers expand our knowledge in this area, we provide guidelines for weighing the tradeoffs between different methodologies.
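As a rough illustration of how a 2x2 between-subjects design like this one can be analyzed, the sketch below runs a two-way ANOVA on synthetic competence ratings. The column names, cell means, and data are invented for illustration and are not from the study.

```python
# Illustrative only: synthetic data for a 2x2 between-subjects design
# (environment: real vs. simulated; interactivity: interactive vs. video).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
rows = []
for env in ["real", "simulated"]:
    for mode in ["interactive", "video"]:
        # 40 participants per cell (N = 160), ratings on a 1-7 scale (assumed)
        mean = 5.0 + (0.6 if env == "real" else 0.0) + (0.3 if mode == "interactive" else 0.0)
        ratings = np.clip(rng.normal(mean, 1.0, size=40), 1, 7)
        rows += [{"environment": env, "interactivity": mode, "competence": r} for r in ratings]

df = pd.DataFrame(rows)
model = smf.ols("competence ~ C(environment) * C(interactivity)", data=df).fit()
print(anova_lm(model, typ=2))  # main effects of each factor plus their interaction
```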
A Framework for Dyadic Physical Interaction Studies during Ankle Motor Tasks
Over the past few decades, there have been many studies of human-human physical interaction to better understand why humans physically interact so effectively and how dyads outperform individuals in certain motor tasks. Because of the different methodologies and experimental setups in these studies, however, it is difficult to draw general conclusions about the reasons for this improved performance. In this study, we propose an open-source experimental framework for the systematic study of the effect of human-human interaction, as mediated by robots, at the ankle joint. We also propose a framework to study various interactive behaviors (i.e., collaborative, cooperative, and competitive tasks) that can be emulated using a virtual spring connecting human pairs. To validate the proposed experimental framework, we perform a transparency analysis, which is closely related to haptic rendering performance. We compare muscle EMG and ankle motion data while subjects are barefoot, attached to the unpowered robot, and attached to the powered robot implementing transparency control. We also validate the performance in rendering a virtual spring over a range of stiffness values (5-50 Nm/rad) while the subjects track several desired trajectories (sine waves at frequencies between 0.1 and 1.1 Hz). Finally, we study the performance of the system in human-human interaction under nine different interactive conditions, demonstrating the feasibility of the system for studying human-human interaction under different interactive behaviors.
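To make the virtual-spring coupling concrete, here is a minimal sketch of how a stiffness in the reported 5-50 Nm/rad range could be rendered between two ankle robots. The function name and the example values are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def virtual_spring_torques(theta_a, theta_b, k):
    """Torque commands (Nm) coupling two ankle robots through a virtual spring.

    theta_a, theta_b: measured ankle angles of the two subjects (rad)
    k: virtual spring stiffness (Nm/rad), e.g. 5-50 Nm/rad as in the validation
    """
    tau_a = k * (theta_b - theta_a)  # each subject is pulled toward the partner's angle
    tau_b = k * (theta_a - theta_b)  # equal and opposite torque on the other side
    return tau_a, tau_b

# Example: both subjects track a sine-wave target (0.5 Hz, within the 0.1-1.1 Hz range)
t = np.linspace(0.0, 10.0, 1000)
target = 0.2 * np.sin(2 * np.pi * 0.5 * t)            # desired ankle angle (rad)
tau_a, tau_b = virtual_spring_torques(0.15, 0.22, k=25.0)
print(tau_a, tau_b)
```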
- Award ID(s): 2024488
- PAR ID: 10276956
- Date Published:
- Journal Name: IEEE Robotics and Automation Letters
- ISSN: 2377-3766
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Wilde, N.; Alonso-Mora, J.; Brown, D.; Mattson, C.; Sycara, K. (Eds.) In this paper, we introduce an innovative approach to multi-human robot interaction, leveraging the capabilities of omnicopters. These agile aerial vehicles are poised to revolutionize haptic feedback by offering complex sensations through movements with six degrees of freedom (6DoF). Unlike traditional systems, our envisioned method enables haptic rendering without the need for tilt, offering a more intuitive and seamless interaction experience. Furthermore, we propose using omnicopter swarms in human-robot interaction; these omnicopters can collaboratively emulate and render intricate objects in real time. This swarm-based rendering not only expands the realm of tangible human-robot interactions but also holds potential in diverse applications, from immersive virtual environments to tactile guidance in physical tasks. Our vision outlines a future where robots and humans interact in more tangible and sophisticated ways, pushing the boundaries of current haptic technology.
- Rehabilitation of human motor function is an issue of growing significance, and human-interactive robots offer promising potential to meet the need. For the lower extremity, however, robot-aided therapy has proven challenging. To inform effective approaches to robotic gait therapy, it is important to better understand unimpaired locomotor control: its sensitivity to different mechanical contexts and its response to perturbations. The present study evaluated the behavior of 14 healthy subjects who walked on a motorized treadmill and overground while wearing an exoskeletal ankle robot. Their response to a periodic series of ankle plantar flexion torque pulses, delivered at periods different from, but sufficiently close to, their preferred stride cadence, was assessed to determine whether gait entrainment occurred, how it differed across conditions, and whether the adapted motor behavior persisted after the perturbation. Certain aspects of locomotor control were exquisitely sensitive to walking context, while others were not. Gaits entrained more often and more rapidly during overground walking, yet, in all cases, entrained gaits synchronized the torque pulses with ankle push-off, where they provided assistance with propulsion. Furthermore, subjects entrained to perturbation periods that required an adaptation toward slower cadence, even though the pulses acted to accelerate gait, indicating a neural adaptation of locomotor control. Lastly, during 15 post-perturbation strides, the entrained gait period was observed to persist more frequently during overground walking. This persistence was correlated with the number of strides walked at the entrained gait period (i.e., longer exposure), which also indicated a neural adaptation. NEW & NOTEWORTHY: We show that the response of human locomotion to physical interaction differs between treadmill and overground walking. Subjects entrained to a periodic series of ankle plantar flexion torque pulses that shifted their gait cadence, synchronizing ankle push-off with the pulses (so that they assisted propulsion) even when gait cadence slowed. Entrainment was faster overground and, on removal of the torque pulses, the entrained gait period persisted more prominently overground, indicating a neural adaptation of locomotor control.
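A hedged sketch of how entrainment could be quantified from such data: it estimates the phase of each torque pulse within the concurrent gait cycle and checks whether that phase settles near ankle push-off. The event timings and the push-off phase range are made up for illustration and are not the study's analysis code.

```python
import numpy as np

def pulse_phases(heel_strikes, pulse_times):
    """Phase (0-1) of each torque pulse within the gait cycle containing it."""
    phases = []
    for tp in pulse_times:
        i = np.searchsorted(heel_strikes, tp) - 1      # last heel strike before the pulse
        if 0 <= i < len(heel_strikes) - 1:
            stride = heel_strikes[i + 1] - heel_strikes[i]
            phases.append((tp - heel_strikes[i]) / stride)
    return np.array(phases)

# Illustrative event times (s): stride period drifts toward the 1.1 s pulse period
heel_strikes = np.cumsum(np.r_[0.0, np.linspace(1.0, 1.1, 30)])
pulse_times = np.arange(0.5, heel_strikes[-1], 1.1)

phases = pulse_phases(heel_strikes, pulse_times)
# Entrainment: late-trial pulse phase converges to a constant value near push-off
print("final pulse phases:", np.round(phases[-5:], 2))
```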
- An important problem in designing human-robot systems is the integration of human intent and performance into the robotic control loop, especially in complex tasks. Bimanual coordination is a complex human behavior that is critical in many fine motor tasks, including robot-assisted surgery. To fully leverage the capabilities of the robot as an intelligent and assistive agent, online recognition of bimanual coordination could be important. Robotic assistance for a suturing task, for example, will be fundamentally different during phases when the suture is wrapped around the instrument (i.e., making a C-loop) than when the ends of the suture are pulled apart. In this study, we develop an online recognition method for bimanual coordination modes (i.e., the directions and symmetries of right and left hand movements) using geometric descriptors of hand motion. We (1) develop this framework based on ideal trajectories obtained during virtual 2D bimanual path-following tasks performed by human subjects operating Geomagic Touch haptic devices, (2) test the offline recognition accuracy of bimanual direction and symmetry from human subject movement trials, and (3) evaluate how the framework can be used to characterize 3D trajectories of the da Vinci Surgical System's surgeon-side manipulators during bimanual surgical training tasks. In the human subject trials, our geometric bimanual movement classification accuracy was 92.3% for movement direction (i.e., hands moving together, parallel, or away) and 86.0% for symmetry (e.g., mirror or point symmetry). We also show that this approach can be used for online classification of different bimanual coordination modes during needle transfer, C-loop making, and suture pulling gestures on the da Vinci system, with results matching the expected modes. Finally, we discuss how these online estimates are sensitive to task environment factors and surgeon expertise, and thus inspire future work that could leverage adaptive control strategies to enhance user skill during robot-assisted surgery.
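A minimal sketch of one way geometric descriptors could separate the direction modes named above (hands moving together, parallel, or away); the velocity-based rule and thresholds are assumptions for illustration, not the authors' classifier.

```python
import numpy as np

def classify_direction(pos_left, pos_right, dt=0.01, tol=0.2):
    """Label a short window of left/right hand trajectories (N x 2 arrays).

    Compares mean hand speed with the rate of change of inter-hand distance:
    'together' if the hands approach each other, 'away' if they separate,
    'parallel' if the distance stays roughly constant while both hands move.
    """
    v_l = np.gradient(pos_left, dt, axis=0).mean(axis=0)
    v_r = np.gradient(pos_right, dt, axis=0).mean(axis=0)
    d = np.linalg.norm(pos_right - pos_left, axis=1)
    d_rate = (d[-1] - d[0]) / (len(d) * dt)            # closing/opening speed
    speed = 0.5 * (np.linalg.norm(v_l) + np.linalg.norm(v_r))
    if abs(d_rate) < tol * speed:
        return "parallel"
    return "together" if d_rate < 0 else "away"

# Example: both hands translate in the same direction -> 'parallel'
t = np.linspace(0, 1, 101)[:, None]
left = np.hstack([t, 0 * t])
right = np.hstack([t, 0 * t + 0.3])
print(classify_direction(left, right))
```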
- The primary goal of an assist-as-needed (AAN) controller is to maximize subjects' active participation during motor training tasks while allowing moderate tracking errors to encourage human learning of a target movement. Impedance control is typically employed by AAN controllers to create a compliant force field around the desired motion trajectory. To accommodate different individuals with varying motor abilities, most existing AAN controllers require extensive manual tuning of the control parameters, resulting in a tedious and time-consuming process. In this paper, we propose a reinforcement learning AAN controller that can autonomously reshape the force field in real time based on subjects' training performance. The use of action-dependent heuristic dynamic programming enables a model-free implementation of the proposed controller. To experimentally validate the controller, a group of healthy individuals participated in a gait training session in which they were asked to learn a modified gait pattern with the help of a powered ankle-foot orthosis. The results indicate the potential of the proposed control strategy for robot-assisted gait training.
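As a rough sketch of the compliant force field that such an AAN controller adapts, the snippet below computes an assistive torque from the trajectory error, with a deadband that preserves active participation and gains that a learning agent could update online. The gains, deadband, and names are illustrative assumptions rather than the paper's heuristic dynamic programming implementation.

```python
import numpy as np

def aan_torque(theta, theta_dot, theta_des, theta_des_dot,
               k=30.0, b=2.0, deadband=0.05):
    """Assist-as-needed torque (Nm) from a compliant force field (illustrative).

    Inside the deadband (rad) around the desired trajectory no assistance is
    given, encouraging the subject's own effort; outside it, a spring-damper
    field pulls the ankle back toward the target. A reinforcement learning
    agent could adjust k (and the deadband) based on tracking performance.
    """
    err = theta_des - theta
    if abs(err) < deadband:
        return 0.0
    err = err - np.sign(err) * deadband            # penalize only error beyond the band
    return k * err + b * (theta_des_dot - theta_dot)

# Example: ankle lags 0.15 rad behind the target -> assistive torque toward the target
print(aan_torque(theta=0.05, theta_dot=0.0, theta_des=0.20, theta_des_dot=0.1))
```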