Title: The Impact of an In-Home Co-Located Robotic Coach in Helping People Make Fewer Exercise Mistakes
Regular exercise provides many mental and physical health benefits. However, when exercises are performed incorrectly, they can lead to injury. Because the COVID-19 pandemic made it challenging to exercise in communal spaces, it accelerated the growth of virtual fitness programs, putting people at risk of sustaining exercise-related injuries as they received little to no feedback on their exercise technique. Co-located robots are one potential enhancement to virtual training programs, as they can produce higher learning gains, greater compliance, and more enjoyment than robots that are not co-located. In this study, we compared the effects of a physically present robot by having participants exercise either with a robot (robot condition) or with a video of a robot displayed on a tablet (tablet condition). Participants (N = 25) had an exercise system in their homes for two weeks. Participants who exercised with the co-located robot made fewer mistakes than those who exercised with the video-displayed robot. Furthermore, participants in the robot condition reported a greater increase in fitness and more motivation to exercise than participants in the tablet condition.
Award ID(s):
1955653 1928448 2106690 1813651
Journal Name:
31st IEEE International Conference on Robot & Human Interactive Communication
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    This paper presents preliminary research on whether children will accept a robot as part of their ingroup, and on how a robot's group membership affects trust, closeness, and social support. Trust is important in human-robot interactions because it affects whether people will follow a robot's advice. In this study, we randomly assigned 11- and 12-year-old participants either to a team with the robot (ingroup) or to play as opponents of the robot (outgroup) in an online game. Thus far, we have eight participants in the ingroup condition. Our preliminary results showed that children had a low level of trust, closeness, and social support with the robot. Participants had a much more negative response than we anticipated. We speculate that an in-person setting, rather than a remote one, would elicit a more positive response.
  2. Human-robot interactions that involve multiple robots are becoming common. It is crucial to understand how multiple robots should transfer information and transition users between them. To investigate this, we designed a 3 × 3 mixed-design study in which participants took part in a navigation task. Participants interacted with a stationary robot that summoned a functional (not explicitly social) mobile robot to guide them. Each participant experienced all three types of robot-robot interaction: representative (the stationary robot spoke to the participant on behalf of the mobile robot), direct (the stationary robot delivered the request to the mobile robot in a straightforward manner), and social (the stationary robot delivered the request to the mobile robot in a social manner). Each participant witnessed only one type of robot-robot communication: silent (the robots communicated covertly), explicit (the robots acknowledged that they were communicating), or reciting (the stationary robot said the request aloud). Our results show that it is possible to instill socialness in, and improve the likability of, a functional robot by having a social robot interact socially with it. We also found that covertly exchanging information is less desirable than reciting information aloud.
  3. Lovable robots in movies regularly beep, chirp, and whirr, yet robots in the real world rarely deploy such sounds. Despite preliminary work supporting the perceptual and objective benefits of intentionally produced robot sound, relatively little research is ongoing in this area. In this paper, we systematically evaluate transformative robot sound across multiple robot archetypes and behaviors. We conducted a series of five online video-based surveys, each with N ≈ 100 participants, to better understand the effects of musician-designed transformative sounds on perceptions of personal, service, and industrial robots. Participants rated robot videos with transformative sound as significantly happier, warmer, and more competent in all five studies, as more energetic in four studies, and as less discomforting in one study. Overall, the results confirmed that transformative sounds consistently improve subjective ratings but may convey affect contrary to the intent of affective robot behaviors. In future work, we will investigate the repeatability of these results through in-person studies and develop methods to automatically generate transformative robot sound. This work may benefit researchers and designers who aim to make robots more favorable to human users.
  4. This study evaluated how a robot demonstrating a Theory of Mind (ToM) influenced human perception of social intelligence and animacy in a human-robot interaction. Data were gathered through an online survey in which participants watched a video depicting a NAO robot either failing or passing the Sally-Anne false-belief task. Participants (N = 60) were randomly assigned to either the Pass or the Fail condition. A Perceived Social Intelligence Survey and the Perceived Intelligence and Animacy subsections of the Godspeed Questionnaire were used as measures. The Godspeed was administered before viewing the task to measure participant expectations, and again afterward to test changes in opinion. Our findings show that robots demonstrating ToM significantly increase perceived social intelligence, while robots demonstrating ToM deficiencies are perceived as less socially intelligent.
  5. As the influence of social robots in people’s daily lives grows, research on understanding people’s perception of robots, including sociability, trust, acceptance, and preference, becomes more pervasive. Research has considered visual, vocal, or tactile cues for expressing robots’ emotions, but little research has taken a holistic view of the interactions among the different factors influencing emotion perception. We investigated multiple facets of user perception of robots during a conversational task by varying the robots’ voice types, appearances, and emotions. In our experiment, 20 participants interacted with two robots having four different voice types. While participants read fairy tales to a robot, the robot gave vocal feedback with seven emotions, and the participants evaluated the robot’s profile through post-task surveys. The results indicate that (1) the accuracy of emotion perception differed depending on the presented emotion, (2) a regular human voice yielded higher user preference and naturalness, (3) a characterized voice was more appropriate for expressing emotions, with significantly higher accuracy in emotion perception, and (4) participants showed significantly higher emotion-recognition accuracy with the animal robot than with the humanoid robot. A follow-up study ([Formula: see text]) with voice-only conditions confirmed the importance of embodiment. The results of this study could provide the guidelines needed to design social robots that consider emotional aspects in conversations between robots and users.