Despite promises about the near-term potential of social robots to share our daily lives, they remain unable to form autonomous, lasting, and engaging relationships with humans. Many companies are deploying social robots into the consumer and commercial market; however, both the companies and their products tend to be short-lived, for many reasons. For example, current social robots succeed in interacting with humans only within controlled environments, such as research labs, and only for short periods, since longer interactions tend to provoke user disengagement. We interviewed 13 roboticists from robot manufacturing companies and research labs to delve deeper into the design process for social robots and unearth the many challenges robot creators face. Our research questions were: 1) What are the different design processes for creating social robots? 2) How are users involved in the design of social robots? 3) How are teams of robot creators constituted? Our qualitative investigation showed that varied design practices are applied when creating social robots, but no consensus exists about an optimal or standard one. Results revealed that users have different degrees of involvement in the robot creation process, from no involvement to being a central part of robot development. Results also uncovered the need for multidisciplinary and international teams to work together to create robots. Drawing upon these insights, we identified implications for the field of Human-Robot Interaction that can shape the creation of best practices for social robot design.
Exploring the Role of Social Robot Behaviors in a Creative Activity
Robots are increasingly being introduced into domains where they assist or collaborate with human counterparts. There is a growing body of literature on how robots might serve as collaborators in creative activities, but little is known about the factors that shape human perceptions of robots as creative collaborators. This paper investigates the effects of a robot’s social behaviors on people’s creative thinking and their perceptions of the robot. We developed an interactive system to facilitate collaboration between a human and a robot in a creative activity. We conducted a user study (n = 12) in which the robot and adult participants took turns creating compositions using tangram pieces projected on a shared workspace. We observed four human behavioral traits related to creativity in the interaction: accepting robot inputs as inspiration, delegating the creative lead to the robot, communicating creative intents, and being playful during creation. Our findings suggest designs for co-creation in social robots that consider the adverse effect of giving the robot too much control in creation, as well as the role playfulness plays in the creative process.
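The abstract above describes a turn-taking co-creation protocol between a person and a robot on a shared tangram workspace. The sketch below illustrates what such an alternating loop could look like; it is not the authors' system, and the piece names, the random placement policy, and the input handling are assumptions made purely for illustration.

```python
# Hedged sketch of a human-robot turn-taking loop for tangram co-creation.
# Piece names, placement policy, and input handling are hypothetical.
import random

TANGRAM_PIECES = ["large triangle", "small triangle", "square", "parallelogram"]

def robot_turn(composition):
    """Pick an unused piece and a pose; a real system would use a creative policy."""
    piece = random.choice([p for p in TANGRAM_PIECES if p not in composition])
    pose = (random.uniform(0, 1), random.uniform(0, 1), random.uniform(0, 360))
    composition[piece] = pose
    return piece, pose

def human_turn(composition):
    """Placeholder: record the participant's placement on the projected workspace."""
    piece = input("Which piece did you place? ")
    composition[piece] = None  # the pose would come from tracking the tabletop

def co_create(n_turns=4):
    composition = {}
    for turn in range(n_turns):
        if turn % 2 == 0:
            piece, pose = robot_turn(composition)
            print(f"Robot places {piece} at {pose}")
        else:
            human_turn(composition)
    return composition
```

Alternating turns gives the human and the robot equal control over the composition, which relates to the finding above about the effect of granting the robot too much creative control.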
- PAR ID: 10273007
- Journal Name: Designing Interactive Systems Conference (DIS)
- Page Range / eLocation ID: 1380 to 1389
- Sponsoring Org: National Science Foundation
More Like this
-
Articulated robots are attracting the attention of artists worldwide. Due to their precise, tireless, and efficient nature, robots are now being deployed in different forms of creative expression, such as sculpting, choreography, immersive environments, and cinematography. While there is growing interest among artists in robotics, programming such machines is a challenge for most professionals in the field, as robots require extensive coding experience and are primarily designed for industrial applications and environments. To enable artists to incorporate robots in their projects, we propose an end-user-friendly robot programming solution using an intuitive spatial computing environment designed for the Microsoft HoloLens 2. In our application, the robot's movements are synchronized with a hologram via network communication. Using natural hand gestures, users can manipulate, animate, and record the hologram much as they would in 3D animation software, with the added advantages of mixed-reality interaction. Our solution not only gives artists the ability to translate their creative ideas and movements to an industrial machine but also makes human-robot interaction safer, as robots can now be accurately and effectively operated from a distance. We consider this an important step toward a more human-driven robotics community, allowing creators without robot programming experience to easily script and perform complex sequences of robotic movement in service of new arts applications. Making robots more collaborative and safer for humans to interact with dramatically increases their utility, exposure, and potential for social interaction; opens new markets; expands creative industries; and directly locates them in highly visible public spaces.
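The application described above synchronizes a hologram in the mixed-reality headset with a physical robot over the network and replays recorded gestures as robot motion. As a rough illustration of that architecture, here is a minimal robot-side receiver that replays keyframes streamed as JSON lines; the port, the message format, and the `send_to_robot` stub are assumptions, not the authors' actual protocol or APIs.

```python
# Minimal sketch of a robot-side keyframe receiver for a mixed-reality
# animation tool. Message format, port, and robot interface are assumed.
import json
import socket

HOST, PORT = "0.0.0.0", 5005  # assumed listening address

def send_to_robot(joint_angles):
    """Placeholder: forward the target joint angles to the robot controller."""
    print("moving to", joint_angles)

def serve():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, conn.makefile("r") as stream:
            # Each line is one keyframe, e.g. {"t": 0.5, "joints": [0.0, ...]}
            for line in stream:
                frame = json.loads(line)
                send_to_robot(frame["joints"])

if __name__ == "__main__":
    serve()
```

Decoupling the headset from the robot through a simple streaming interface like this is one way to keep the operator physically distant from the arm, which is the safety benefit the abstract highlights.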
-
Research in creative robotics continues to expand across all creative domains, including art, music, and language. Creative robots are primarily designed to be task specific, with limited research into the implications of their design outside their core task. In the case of a musical robot, this includes when a human sees and interacts with the robot before and after the performance, as well as in between pieces. These non-musical interaction tasks, such as the presence of a robot during musical equipment setup, play a key role in the human perception of the robot but have received only limited attention. In this paper, we describe a new audio system using emotional musical prosody, designed to match the creative process of a musical robot for use before, between, and after musical performances. Our generation system relies on the creation of a custom dataset for musical prosody. The system is designed foremost to operate in real time and allow rapid generation and dialogue exchange between human and robot. For this reason, the system combines symbolic deep learning, through a Conditional Convolutional Variational Autoencoder, with an emotion-tagged audio sampler. We then compare this to a state-of-the-art text-to-speech system on our robotic platform, Shimon the marimba player. We conducted a between-groups study with 100 participants watching a musician interact for 30 seconds with Shimon. We were able to increase user ratings for the key creativity metrics, novelty and coherence, while maintaining ratings for expressivity across each implementation. Our results also indicated that by communicating in a form that relates to the robot's core functionality, we can raise likeability and perceived intelligence, while not altering animacy or anthropomorphism. These findings indicate the variation that can occur in the perception of a robot based on the interactions surrounding a performance, such as initial meetings and the spaces between pieces, in addition to the core creative algorithms.
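The abstract combines a conditional convolutional variational autoencoder for symbolic generation with an emotion-tagged audio sampler. The sketch below shows the general shape of such a conditional VAE in PyTorch; the input dimensions, layer sizes, and one-hot emotion conditioning are assumptions and do not reproduce the authors' model.

```python
# Minimal sketch of a conditional convolutional VAE for symbolic prosody.
# Sequence length, feature count, and emotion encoding are assumed.
import torch
import torch.nn as nn

class ConditionalConvVAE(nn.Module):
    def __init__(self, seq_len=64, n_features=2, n_emotions=8, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(n_features + n_emotions, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Flatten(),
        )
        enc_out = 64 * (seq_len // 2)
        self.fc_mu = nn.Linear(enc_out, latent_dim)
        self.fc_logvar = nn.Linear(enc_out, latent_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim + n_emotions, enc_out),
            nn.ReLU(),
            nn.Unflatten(1, (64, seq_len // 2)),
            nn.ConvTranspose1d(64, 32, kernel_size=4, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv1d(32, n_features, kernel_size=3, padding=1),
        )

    def forward(self, x, emotion):
        # x: (batch, n_features, seq_len); emotion: (batch, n_emotions) one-hot
        cond = emotion.unsqueeze(-1).expand(-1, -1, x.size(-1))
        h = self.encoder(torch.cat([x, cond], dim=1))
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
        return self.decoder(torch.cat([z, emotion], dim=1)), mu, logvar
```

Conditioning both the encoder and the decoder on the emotion label is what lets a model like this generate prosody to order in real time, which matches the dialogue-exchange requirement described above.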
-
Today’s teens will most likely be the first generation to spend a lifetime living and interacting with both mechanical and social robots. Although human-robot interaction has been explored in children, adults, and seniors, examination of teen-robot interaction has been minimal. In this paper, we provide evidence that teen-robot interaction is a unique area of inquiry and that designing for teens is categorically different from other types of human-robot interaction. Using human-centered design, our team is developing a social robot to gather stress and mood data from teens in a public high school. To better understand teen-robot interaction, we conducted an interaction study in the wild to explore and capture teens’ interactions with a low-fidelity social robot prototype. Then, through group interviews, we gathered data regarding their perceptions about social robots. Although we anticipated minimal engagement due to the low fidelity of our prototype, teens showed strong engagement and lengthy interactions. Additionally, teens expressed thoughtful articulations of how a social robot could be emotionally supportive. We conclude the paper by discussing future areas for consideration when designing for teen-robot interaction.
-
In Human–Robot Interaction, researchers typically utilize in-person studies to collect subjective perceptions of a robot. In addition, videos of interactions and interactive simulations (where participants control an avatar that interacts with a robot in a virtual world) have been used to quickly collect human feedback at scale. How would human perceptions of robots compare across these methodologies? To investigate this question, we conducted a 2×2 between-subjects study (N = 160) that evaluated the effect of the interaction environment (Real vs. Simulated environment) and participants’ interactivity during human-robot encounters (Interactive participation vs. Video observation) on perceptions of a robot (competence, discomfort, social presentation, and social information processing) for the task of navigating in concert with people. We also studied participants’ workload across the experimental conditions. Our results revealed a significant difference in perceptions of the robot between the real environment and the simulated environment. Furthermore, our results showed differences in human perceptions when people watched a video of an encounter versus taking part in the encounter. Finally, we found that simulated interactions and videos of the simulated encounter resulted in a higher workload than real-world encounters and videos thereof. Our results suggest that findings from video and simulation methodologies may not always translate to real-world human–robot interactions. To allow practitioners to apply the lessons from this study, and future researchers to expand our knowledge in this area, we provide guidelines for weighing the tradeoffs between the different methodologies.
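For readers unfamiliar with this kind of experimental design, the snippet below sketches one conventional way to analyze a 2×2 between-subjects study like the one described: a two-way ANOVA with environment and interactivity as factors and a perception score as the dependent variable. The column names and data file are hypothetical; this is not the authors' analysis code.

```python
# Illustrative two-way ANOVA for a 2x2 between-subjects design.
# Data file and column names are assumed for the example.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("perceptions.csv")  # columns: environment, interactivity, competence

# Main effects of environment and interactivity, plus their interaction
model = ols("competence ~ C(environment) * C(interactivity)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```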