
Title: The Unboxing Experience: Exploration and Design of Initial Interactions Between Children and Social Robots
Social robots are increasingly introduced into children’s lives as educational and social companions, yet little is known about how these products might best be introduced to their environments. The emergence of the “unboxing” phenomenon in media suggests that introduction is key to technology adoption where initial impressions are made. To better understand this phenomenon toward designing a positive unboxing experience in the context of social robots for children, we conducted three field studies with families of children aged 8 to 13: (1) an exploratory free-play activity (n = 12); (2) a co-design session (n = 11) that informed the development of a prototype box and a curated unboxing experience; and (3) a user study (n = 9) that evaluated children’s experiences. Our findings suggest the unboxing experience of social robots can be improved through the design of a creative aesthetic experience that engages the child socially to guide initial interactions and foster a positive child-robot relationship.
Journal Name: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (CHI '22)
Page Range or eLocation-ID: 1 to 14
Sponsoring Org: National Science Foundation
More Like this
  1. Prior work in affect-aware educational robots has often relied on a common belief that the relationship between student affect and learning is independent of agent behaviors (child’s/robot’s) or unidirectional (positive/negative but not both) throughout the entire student-robot interaction. We argue that the student affect-learning relationship should be interpreted in two contexts: (1) social learning paradigm and (2) sub-events within child-robot interaction. In our paper, we examine two different social learning paradigms where children interact with a robot that acts either as a tutor or a tutee. Sub-events within child-robot interaction are defined as task-related events occurring in specific phases of an interaction (e.g., when the child/robot gets a wrong answer). We examine sub-events at a macro level (entire interaction) and a micro level (within specific sub-events). In this paper, we provide an in-depth correlation analysis of children’s facial affect and vocabulary learning. We found that children’s affective displays became more predictive of their vocabulary learning when children interacted with a tutee robot who did not scaffold their learning. Additionally, children’s affect displayed during micro-level events was more predictive of their learning than during macro-level events. Last, we found that the affect-learning relationship is not unidirectional, but rather is modulated by context, i.e., several affective states facilitated student learning when displayed in some sub-events but inhibited learning when displayed in others. These findings indicate that both social learning paradigm and sub-events within interaction modulate the student affect-learning relationship.
  2. Research in child-robot interactions suggests that engaging in “care-taking” of a social robot, such as tucking the robot in at night, can strengthen relationships formed between children and robots. In this work, we aim to better understand and explore the design space of caretaking activities with 10 children, aged 8–12 from eight families, involving an exploratory design session followed by a preliminary feasibility testing of robot caretaking activities. The design sessions provided insight into children’s current caretaking tasks, how they would take care of a social robot, and how these new caretaking activities could be integrated into their daily routines. The feasibility study tested two different types of robot caretaking tasks, which we call connection and utility, and measured their short-term effects on children’s perceptions of and closeness to the social robot. We discuss the themes and present interaction design guidelines of robot caretaking activities for children.
  3. Children’s early numerical knowledge establishes a foundation for later development of mathematics achievement and playing linear number board games is effective in improving basic numerical abilities. Besides the visuo-spatial cues provided by traditional number board games, learning companion robots can integrate multi-sensory information and offer social cues that can support children’s learning experiences. We explored how young children experience sensory feedback (audio and visual) and social expressions from a robot when playing a linear number board game, “RoboMath.” We present the interaction design of the game and our investigation of children’s (n = 19, aged 4) and parents’ experiences under three conditions: (1) visual-only, (2) audio-visual, and (3) audio-visual-social robot interaction. We report our qualitative analysis, including the themes observed from interviews with families on their perceptions of the game and the interaction with the robot, their child’s experiences, and their design recommendations.
  4. Autonomous educational social robots can be used to help promote literacy skills in young children. Such robots, which emulate the emotive, perceptual, and empathic abilities of human teachers, are capable of replicating some of the benefits of one-on-one tutoring from human teachers, in part by leveraging individual student’s behavior and task performance data to infer sophisticated models of their knowledge. These student models are then used to provide personalized educational experiences by, for example, determining the optimal sequencing of curricular material. In this paper, we introduce an integrated system for autonomously analyzing and assessing children’s speech and pronunciation in the context of an interactive word game between a social robot and a child. We present a novel game environment and its computational formulation, an integrated pipeline for capturing and analyzing children’s speech in real-time, and an autonomous robot that models children’s word pronunciation via Gaussian Process Regression (GPR), augmented with an Active Learning protocol that informs the robot’s behavior. We show that the system is capable of autonomously assessing children’s pronunciation ability, with ground truth determined by a post-experiment evaluation by human raters. We also compare phoneme- and word-level GPR models and discuss trade-offs of each approach in modeling children’s pronunciation. Finally, we describe and analyze a pipeline for automatic analysis of children’s speech and pronunciation, including an evaluation of Speech Ace as a tool for future development of autonomous, speech-based language tutors.
  5. Abstract

    Language development is an important facet of early life. Deaf children may have exposure to various languages and communication modalities, including spoken and visual. Previous research has documented the rate of growth of English skills among young deaf children, but no studies have investigated the rate of ASL acquisition. The current paper examines young deaf children’s acquisition of ASL skills, the rate of growth over time, and factors impacting levels and growth rates. Seventy-three children ages birth to 5 were rated three times using the Visual Communication and Sign Language Checklist and given a scaled score at each rating. An average monthly gain score was calculated for each participant. The presence of a deaf parent, use of ASL at home, use of cochlear implant(s), whether the child was born deaf, and age of initial diagnosis were analyzed for their impact on the level of ASL skill and rate of growth. Results indicated that the use of ASL in the home has a significant positive effect on deaf children’s ASL skill level. Additionally, children with lower initial ratings showed higher rates of growth than those with higher initial ratings, especially among school-aged children. The paper discusses implications and directions for future studies.