Social-educational robots, such as NAO humanoids with social, anthropomorphic, humanlike features, are tools for learning, education, and addressing developmental disorders (e.g., autism spectrum disorder, or ASD) through social and collaborative robotic interactions and interventions. Significant gaps remain at the intersection of social robotics and autism research concerning how robotic technology helps individuals with ASD meet their social, emotional, and communication needs and how it supports teachers who engage with ASD students. This research aims to (a) obtain new scientific knowledge on social-educational robotics by exploring the use of social robots (especially humanoids) and robotic interventions with ASD students in high schools through a triad framework of ASD student–teacher co-working with social robots and social robotic interactions; (b) utilize Business Model Canvas (BMC) methodology for robot design and curriculum development targeted at ASD students; and (c) connect the interdisciplinary areas of consumer behavior research, social robotics, and human-robot interaction through customer discovery interviews, bridging the gap between academic research on social robotics on the one hand and industry development and customers on the other. The customer discovery process in this research yields eight core research propositions delineating the contexts that enable a higher-quality learning environment matched to ASD students' learning requirements through the use of social robots, preparing them for future learning and workforce environments.
RAISE: Robotics & AI to improve STEM and social skills for elementary school students
The authors present the design and implementation of an exploratory virtual learning environment that assists children with autism (ASD) in learning science, technology, engineering, and mathematics (STEM) skills while improving social-emotional and communication skills. The primary contribution of this exploratory research is showing how educational research informs technological advances in triggering a virtual AI companion (AIC) for children in need of social-emotional and communication skills development. The AIC adapts to students' varying levels of needed support. This project began by using puppetry control (human-in-the-loop) of the AIC, assisting students with ASD in learning basic coding, practicing their social skills with the AIC, and attaining emotional recognition and regulation skills for effective communication and learning. The student is given the challenge of programming a robot, Dash™, to move in a square. Based on observed behaviors, the puppeteer controls the virtual agent's actions to support the student in coding the robot. The virtual agent's actions that inform the development of the AIC include speech, facial expressions, gestures, respiration, and heart color changes coded to indicate emotional state. The paper provides exploratory findings from the first 2 years of this 5-year scaling-up research study. The outcomes discussed align with a common research design used for students with disabilities, called single-case study research. This type of design does not involve randomized controlled trials; instead, each student acts as her or his own control. Students with ASD have substantial individual differences in their social skill deficits, behaviors, communications, and learning needs, which vary greatly from the norm and from other individuals identified with this disability. Therefore, findings are reported as changes within subjects instead of across subjects. While these exploratory observations serve as a basis for longer-term research on a larger population, this paper focuses less on student learning and more on evolving technology in the AIC and supporting students with ASD in STEM environments.
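As an illustration of the coding task described above (programming Dash™ to move in a square), the sketch below shows the loop-and-turn logic a student would build. Dash is normally programmed through block-based apps, so the `drive_forward` and `turn_right` helpers are hypothetical stand-ins for those blocks, not a real Dash API.

```python
# Minimal sketch of the "move in a square" coding task.
# drive_forward() and turn_right() are hypothetical placeholders that stand
# in for the block commands a student would snap together in a visual editor.

def drive_forward(distance_cm: float) -> None:
    """Placeholder: command the robot to drive straight for distance_cm."""
    print(f"drive forward {distance_cm} cm")

def turn_right(angle_deg: float) -> None:
    """Placeholder: command the robot to rotate in place by angle_deg."""
    print(f"turn right {angle_deg} degrees")

def drive_square(side_cm: float = 30.0) -> None:
    # A square is four repetitions of "go straight, then turn 90 degrees".
    for _ in range(4):
        drive_forward(side_cm)
        turn_right(90.0)

if __name__ == "__main__":
    drive_square()
```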
- PAR ID: 10388874
- Date Published:
- Journal Name: Frontiers in Virtual Reality
- Volume: 3
- ISSN: 2673-4192
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
Children diagnosed with autism spectrum disorder (ASD) typically work towards acquiring the skills needed to participate in a regular classroom setting, such as attending and appropriately responding to an instructor's requests. Social robots have the potential to support children with ASD in learning group-interaction skills. However, the majority of studies that target interactions between children with ASD and social robots have been limited to one-on-one interactions. Group interaction sessions present unique challenges, such as the unpredictable behaviors of the other children participating in the group intervention session and attention shared with the instructor. We present the design of a robot-mediated group interaction intervention for children with ASD that enables them to practice the skills required to participate in a classroom. We also present a study investigating differences in children's learning behaviors during robot-led and human-led group interventions over multiple intervention sessions. The results of this study suggest that the learning behaviors of children with ASD are similar during human and robot instruction. Furthermore, preliminary results suggest that no novelty effect was observed when children interacted with the robot over multiple sessions.
-
Recognizing the affective state of children with autism spectrum disorder (ASD) in real-world settings is challenging due to varying head poses, illumination levels, occlusion, and a lack of datasets annotated with emotions in in-the-wild scenarios. Understanding the emotional state of children with ASD is crucial for providing personalized interventions and support. Existing methods often rely on controlled lab environments, limiting their applicability to real-world scenarios; hence, a framework that enables the recognition of affective states in children with ASD in uncontrolled settings is needed. This paper presents such a framework, which uses heart rate (HR) information to recognize the affective state of children with ASD in an in-the-wild setting. More specifically, an algorithm is developed that can classify a participant's emotion as positive, negative, or neutral by analyzing the heart rate signal acquired from a smartwatch. The heart rate data are obtained in real time using a smartwatch application while the child learns to code a robot and interacts with an avatar; the avatar assists the child in developing communication skills and programming the robot. The paper also presents a semi-automated annotation technique for the heart rate data based on facial expression recognition. The HR signal is analyzed to extract features that capture the emotional state of the child, and the performance of a raw HR-signal-based emotion classification algorithm is compared with a classification approach based on features extracted from HR signals using the discrete wavelet transform (DWT). The experimental results demonstrate that the proposed method achieves performance comparable to state-of-the-art HR-based emotion recognition techniques, despite being conducted in an uncontrolled setting rather than a controlled lab environment. The framework contributes to real-world affect analysis of children with ASD using HR information; by enabling emotion recognition in uncontrolled settings, this approach has the potential to improve the monitoring and understanding of the emotional well-being of children with ASD in their daily lives.
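As a rough illustration of the DWT-based feature pipeline described above, the sketch below extracts simple statistics from each wavelet sub-band of a heart-rate window and trains a classifier to label windows as positive, negative, or neutral. The wavelet ('db4'), decomposition level, window length, and random-forest classifier are assumptions made for illustration, not the paper's exact configuration.

```python
# Hedged sketch: classify windowed heart-rate data as positive / negative /
# neutral using DWT sub-band statistics. All hyperparameters are assumptions.
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier

def dwt_features(hr_window: np.ndarray, wavelet: str = "db4", level: int = 3) -> np.ndarray:
    """Summary statistics of each DWT sub-band of one HR window."""
    coeffs = pywt.wavedec(hr_window, wavelet, level=level)
    feats = []
    for band in coeffs:
        feats.extend([band.mean(), band.std(), np.abs(band).max(), np.sum(band ** 2)])
    return np.asarray(feats)

def train_emotion_classifier(windows, labels) -> RandomForestClassifier:
    """labels are 'positive', 'negative', or 'neutral' (e.g., from the
    semi-automated, facial-expression-based annotation)."""
    X = np.stack([dwt_features(w) for w in windows])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X, labels)
    return clf

# Usage with synthetic data standing in for smartwatch HR windows.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    windows = [70 + 5 * rng.standard_normal(64) for _ in range(30)]
    labels = rng.choice(["positive", "negative", "neutral"], size=30).tolist()
    clf = train_emotion_classifier(windows, labels)
    print(clf.predict([dwt_features(windows[0])]))
```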
-
STEM education is often disconnected from innovation and design: students self-identify as solely scientists, artists, or technophiles, but rarely see the connection between the disciplines. The inclusion of arts (A) in STEM education (STEAM) offers an educational approach in which students see how subjects are integrated through learning experiences that apply to everyday life, developing personal connections and becoming motivated learners who understand how skills from each subject are needed for future careers. This project addresses both the disconnect between science, design, and technology and the question of how high school students can benefit from innovative learning experiences in plant science that integrate these disciplines while gaining invaluable skills for future STEM careers. We used the Science-Art-Design-Technology (SADT) pedagogical approach, characterized by project-based learning that relies on student teamwork and facilitation by educators. This approach was applied through a STEAM educational 3D plant module in which teams: 1) investigated plants under research at a plant science research center, 2) designed and created 3D models of those plants, 3) experienced the application of 3D modeling in augmented and virtual reality platforms, and 4) disseminated project results. We used a mixed-methods approach combining qualitative and quantitative research methods to assess the impact of the 3D modeling module on students' understanding of the intersection of art and design with science, their learning and skills gains, and their interest in STEAM subjects and careers. A total of 160 students from eight educational institutions (schools and informal programs) implemented the module. Student reflection questions revealed that students see art and design playing a role in science mainly by facilitating communication and further understanding and by fostering new ideas; they also see science influencing art and design through the artistic creation process. The students acknowledged learning STEAM content and applications associated with plant science, 3D modeling, and augmented and virtual reality, and gaining research skills as well as soft skills such as collaboration and communication. Students also increased their interest in STEAM subjects and careers, particularly those associated with plant science. The SADT approach, exemplified by the 3D plant module, effectively integrates science, art, design, and technology, enhancing student literacy in these fields and providing students with essential 21st-century competencies. The module's flexibility and experiential learning opportunities benefit students and educators, promoting interdisciplinary learning and interest in STEAM subjects and careers. This innovative approach is a valuable tool for educators, inspiring new ways of teaching and learning in STEAM education.
-
Selecting appropriate tutoring help actions that account for both a student's content mastery and engagement level is essential for effective human tutors, indicating the critical need for these skills in autonomous tutors. In this work, we formulate the robot-student tutoring help action selection problem as the Assistive Tutor partially observable Markov decision process (AT-POMDP). We designed the AT-POMDP and derived its parameters based on data from a prior robot-student tutoring study. The policy that results from solving the AT-POMDP allows a robot tutor to decide upon the optimal tutoring help action to give a student while maintaining a belief of the student's mastery of the material and engagement with the task. This approach is validated through a between-subjects field study in which 4th grade students (n = 28) interacted with a social robot while solving long division problems over five sessions. Students who received help from a robot using the AT-POMDP policy demonstrated significantly greater learning gains than students who received help from a robot with a fixed help action selection policy. Our results demonstrate that this robust computational framework can be used effectively to deliver diverse and personalized tutoring support over time for students.
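The abstract above centers on maintaining a belief over the student's hidden mastery and engagement while choosing help actions. The sketch below illustrates only the belief-update step of such a POMDP formulation; the states, actions, transition model, and observation model are invented placeholders, not the AT-POMDP parameters derived in the paper, and no POMDP solver is included.

```python
# Toy sketch of the belief-maintenance idea behind a tutoring POMDP: the
# robot tutor tracks a probability distribution over hidden
# (mastery, engagement) states and updates it after each observed response.
# All states, actions, and probabilities below are illustrative assumptions.
import numpy as np

STATES = [(m, e) for m in ("low_mastery", "high_mastery")
                 for e in ("disengaged", "engaged")]
ACTIONS = ["no_help", "hint", "worked_example"]
OBSERVATIONS = ["incorrect", "correct"]

def sticky_transitions(stay: float = 0.85) -> np.ndarray:
    """Placeholder dynamics: a student mostly stays in the same hidden state."""
    n = len(STATES)
    off = (1.0 - stay) / (n - 1)
    return np.full((n, n), off) + np.eye(n) * (stay - off)

# T[a][s, s']: P(s' | s, a); O[a][s', o]: P(o | s', a). The same placeholder
# matrices are reused for every action, purely for illustration.
T = {a: sticky_transitions() for a in ACTIONS}
O = {a: np.array([[0.8, 0.2],   # low mastery, disengaged
                  [0.6, 0.4],   # low mastery, engaged
                  [0.4, 0.6],   # high mastery, disengaged
                  [0.2, 0.8]])  # high mastery, engaged
     for a in ACTIONS}

def belief_update(belief: np.ndarray, action: str, obs: str) -> np.ndarray:
    """b'(s') is proportional to O(o | s', a) * sum_s T(s' | s, a) * b(s)."""
    o_idx = OBSERVATIONS.index(obs)
    predicted = T[action].T @ belief
    unnormalized = O[action][:, o_idx] * predicted
    return unnormalized / unnormalized.sum()

# Start from a uniform belief and watch it concentrate on
# (high_mastery, engaged) after two correct answers following a hint.
b = np.full(len(STATES), 1.0 / len(STATES))
for obs in ["correct", "correct"]:
    b = belief_update(b, "hint", obs)
for state, prob in zip(STATES, b):
    print(state, round(float(prob), 3))
```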