Abstract Background Many children with autism cannot receive timely in-person diagnosis and therapy, especially when access is limited by geography, socioeconomics, or global health concerns such as the COVID-19 pandemic. Mobile solutions that work outside traditional clinical environments can safeguard against gaps in access to quality care. Objective The aim of this study is to examine the engagement level and therapeutic feasibility of a mobile game platform for children with autism. Methods We designed a mobile application, GuessWhat, which, in its current form, delivers game-based therapy to children aged 3 to 12 years in home settings through a smartphone. The phone, held by a caregiver on their forehead, displays one of a range of appropriate and therapeutically relevant prompts (e.g., a surprised face) that the child must recognize and mimic well enough for the caregiver to guess what is being imitated and proceed to the next prompt. Each game runs for 90 seconds to create a robust social exchange between the child and the caregiver. Results We examined the therapeutic feasibility of GuessWhat in 72 children (75% male, average age 8 years 2 months) with autism who were asked to play the game for three 90-second sessions per day, 3 days per week, for a total of 4 weeks. The group showed significant improvements in Social Responsiveness Scale-2 (SRS-2) total (3.97, p < 0.001) and Vineland Adaptive Behavior Scales-II (VABS-II) socialization standard (5.27, p = 0.002) scores. Conclusion The results support that GuessWhat is a viable approach to efficacious treatment of autism and further support the possibility that the game can be used in natural settings to increase access to treatment when barriers to care exist.
The Performance of Emotion Classifiers for Children With Parent-Reported Autism: Quantitative Feasibility Study
Background Autism spectrum disorder (ASD) is a developmental disorder characterized by deficits in social communication and interaction, and restricted and repetitive behaviors and interests. The incidence of ASD has increased in recent years; it is now estimated that approximately 1 in 40 children in the United States is affected. Due in part to increasing prevalence, access to treatment has become constrained. Hope lies in mobile solutions that provide therapy through artificial intelligence (AI) approaches, including facial and emotion detection AI models developed by mainstream cloud providers and available directly to consumers. However, these solutions may not be sufficiently trained for use in pediatric populations. Objective If the emotion classifiers available off the shelf to the general public through Microsoft, Amazon, Google, and Sighthound are well suited to the pediatric population, they could be used for developing mobile therapies that target aspects of social communication and interaction, perhaps accelerating innovation in this space. This study aimed to test these classifiers directly with image data from children with parent-reported ASD recruited through crowdsourcing. Methods We used a mobile game called Guess What? that challenges a child to act out a series of prompts displayed on the screen of a smartphone held on the forehead of his or her care provider. The game is intended to be a fun and engaging way for the child and parent to interact socially, for example, the parent attempting to guess what emotion the child is acting out (eg, surprised, scared, or disgusted). During a 90-second game session, as many as 50 prompts are shown while the child acts them out, and video records the child's actions and expressions. Due in part to the fun nature of the game, it is a viable way to remotely engage pediatric populations, including the autism population, through crowdsourcing.
We recruited 21 children with ASD to play the game and gathered 2602 emotive frames from their game sessions. These data were used to evaluate the accuracy and performance of four state-of-the-art facial emotion classifiers and to assess the feasibility of these platforms for pediatric research. Results All classifiers performed poorly for every evaluated emotion except happy. No classifier correctly labeled more than 60.18% (1566/2602) of the evaluated frames. Moreover, none of the classifiers correctly identified more than 11% (6/51) of the angry frames or 14% (10/69) of the disgust frames. Conclusions The findings suggest that commercial emotion classifiers may be insufficiently trained for use in digital approaches to autism treatment and treatment tracking. Secure, privacy-preserving methods to increase labeled training data are needed to boost the models’ performance before they can be used in AI-enabled approaches to the kind of social therapy that is common in autism treatments.
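The per-emotion results above come down to counting correctly predicted frames within each true label. A minimal sketch of how such per-class agreement can be scored (the frame labels below are hypothetical placeholders, not the study's dataset):

```python
# Sketch of per-emotion accuracy scoring for an emotion classifier.
# The (true, predicted) pairs below are illustrative only.
from collections import Counter

def per_emotion_accuracy(frames):
    """frames: list of (true_emotion, predicted_emotion) pairs.
    Returns {emotion: (correct, total, accuracy)} keyed by true label."""
    correct = Counter()
    total = Counter()
    for true_label, predicted in frames:
        total[true_label] += 1
        if predicted == true_label:
            correct[true_label] += 1
    return {e: (correct[e], total[e], correct[e] / total[e]) for e in total}

if __name__ == "__main__":
    frames = [("happy", "happy"), ("happy", "happy"), ("angry", "neutral"),
              ("angry", "angry"), ("disgust", "surprised")]
    for emotion, (c, n, acc) in sorted(per_emotion_accuracy(frames).items()):
        print(f"{emotion}: {c}/{n} = {acc:.0%}")
```

Overall accuracy (such as the 1566/2602 figure) is simply the sum of correct counts over the sum of totals across all emotions.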
- Award ID(s): 2014232
- PAR ID: 10462127
- Date Published:
- Journal Name: JMIR Mental Health
- Volume: 7
- Issue: 4
- ISSN: 2368-7959
- Page Range / eLocation ID: e13174
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Recognizing the affective state of children with autism spectrum disorder (ASD) in real-world settings is challenging due to varying head poses, illumination levels, occlusion, and a lack of datasets annotated with emotions in in-the-wild scenarios. Understanding the emotional state of children with ASD is crucial for providing personalized interventions and support. Existing methods often rely on controlled lab environments, limiting their applicability to real-world scenarios; hence, a framework that enables the recognition of affective states in children with ASD in uncontrolled settings is needed. This paper presents a framework for recognizing the affective state of children with ASD in an in-the-wild setting using heart rate (HR) information. More specifically, an algorithm is developed that classifies a participant's emotion as positive, negative, or neutral by analyzing the heart rate signal acquired from a smartwatch. The heart rate data are obtained in real time using a smartwatch application while the child learns to code a robot and interacts with an avatar; the avatar assists the child in developing communication skills and programming the robot. This paper also presents a semi-automated annotation technique for the heart rate data based on facial expression recognition. The HR signal is analyzed to extract features that capture the emotional state of the child. Additionally, the performance of an emotion classification algorithm based on the raw HR signal is compared with a classification approach based on features extracted from HR signals using the discrete wavelet transform (DWT). The experimental results demonstrate that the proposed method achieves performance comparable to state-of-the-art HR-based emotion recognition techniques, despite being conducted in an uncontrolled setting rather than a controlled lab environment.
The framework presented in this paper contributes to the real-world affect analysis of children with ASD using HR information. By enabling emotion recognition in uncontrolled settings, this approach has the potential to improve the monitoring and understanding of the emotional well-being of children with ASD in their daily lives.
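The DWT-based feature approach mentioned above decomposes the HR signal into sub-bands and summarizes each. A minimal sketch under stated assumptions: the paper does not specify its wavelet family or feature set, so this uses a single-level Haar DWT in plain Python on a toy heart-rate series; `hr_features` and its statistics are hypothetical illustrations, not the study's pipeline.

```python
# Hypothetical sketch: single-level Haar DWT feature extraction from a toy
# heart-rate series. The actual wavelet and features in the paper may differ.
import math

SQRT2 = math.sqrt(2.0)

def haar_dwt(signal):
    """One level of the Haar DWT: returns (approximation, detail) coefficients."""
    if len(signal) % 2:  # pad odd-length signals by repeating the last sample
        signal = list(signal) + [signal[-1]]
    approx = [(signal[i] + signal[i + 1]) / SQRT2 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / SQRT2 for i in range(0, len(signal), 2)]
    return approx, detail

def hr_features(hr_series):
    """Summary statistics of the DWT sub-bands, usable as classifier inputs."""
    approx, detail = haar_dwt(hr_series)
    def stats(xs):
        mean = sum(xs) / len(xs)
        var = sum((x - mean) ** 2 for x in xs) / len(xs)
        return mean, math.sqrt(var)
    a_mean, a_std = stats(approx)
    d_mean, d_std = stats(detail)
    return {"approx_mean": a_mean, "approx_std": a_std,
            "detail_mean": d_mean, "detail_std": d_std}

if __name__ == "__main__":
    hr = [72, 74, 73, 80, 85, 84, 78, 76]  # toy heart-rate samples (bpm)
    print(hr_features(hr))
```

The approximation coefficients capture the slow HR trend and the detail coefficients capture beat-to-beat variability, which is why sub-band statistics can discriminate emotional states better than the raw signal alone.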
Robot-mediated interventions have been investigated for the treatment of social skill deficits amongst children with Autism Spectrum Disorder (ASD). Does the use of a Nao robot as a mediator increase vocal interaction between children with ASD? The present study examined the vocalization and turn-taking rate in six children with ASD (mean age = 11.4 years, SD = 0.86 years) interacting with and without a Nao robot for 10 sessions, order counterbalanced. Each session lasted nine minutes. In the Robot condition, the robot provided vocal prompts; in the No Robot condition, children interacted freely. Child vocalization and turn-taking rate, defined as the number of utterances/turns per second, were measured. Results demonstrated that three children produced higher vocalization and turn-taking rates when a robot was present, and two when it was absent. One participant produced higher vocalization rates when the robot was not present, but more conversational turns when the robot was present. The findings suggest that the use of a Nao robot as a social mediator increases vocalization and turn-taking rates among children with ASD, but large individual variability is observed. The effect of the robot as a mediator on lexical diversity of child speech will also be investigated.
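The rate metric above is events per second over a fixed nine-minute session. A minimal sketch of that computation; the utterance counts below are hypothetical illustrations, not the study's data:

```python
# Sketch of the vocalization/turn-taking rate metric (events per second).
# Counts in the usage example are hypothetical, not study data.
def rate_per_second(event_count, session_seconds):
    """Events (utterances or conversational turns) per second of session."""
    if session_seconds <= 0:
        raise ValueError("session length must be positive")
    return event_count / session_seconds

SESSION_SECONDS = 9 * 60  # each session lasted nine minutes

if __name__ == "__main__":
    robot_utterances, no_robot_utterances = 120, 95  # hypothetical counts
    print(f"robot:    {rate_per_second(robot_utterances, SESSION_SECONDS):.3f} utt/s")
    print(f"no robot: {rate_per_second(no_robot_utterances, SESSION_SECONDS):.3f} utt/s")
```

Normalizing by session length lets rates be compared across conditions and children even if some sessions are truncated.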
This work is motivated by the need to automate the analysis of parent-infant interactions to better understand the existence of any potential behavioral patterns useful for the early diagnosis of autism spectrum disorder (ASD). It presents an approach for synthesizing the facial expression exchanges that occur during parent-infant interactions. This is accomplished by developing a novel approach that uses landmarks when synthesizing changing facial expressions. The proposed model consists of two components: (i) The first is a landmark converter that receives a set of facial landmarks and the target emotion as input and outputs a set of new landmarks transformed to match the emotion. (ii) The second component involves an image converter that takes in an input image, a target landmark and a target emotion and outputs a face transformed to match the input emotion. The inclusion of landmarks in the generation process proves useful in the generation of baby facial expressions; babies have somewhat different facial musculature and facial dynamics than adults. This paper presents a realistic-looking matrix of changing facial expressions sampled from a 2-D emotion continuum (valence and arousal) and displays successfully transferred facial expressions from real-life mother-infant dyads to novel ones.
Children acquire and develop emotional regulatory skills in the context of parent-child attachment relationships; nonetheless, empirical studies have focused mainly on mothers, and less information is available regarding the role of both parent-child attachment relationships. Furthermore, despite its importance, there is no information regarding the preschool years. This study aims to fill this gap by exploring the potential influence of both mother-child and father-child attachment on preschoolers' later emotion regulation observed in the peer group. Fifty-three Portuguese nuclear families (mother, father, and focal child) participated in the study; 47% of the children were boys and 53% were girls. Attachment security was assessed at home using the Attachment Behavior Q-set when children were 3 years of age, and emotion regulation was observed in the preschool classrooms attended by the children at age 5, using the California Child Q-sort to derive an Emotion Regulation Q-Scale. Results showed that the combined influence of both parent-child attachment security predicted better emotion regulation than did the specific contributions of each parent per se. Findings are consistent with integrative approaches that highlight the value of including both mother- and father-child attachment relationships, as well as their combined effect, when studying emotion regulation.