Despite a recent surge in research examining parent–child neural similarity using fMRI, how such similarity relates to children's emotional adjustment remains underexplored. Moreover, no prior studies have explored the contextual factors that may moderate the link between parent–child neural similarity and children's developmental outcomes. In this study, 32 parent–youth dyads (parents: Mage = 43.53 years, 72% female; children: Mage = 11.69 years, 41% female) watched an emotion-evoking animated film while being scanned with fMRI. We first quantified how similarly the emotion network interacts with other brain regions in parents and their children as they respond to the emotion-evoking film. We then examined how such parent–child neural similarity is associated with children's emotional adjustment, with attention to the moderating role of family cohesion. Results revealed that higher parent–child similarity in functional connectivity patterns during movie viewing was associated with better emotional adjustment, including less negative affect, lower anxiety, and greater ego resilience in youth. Moreover, these associations were significant only among families with higher cohesion, not among families with lower cohesion. The findings advance our understanding of the neural mechanisms underlying how children thrive by being in sync and attuned with their parents, and provide novel empirical evidence that the effects of parent–child concordance at the neural level on children's development are contextually dependent. SIGNIFICANCE STATEMENT: What neural processes underlie the attunement between children and their parents that helps children thrive?
Using a naturalistic movie-watching fMRI paradigm, we find that greater parent–child similarity in how the emotion network interacts with other brain regions during movie viewing is associated with better emotional adjustment in youth, including less negative affect, lower anxiety, and greater ego resilience. Interestingly, these associations are significant only among families with higher cohesion, not among those with lower cohesion. Our findings provide novel evidence that parent–child shared neural responses to emotional situations can confer benefits to children, and underscore the importance of considering the specific family contexts in which parent–child neural similarity may be beneficial or detrimental to children's development, highlighting a crucial direction for future research.
In-the-Wild Affect Analysis of Children with ASD Using Heart Rate
Recognizing the affective state of children with autism spectrum disorder (ASD) in real-world settings poses challenges due to varying head poses, illumination levels, occlusion, and a lack of datasets annotated with emotions in in-the-wild scenarios. Understanding the emotional state of children with ASD is crucial for providing personalized interventions and support. Existing methods often rely on controlled lab environments, limiting their applicability to real-world scenarios. Hence, a framework that enables the recognition of affective states in children with ASD in uncontrolled settings is needed. This paper presents a framework for recognizing the affective state of children with ASD in an in-the-wild setting using heart rate (HR) information. More specifically, an algorithm is developed that can classify a participant's emotion as positive, negative, or neutral by analyzing the heart rate signal acquired from a smartwatch. The heart rate data are obtained in real time using a smartwatch application while the child learns to code a robot and interacts with an avatar. The avatar assists the child in developing communication skills and programming the robot. We also present a semi-automated annotation technique for the heart rate data based on facial expression recognition. The HR signal is analyzed to extract features that capture the emotional state of the child. Additionally, the performance of a raw HR-signal-based emotion classification algorithm is compared with a classification approach based on features extracted from HR signals using the discrete wavelet transform (DWT). The experimental results demonstrate that the proposed method achieves comparable performance to state-of-the-art HR-based emotion recognition techniques, despite being conducted in an uncontrolled setting rather than a controlled lab environment.
The framework presented in this paper contributes to the real-world affect analysis of children with ASD using HR information. By enabling emotion recognition in uncontrolled settings, this approach has the potential to improve the monitoring and understanding of the emotional well-being of children with ASD in their daily lives.
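The abstract above compares raw-signal classification against features extracted with the discrete wavelet transform. As an illustration only, the sketch below implements one common recipe of this kind: a plain Haar DWT followed by per-level summary statistics over the detail coefficients. The function names, the Haar choice, and the specific statistics (mean, standard deviation, energy) are assumptions for illustration, not the paper's actual pipeline.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.
    Returns (approximation, detail) coefficient lists."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))
        detail.append((a - b) / math.sqrt(2))
    return approx, detail

def dwt_features(hr_window, levels=3):
    """Summary statistics (mean, std, energy) of the detail
    coefficients at each decomposition level -- one common way
    to turn an HR window into a fixed-length feature vector.
    Assumes len(hr_window) >= 2**levels."""
    features = []
    current = list(hr_window)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        n = len(detail)
        mean = sum(detail) / n
        var = sum((d - mean) ** 2 for d in detail) / n
        energy = sum(d * d for d in detail)
        features.extend([mean, math.sqrt(var), energy])
    return features
```

A feature vector like this would then feed a conventional classifier (positive / negative / neutral); in practice a wavelet library such as PyWavelets offers richer wavelet families than the hand-rolled Haar transform shown here.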
- Award ID(s): 2114808
- PAR ID: 10501758
- Publisher / Repository: MDPI
- Date Published:
- Journal Name: Sensors
- Volume: 23
- Issue: 14
- ISSN: 1424-8220
- Page Range / eLocation ID: 6572
- Subject(s) / Keyword(s): Transformers, Biosensors, Multi-modal representation learning
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Abstract: Longstanding theories of emotion socialization postulate that caregiver emotional and behavioral reactions to a child's emotions together shape the child's emotion displays over time. Despite the notable importance of positive valence system function, the majority of research on caregiver emotion socialization focuses on negative valence system emotions. In the current project, we leveraged a relatively large cross-sectional study of caregivers (N = 234; 93.59% White) of preschool-aged children to investigate whether, and to what degree, caregiver (1) emotional experiences or (2) external behaviors, in the context of preschoolers' positive emotion displays in caregiver–child interactions, are associated with children's general positive affect tendencies. Results indicated that, in the context of everyday caregiver–child interactions, caregiver-reported positively valenced emotions, but not approach behaviors, were positively associated with child general positive affect tendencies. However, when examining specific caregiver behaviors in response to everyday child positive emotion displays, caregiver report of narrating the child's emotion and joining in the emotion with their child was positively associated with child general positive affect tendencies. Together, these results suggest that in everyday caregiver–child interactions, caregivers' emotional experiences and attunement with the child play a role in shaping preschoolers' overall tendencies toward positive affect.
Background: Autism spectrum disorder (ASD) is a developmental disorder characterized by deficits in social communication and interaction, and restricted and repetitive behaviors and interests. The incidence of ASD has increased in recent years; it is now estimated that approximately 1 in 40 children in the United States are affected. Due in part to increasing prevalence, access to treatment has become constrained. Hope lies in mobile solutions that provide therapy through artificial intelligence (AI) approaches, including facial and emotion detection AI models developed by mainstream cloud providers and available directly to consumers. However, these solutions may not be sufficiently trained for use in pediatric populations. Objective: This study aimed to test whether the emotion classifiers available off the shelf to the general public through Microsoft, Amazon, Google, and Sighthound are well suited to the pediatric population and could be used for developing mobile therapies targeting aspects of social communication and interaction, perhaps accelerating innovation in this space. We tested these classifiers directly with image data from children with parent-reported ASD recruited through crowdsourcing. Methods: We used a mobile game called Guess What? that challenges a child to act out a series of prompts displayed on the screen of a smartphone held on the forehead of his or her care provider. The game is intended to be a fun and engaging way for the child and parent to interact socially; for example, the parent attempts to guess what emotion the child is acting out (eg, surprised, scared, or disgusted). During a 90-second game session, as many as 50 prompts are shown while the child acts, and the video records the actions and expressions of the child. Due in part to the fun nature of the game, it is a viable way to remotely engage pediatric populations, including the autism population, through crowdsourcing.
We recruited 21 children with ASD to play the game and gathered 2602 emotive frames following their game sessions. These data were used to evaluate the accuracy and performance of four state-of-the-art facial emotion classifiers to develop an understanding of the feasibility of these platforms for pediatric research. Results: All classifiers performed poorly for every evaluated emotion except happy. No classifier correctly labeled more than 60.18% (1566/2602) of the evaluated frames. Moreover, no classifier correctly identified more than 11% (6/51) of the angry frames or 14% (10/69) of the disgust frames. Conclusions: The findings suggest that commercial emotion classifiers may be insufficiently trained for use in digital approaches to autism treatment and treatment tracking. Secure, privacy-preserving methods to increase labeled training data are needed to boost the models' performance before they can be used in AI-enabled approaches to social therapy of the kind that is common in autism treatments.
In recent news, organizations have been considering the use of facial and emotion recognition for applications involving youth, such as surveillance and security in schools. However, the majority of facial emotion recognition research has focused on adults. Children, particularly in their early years, have been shown to express emotions quite differently than adults. Thus, before such algorithms are deployed in environments that impact the wellbeing and circumstance of youth, a careful examination should be made of their accuracy with respect to appropriateness for this target demographic. In this work, we utilize several datasets that contain facial expressions of children linked to their emotional state to evaluate eight different commercial emotion classification systems. We compare the ground truth labels provided by the respective datasets to the labels given with the highest confidence by the classification systems and assess the results in terms of matching score (TPR), positive predictive value, and failure to compute rate. Overall results show that the emotion recognition systems displayed subpar performance on the datasets of children's expressions compared to prior work with adult datasets and initial human ratings. We then identify limitations associated with automated recognition of emotions in children and provide suggestions on directions for enhancing recognition accuracy through data diversification, dataset accountability, and algorithmic regulation.
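The evaluation above reports three per-class quantities: matching score (true positive rate), positive predictive value, and failure-to-compute rate. A minimal sketch of how such metrics can be computed from parallel label lists follows; the function name and the convention of encoding a classifier failure as None are assumptions for illustration, not taken from the paper.

```python
def per_class_metrics(true_labels, predicted_labels, label):
    """Matching score (TPR), positive predictive value (PPV), and
    failure-to-compute rate for one emotion label, given parallel
    lists of ground-truth and predicted labels. A prediction of
    None encodes a frame the classifier failed to process."""
    tp = sum(1 for t, p in zip(true_labels, predicted_labels)
             if t == label and p == label)
    actual = sum(1 for t in true_labels if t == label)
    predicted = sum(1 for p in predicted_labels if p == label)
    failures = sum(1 for p in predicted_labels if p is None)

    tpr = tp / actual if actual else 0.0      # of true `label` frames, fraction matched
    ppv = tp / predicted if predicted else 0.0  # of `label` predictions, fraction correct
    ftc = failures / len(predicted_labels)    # fraction of frames with no prediction
    return tpr, ppv, ftc
```

Computing these per emotion class, rather than as a single accuracy number, is what exposes the pattern the study reports: acceptable performance on one class (happy) coexisting with near-chance performance on others.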
The majority of research on infants’ and children’s understanding of emotional expressions has focused on their abilities to use emotional expressions to infer how other people feel. However, an emerging body of work suggests that emotional expressions support rich, powerful inferences not just about emotional states but also about other unobserved states, such as hidden events in the physical world and mental states of other people (e.g., beliefs and desires). Here we argue that infants and children harness others’ emotional expressions as a source of information for learning about the physical and social world broadly. This “emotion as information” framework integrates affective, developmental, and computational cognitive sciences, extending the scope of signals that count as “information” in early learning.