This content will become publicly available on July 12, 2025

Title: From Reaction to Anticipation: Predicting Future Affect
The educational data mining community has extensively investigated affect detection in learning platforms, finding associations between affective states and a wide range of learning outcomes. Based on these insights, several studies have used affect detectors to create interventions tailored to respond to when students are bored, confused, or frustrated. However, these detector-based interventions have depended on detecting affect when it occurs and therefore inherently respond to affective states after they have begun. This might not always be soon enough to avoid a negative experience for the student. In this paper, we aim to predict students' affective states in advance. Within our approach, we attempt to determine the maximum prediction window where detector performance remains sufficiently high, documenting the decay in performance when this prediction horizon is increased. Our results indicate that it is possible to predict confusion, frustration, and boredom in advance with performance over chance for prediction horizons of 120, 40, and 50 seconds, respectively. These findings open the door to designing more timely interventions.
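The core idea above is to train detectors whose labels come from a later point in time than their input features. As a rough illustration of that setup (not the authors' pipeline: the synthetic data, feature counts, horizon values, model choice, and the build_horizon_dataset helper below are all invented for the example), one might pair the features observed at time t with the affect label observed roughly `horizon` seconds later and track how detector AUC changes as the horizon grows:

```python
# Illustrative sketch only (not the paper's code): pair features observed at time t
# with the affect label observed about `horizon_s` seconds later, then measure how
# detector performance changes as the prediction horizon grows.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def build_horizon_dataset(features, labels, timestamps, horizon_s):
    """Match each feature row to the first affect label at least horizon_s later."""
    X, y = [], []
    for i, t in enumerate(timestamps):
        j = np.searchsorted(timestamps, t + horizon_s)
        if j < len(labels):
            X.append(features[i])
            y.append(labels[j])
    return np.array(X), np.array(y)

# Synthetic stand-in data: 5,000 observations, 20 interaction features, binary "confused" labels.
rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 20))
labels = rng.integers(0, 2, size=5000)
timestamps = np.arange(5000) * 20.0  # one observation every 20 seconds

for horizon in (0, 40, 80, 120):
    X, y = build_horizon_dataset(features, labels, timestamps, horizon)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print(f"horizon={horizon:>3}s  AUC={auc:.3f}  (0.5 = chance)")
```

On real interaction logs, the expectation suggested by the abstract is that performance stays above chance out to roughly 40-120 seconds depending on the affective state, then degrades as the horizon is pushed further.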
Award ID(s):
1917545
PAR ID:
10546433
Author(s) / Creator(s):
; ; ; ;
Editor(s):
Paaßen, Benjamin; Demmans Epp, Carrie
Publisher / Repository:
International Educational Data Mining Society
Date Published:
Format(s):
Medium: X
Right(s):
Creative Commons Attribution 4.0 International
Sponsoring Org:
National Science Foundation
More Like this
  1. A significant amount of research has illustrated the impact of student emotional and affective state on learning outcomes. Just as human teachers and tutors often adapt instruction to accommodate changes in student affect, the ability for computer-based systems to similarly become affect-aware, detecting and personalizing instruction in response to student affective state, could significantly improve student learning. Personalized and affective interventions in tutoring systems can be realized through affect-aware learning technologies to deter students from practicing poor learning behaviors in response to negative affective states and to optimize the amount of learning that occurs over time. In this paper, we build on previous work in affect detection within intelligent tutoring systems (ITS) by applying two methodologies to develop sensor-free models of student affect with only data recorded from middle-school students interacting with an ITS. We develop models of four affective states to evaluate and determine significant predictors of affect. Namely, we develop a model which discerns students' reported interest significantly better than a majority-class baseline.
  2. Agents must monitor their partners' affective states continuously to understand and engage in social interactions. However, methods for evaluating affect recognition do not account for changes in classification performance that may occur during occlusions or transitions between affective states. This paper addresses temporal patterns in affect classification performance in the context of an infant-robot interaction, where infants’ affective states contribute to their ability to participate in a therapeutic leg movement activity. To support robustness to facial occlusions in video recordings, we trained infant affect recognition classifiers using both facial and body features. Next, we conducted an in-depth analysis of our best-performing models to evaluate how performance changed over time as the models encountered missing data and changing infant affect. During time windows when features were extracted with high confidence, a unimodal model trained on facial features achieved the same optimal performance as multimodal models trained on both facial and body features. However, multimodal models outperformed unimodal models when evaluated on the entire dataset. Additionally, model performance was weakest when predicting an affective state transition and improved after multiple predictions of the same affective state. These findings emphasize the benefits of incorporating body features in continuous affect recognition for infants. Our work highlights the importance of evaluating variability in model performance both over time and in the presence of missing data when applying affect recognition to social interactions. (A rough sketch of this masked-fusion idea appears after this list.)
  3. This paper studies the hypothesis that not all modalities are always needed to predict affective states. We explore this hypothesis in the context of recognizing three affective states that have shown a relation to a future onset of depression: positive, aggressive, and dysphoric. In particular, we investigate three important modalities for face-to-face conversations: vision, language, and acoustics. We first perform a human study to better understand which subset of modalities people find informative when recognizing the three affective states. As a second contribution, we explore how these human annotations can guide automatic affect recognition systems to be more interpretable while not degrading their predictive performance. Our studies show that humans can reliably annotate modality informativeness. Further, we observe that guided models significantly improve interpretability, i.e., they attend to modalities similarly to how humans rate the modality informativeness, while at the same time showing a slight increase in predictive performance. (A rough sketch of the guided-fusion idea appears after this list.)
  4. Paaßen, Benjamin; Demmans Epp, Carrie (Eds.)
    Open-ended learning environments (OELEs) have become an important tool for promoting constructivist STEM learning. OELEs are known to promote student engagement and facilitate a deeper understanding of STEM topics. Despite their benefits, OELEs present significant challenges to novice learners who may lack the self-regulated learning (SRL) processes they need to become effective learners and problem solvers. Recent studies have revealed the importance of the relationship between students' affective states, cognitive processes, and performance in OELEs. Yet, the relations between students' use of cognitive processes and their corresponding affective states have not been studied in detail. In this paper, we investigate the relations between students' affective states and the coherence in their cognitive strategies as they work on developing causal models of scientific processes in the XYZ OELE. Our analyses and results demonstrate that there are significant differences in the coherence of cognitive strategies used by high- and low-performing students. As a result, there are also significant differences in the affective states of the high- and low-performing students that are related to the coherence of their cognitive activities. This research contributes valuable empirical evidence on students' cognitive-affective dynamics in OELEs, emphasizing the subtle ways in which students' understanding of their cognitive processes impacts their emotional reactions in learning environments.
  5. In this work, we propose a video-based transfer learning approach for predicting problem outcomes of students working with an intelligent tutoring system (ITS). By analyzing a student's face and gestures, our method predicts the outcome of a student answering a problem in an ITS from a video feed. Our work is motivated by the reasoning that the ability to predict such outcomes enables tutoring systems to adjust interventions, such as hints and encouragement, and to ultimately yield improved student learning. We collected a large labeled dataset of student interactions with an intelligent online math tutor consisting of 68 sessions, where 54 individual students solved 2,749 problems. We will release this dataset publicly upon publication of this paper. It will be available at https://www.cs.bu.edu/faculty/betke/research/learning/. Working with this dataset, our transfer-learning challenge was to design a representation in the source domain of pictures obtained “in the wild” for the task of facial expression analysis, and to transfer this learned representation to the task of human behavior prediction in the domain of webcam videos of students in a classroom environment. We developed a novel facial affect representation and a user-personalized training scheme that unlocks the potential of this representation. We designed several variants of a recurrent neural network that models the temporal structure of video sequences of students solving math problems. Our final model, named ATL-BP for Affect Transfer Learning for Behavior Prediction, achieves a relative increase in mean F-score of 50% over the state-of-the-art method on this new dataset. (A rough sketch of this recurrent setup appears after this list.)
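For record 2 above, one way to make a single classifier robust to facial occlusions is to mask low-confidence facial features and expose the mask itself as an input, so the model can lean on body features when the face is unavailable. This is only a sketch under that assumption; the synthetic data, feature sizes, and generic classifier below are invented and do not reproduce the paper's models:

```python
# Sketch for record 2 (illustrative assumptions throughout): late fusion of facial and
# body features with an occlusion mask, so the classifier can fall back on body
# features when facial features were extracted with low confidence.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(2)
n = 1500
face = rng.normal(size=(n, 16))        # hypothetical per-window facial features
body = rng.normal(size=(n, 8))         # hypothetical per-window body-pose features
face_ok = rng.random(n) > 0.3          # False when the face is occluded / low confidence
y = rng.integers(0, 2, size=n)         # synthetic affect label (e.g., engaged vs. not)

# Zero out unreliable facial features and append the mask itself as a feature,
# so the model can learn to weight body features more heavily during occlusions.
face_masked = np.where(face_ok[:, None], face, 0.0)
X = np.hstack([face_masked, body, face_ok[:, None].astype(float)])

clf = GradientBoostingClassifier(random_state=0).fit(X[:1000], y[:1000])
print("held-out accuracy on synthetic data:", clf.score(X[1000:], y[1000:]))
```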
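For record 3 above, "guiding" a model with human informativeness ratings can be sketched as an auxiliary loss that pulls the model's modality attention weights toward the annotated ratings while the main loss still trains the affect classifier. Everything below (dimensions, the 0.5 loss weight, the GuidedFusion module, the synthetic batch) is an illustrative assumption rather than the paper's architecture:

```python
# Sketch for record 3: attention weights over vision, language, and acoustic embeddings
# are nudged toward human modality-informativeness ratings via an auxiliary MSE loss,
# alongside the usual affect classification loss.
import torch
import torch.nn as nn

class GuidedFusion(nn.Module):
    def __init__(self, dim=32, n_classes=3):
        super().__init__()
        self.score = nn.Linear(dim, 1)                # one attention score per modality
        self.classifier = nn.Linear(dim, n_classes)   # positive / aggressive / dysphoric

    def forward(self, modalities):                    # modalities: (batch, 3, dim)
        attn = torch.softmax(self.score(modalities).squeeze(-1), dim=-1)  # (batch, 3)
        fused = (attn.unsqueeze(-1) * modalities).sum(dim=1)              # (batch, dim)
        return self.classifier(fused), attn

torch.manual_seed(0)
model = GuidedFusion()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(64, 3, 32)                            # vision, language, acoustic embeddings
y = torch.randint(0, 3, (64,))                        # affect labels
human = torch.softmax(torch.randn(64, 3), dim=-1)     # annotated informativeness per example

logits, attn = model(x)
loss = nn.functional.cross_entropy(logits, y) + 0.5 * nn.functional.mse_loss(attn, human)
loss.backward()
opt.step()
print("combined classification + guidance loss:", loss.item())
```

The guidance term is what makes the attention weights interpretable against the human ratings; the classification term keeps predictive performance from degrading.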
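For record 5 above, the recurrent part of the described approach can be sketched as a GRU over per-frame affect features that emits a single logit for whether the student answers the current problem correctly. The sizes, the plain GRU, and the synthetic batch below are assumptions for illustration, not the ATL-BP model itself:

```python
# Sketch for record 5 (not the ATL-BP implementation): per-frame affect features from
# a webcam clip feed a GRU whose final hidden state predicts the problem outcome.
import torch
import torch.nn as nn

class OutcomePredictor(nn.Module):
    def __init__(self, feat_dim=64, hidden=128):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, frames):           # frames: (batch, time, feat_dim)
        _, h = self.rnn(frames)          # h: (1, batch, hidden), final hidden state
        return self.head(h.squeeze(0))   # one logit per clip: "answers correctly"

torch.manual_seed(0)
model = OutcomePredictor()
clips = torch.randn(8, 90, 64)           # 8 synthetic clips, 90 frames of affect features each
labels = torch.randint(0, 2, (8, 1)).float()
loss = nn.functional.binary_cross_entropy_with_logits(model(clips), labels)
loss.backward()
print("training loss on synthetic batch:", loss.item())
```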