Abstract Recent advances in generative artificial intelligence (AI) and multimodal learning analytics (MMLA) have allowed for new and creative ways of leveraging AI to support K12 students' collaborative learning in STEM+C domains. To date, there is little evidence of AI methods supporting students' collaboration in complex, open‐ended environments. AI systems are known to underperform humans in (1) interpreting students' emotions in learning contexts, (2) grasping the nuances of social interactions and (3) understanding domain‐specific information that was not well‐represented in the training data. As such, combined human and AI (ie, hybrid) approaches are needed to overcome the current limitations of AI systems. In this paper, we take a first step towards investigating how a human‐AI collaboration between teachers and researchers using an AI‐generated multimodal timeline can guide and support teachers' feedback while addressing students' STEM+C difficulties as they work collaboratively to build computational models and solve problems. In doing so, we present a framework characterizing the human component of our human‐AI partnership as a collaboration between teachers and researchers. To evaluate our approach, we present our timeline to a high school teacher and discuss the key insights gleaned from our discussions. Our case study analysis reveals the effectiveness of an iterative approach to using human‐AI collaboration to address students' STEM+C challenges: the teacher can use the AI‐generated timeline to guide formative feedback for students, and the researchers can leverage the teacher's feedback to help improve the multimodal timeline. Additionally, we characterize our findings with respect to two events of interest to the teacher: (1) when the students cross a difficulty threshold, and (2) the point of intervention, that is, when the teacher (or system) should intervene to provide effective feedback.
It is important to note that the teacher explained that there should be a lag between (1) and (2) to give students a chance to resolve their own difficulties. Typically, such a lag is not implemented in computer‐based learning environments that provide feedback.

Practitioner notes

What is already known about this topic
- Collaborative, open‐ended learning environments enhance students' STEM+C conceptual understanding and practice, but they introduce additional complexities when students learn concepts spanning multiple domains.
- Recent advances in generative AI and MMLA allow for integrating multiple datastreams to derive holistic views of students' states, which can support more informed feedback mechanisms to address students' difficulties in complex STEM+C environments.
- Hybrid human‐AI approaches can help address collaborating students' STEM+C difficulties by combining the domain knowledge, emotional intelligence and social awareness of human experts with the general knowledge and efficiency of AI.

What this paper adds
- We extend a previous human‐AI collaboration framework using a hybrid intelligence approach to characterize the human component of the partnership as a researcher‐teacher partnership and present our approach as a teacher‐researcher‐AI collaboration.
- We adapt an AI‐generated multimodal timeline to actualize our human‐AI collaboration by pairing the timeline with videos of students encountering difficulties, engaging in active discussions with a high school teacher while watching the videos to discern the timeline's utility in the classroom.
- From our discussions with the teacher, we define two types of inflection points to address students' STEM+C difficulties—the difficulty threshold and the intervention point—and discuss how the feedback latency interval separating them can inform educator interventions.
- We discuss two ways in which our teacher‐researcher‐AI collaboration can help teachers support students encountering STEM+C difficulties: (1) teachers using the multimodal timeline to guide feedback for students, and (2) researchers using teachers' input to iteratively refine the multimodal timeline.

Implications for practice and/or policy
- Our case study suggests that timeline gaps (ie, disengaged behaviour identified by off‐screen students, pauses in discourse and lulls in environment actions) are particularly important for identifying inflection points and formulating formative feedback.
- Human‐AI collaboration exists on a dynamic spectrum and requires varying degrees of human control and AI automation depending on the context of the learning task and students' work in the environment.
- Our analysis of this human‐AI collaboration using a multimodal timeline can be extended in the future to support students and teachers in additional ways, for example, designing pedagogical agents that interact directly with students, developing intervention and reflection tools for teachers, helping teachers craft daily lesson plans and aiding teachers and administrators in designing curricula.
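The timeline gaps described above (pauses in discourse and lulls in environment actions) can be operationalized directly from timestamped event logs. The following is a minimal sketch, not the authors' system: the event format and the 30-second lull threshold are illustrative assumptions, with the start of a long gap serving as a candidate difficulty threshold and any intervention deliberately delayed to respect the feedback latency interval.

```python
def find_timeline_gaps(event_times, min_gap=30.0):
    """Flag lulls in a multimodal timeline: spans longer than `min_gap`
    seconds with no discourse or environment-action events.

    `event_times` is a list of event timestamps in seconds. The threshold
    value is an illustrative assumption, not a value from the paper.
    """
    times = sorted(event_times)
    # A returned (start, end) pair marks a silent span; `start` is a
    # candidate difficulty threshold, and an intervention point would
    # follow only after some feedback latency.
    return [(a, b) for a, b in zip(times, times[1:]) if b - a >= min_gap]

# Events at 0, 12, and 15 s, then nothing until 58 s: one 43 s gap.
gaps = find_timeline_gaps([0, 12, 15, 58, 60])  # → [(15, 58)]
```

In practice such a detector would run per modality (speech, environment actions, gaze) and combine the results, since a lull in only one stream may still reflect productive work in another.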
-
Abstract This paper provides an experience report on a co‐design approach with teachers to co‐create learning analytics‐based technology to support problem‐based learning in middle school science classrooms. We have mapped out a workflow for such applications and developed design narratives to investigate the implementation, modifications and temporal roles of the participants in the design process. Our results provide precedent knowledge on co‐designing with experienced and novice teachers and co‐constructing actionable insight that can help teachers engage more effectively with their students' learning and problem‐solving processes during classroom PBL implementations.

Practitioner notes

What is already known about this topic
- Success of educational technology depends in large part on the technology's alignment with teachers' goals for their students, teaching strategies and classroom context.
- Teacher and researcher co‐design of educational technology and supporting curricula has proven to be an effective way for integrating teacher insight and supporting their implementation needs.
- Co‐designing learning analytics and support technologies with teachers is difficult due to differences in design and development goals, workplace norms, and AI‐literacy and learning analytics background of teachers.

What this paper adds
- We provide a co‐design workflow for middle school teachers that centres on co‐designing and developing actionable insights to support problem‐based learning (PBL) by systematic development of responsive teaching practices using AI‐generated learning analytics.
- We adapt established human‐computer interaction (HCI) methods to tackle the complex task of classroom PBL implementation, working with experienced and novice teachers to create a learning analytics dashboard for a PBL curriculum.
- We demonstrate researcher and teacher roles and needs in ensuring co‐design collaboration and the co‐construction of actionable insight to support middle school PBL.

Implications for practice and/or policy
- Learning analytics researchers will be able to use the workflow as a tool to support their PBL co‐design processes.
- Learning analytics researchers will be able to apply adapted HCI methods for effective co‐design processes.
- Co‐design teams will be able to pre‐emptively prepare for the difficulties and needs of teachers when integrating middle school teacher feedback during the co‐design process in support of PBL technologies.
-
Mavrikis, M; Lalle, S; Azevedo, R; Biswas, G; Roll, I (Ed.)
Exploratory learning environments (ELEs), such as simulation-based platforms and open-ended science curricula, promote hands-on exploration and problem-solving but make it difficult for teachers to gain timely insights into students' conceptual understanding. This paper presents LearnLens, a generative AI (GenAI)-enhanced teacher-facing dashboard designed to support problem-based instruction in middle school science. LearnLens processes students' open-ended responses from digital assessments to provide various insights, including sample responses, word clouds, bar charts, and AI-generated summaries. These features elucidate students' thinking, enabling teachers to adjust their instruction based on emerging patterns of understanding. The dashboard was informed by teacher input during professional development sessions and implemented within a middle school Earth science curriculum. We report insights from teacher interviews that highlight the dashboard's usability and potential to guide teachers' instruction in the classroom.
Free, publicly-accessible full text available July 26, 2026
-
Recently, there has been a surge in developing curricula and tools that integrate computing (C) into Science, Technology, Engineering, and Math (STEM) programs. These environments foster authentic problem-solving while facilitating students' concurrent learning of STEM+C content. In our study, we analyzed students' behaviors as they worked in pairs to create computational kinematics models of object motion. We derived a domain-specific metric from students' collaborative dialogue that measured how they integrated science and computing concepts into their problem-solving tasks. Additionally, we computed social metrics such as equity and turn-taking based on the students' dialogue. We identified and characterized students' planning, enacting, monitoring, and reflecting behaviors as they worked together on their model construction tasks. This study investigates the impact of students' collaborative behaviors on their performance in STEM+C computational modeling tasks. By analyzing the relationships between group synergy, turn-taking, and equity measures with task performance, we provide insights into how these collaborative behaviors influence students' ability to construct accurate models. Our findings underscore the importance of synergistic discourse for overall task success, particularly during the enactment, monitoring, and reflection phases. Conversely, variations in equity and turn-taking have a minimal impact on segment-level task performance.
Free, publicly-accessible full text available July 1, 2026
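Social metrics like the ones named above can be computed from little more than an ordered list of speaker turns. This is a hedged sketch under stated assumptions: the paper does not give its exact formulations, so the definitions below (equity as the min/max ratio of speakers' turn shares, turn-taking as the speaker-switch rate) are common stand-ins, and the function name is hypothetical.

```python
from collections import Counter

def turn_taking_and_equity(turns):
    """Compute illustrative turn-taking and equity measures from a dialogue.

    `turns` is a list of speaker labels in chronological order, e.g.
    ["A", "A", "B", "A"]. Both definitions are standard stand-ins,
    not the paper's exact metrics.
    """
    counts = Counter(turns)
    # Equity: least-active speaker's turn count over most-active
    # speaker's (1.0 = perfectly balanced participation).
    equity = min(counts.values()) / max(counts.values())
    # Turn-taking: fraction of adjacent turn pairs where the speaker
    # changes (1.0 = strict alternation, 0.0 = one long monologue).
    switches = sum(1 for a, b in zip(turns, turns[1:]) if a != b)
    turn_taking = switches / (len(turns) - 1) if len(turns) > 1 else 0.0
    return equity, turn_taking

equity, tt = turn_taking_and_equity(["A", "A", "B", "A", "B", "B"])
# equity → 1.0 (3 turns each); tt → 0.6 (3 switches in 5 pairs)
```

Segment-level analysis, as in the study, would apply these per planning/enacting/monitoring/reflecting segment rather than over the whole session.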
-
Zhai, X; Latif, E; Liu, N; Biswas, G; Yin, Y (Ed.)
Collaborative dialogue offers rich insights into students' learning and critical thinking, which is essential for personalizing pedagogical agent interactions in STEM+C settings. While large language models (LLMs) facilitate dynamic pedagogical interactions, hallucinations undermine confidence, trust, and instructional value. Retrieval-augmented generation (RAG) grounds LLM outputs in curated knowledge, but requires a clear semantic link between user input and a knowledge base, which is often weak in student dialogue. We propose log-contextualized RAG (LC-RAG), which enhances RAG retrieval by using the environment logs to contextualize collaborative discourse. Our findings show that LC-RAG improves retrieval over a discourse-only baseline and allows our collaborative peer agent, Copa, to deliver relevant, personalized guidance that supports students' critical thinking and epistemic decision-making in a collaborative computational modeling environment, C2STEM.
Free, publicly-accessible full text available June 17, 2026
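The core idea of log contextualization, augmenting the retrieval query with environment events so that vague student talk still matches the knowledge base, can be sketched in a few lines. This toy uses word-overlap scoring where a real pipeline would use dense embeddings; the function names, knowledge base, and scoring are all illustrative assumptions, not the LC-RAG implementation.

```python
def retrieve(dialogue, log_events, knowledge_base, k=1):
    """Toy sketch of log-contextualized retrieval: the query combines
    student discourse with recent environment-log events before being
    matched against the knowledge base.

    Word-overlap scoring stands in for the dense retrieval a real RAG
    pipeline would use.
    """
    # Augment the (often vague) dialogue with log-derived context terms.
    query = set((dialogue + " " + " ".join(log_events)).lower().split())
    ranked = sorted(
        knowledge_base,
        key=lambda doc: len(query & set(doc.lower().split())),
        reverse=True,
    )
    return ranked[:k]

kb = ["velocity must be set in the simulation step",
      "ball sprites need a costume"]
# "it is not working" alone matches nothing; the log events
# ("edited velocity block", "ran simulation") disambiguate the query.
hits = retrieve("it is not working", ["edited velocity block", "ran simulation"], kb)
```

The example illustrates the failure mode the abstract names: the semantic link between raw student dialogue and the knowledge base is weak, and the logs supply the missing domain terms.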
-
Collaborative problem-solving (CPS) in STEM+C education involves cognitive coordination and emotional regulation during joint tasks. Prior research has examined discrete affective states in learning environments, but less is known about how these emotions evolve over time and affect CPS behavior. This study investigates the temporal dynamics of five emotions—engagement, confusion, boredom, delight, and frustration—using Markov Chain analysis of data from high school pairs building computational models in the C2STEM environment. We aligned emotional transitions with cognitive processes, captured in interaction patterns such as PLAY, ADJUST, and BUILD, to analyze affect during modeling. Results show that emotional trajectories closely relate to cognitive actions, including construction, simulation testing, and debugging. Transitions that maintained engagement were linked to productive collaboration and stronger performance, while persistent frustration and boredom indicated disengagement.
Free, publicly-accessible full text available June 10, 2026
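The Markov Chain analysis above rests on estimating transition probabilities between the five emotion labels. A minimal sketch of that estimator follows, assuming a chronological sequence of labels; the row-normalized bigram count is the standard first-order estimate, not necessarily the authors' exact pipeline.

```python
EMOTIONS = ["engagement", "confusion", "boredom", "delight", "frustration"]

def transition_matrix(sequence):
    """Estimate first-order Markov transition probabilities from a
    chronological sequence of emotion labels.

    Returns a row-stochastic matrix M where M[i][j] is the estimated
    probability of moving from EMOTIONS[i] to EMOTIONS[j]. Rows for
    emotions never observed as a source state are left as zeros.
    """
    idx = {e: i for i, e in enumerate(EMOTIONS)}
    n = len(EMOTIONS)
    counts = [[0.0] * n for _ in range(n)]
    # Count observed bigrams (state at t, state at t+1).
    for a, b in zip(sequence, sequence[1:]):
        counts[idx[a]][idx[b]] += 1
    # Normalize each row into a probability distribution.
    for row in counts:
        total = sum(row)
        if total:
            for j in range(n):
                row[j] /= total
    return counts

M = transition_matrix(["engagement", "confusion", "engagement", "engagement"])
# M[0] (from engagement): 0.5 to engagement, 0.5 to confusion
```

Self-transitions that keep students in engagement (high M[0][0]) are the "maintained engagement" patterns the study links to stronger performance.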
-
This paper explores the design of two types of pedagogical agents—teaching and peer—in a collaborative STEM+C learning environment, C2STEM, where high school students learn physics (kinematics) and computing by building computational models that simulate the motion of objects. Through in-depth case study interviews with teachers and students, we identify role-based features for these agents to support collaborative learning in open-ended STEM+C learning environments. We propose twelve design principles—four for teaching agents, four for peer agents, and four shared by both—contributing to foundational guidelines for developing agents that enhance collaborative learning through computational modeling.
Free, publicly-accessible full text available June 10, 2026
-
This paper explores the use of large language models (LLMs) to score and explain short-answer assessments in K-12 science. While existing methods can score more structured math and computer science assessments, they often do not provide explanations for the scores. Our study focuses on employing GPT-4 for automated assessment in middle school Earth Science, combining few-shot and active learning with chain-of-thought reasoning. Using a human-in-the-loop approach, we successfully score and provide meaningful explanations for formative assessment responses. A systematic analysis of our method's pros and cons sheds light on the potential for human-in-the-loop techniques to enhance automated grading for open-ended science assessments.
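A few-shot, chain-of-thought scoring prompt of the general shape the abstract describes can be assembled as below. This is a sketch under stated assumptions: the field names, rubric wording, and example format are hypothetical, the model call itself is omitted, and nothing here reproduces the authors' actual prompt.

```python
def build_scoring_prompt(rubric, examples, response):
    """Assemble a few-shot chain-of-thought scoring prompt: a rubric,
    worked examples that show reasoning before a score, then the new
    response with its reasoning left for the model to complete.

    All field names and formatting are illustrative assumptions.
    """
    parts = [f"Rubric:\n{rubric}"]
    # Each worked example demonstrates reasoning-then-score, which is
    # what elicits an explanation alongside the grade.
    for ex in examples:
        parts.append(
            f"Student response: {ex['response']}\n"
            f"Reasoning: {ex['reasoning']}\n"
            f"Score: {ex['score']}"
        )
    # End mid-pattern so the model continues with reasoning and a score.
    parts.append(f"Student response: {response}\nReasoning:")
    return "\n\n".join(parts)

prompt = build_scoring_prompt(
    "2 points for naming erosion; 1 point for describing weathering only",
    [{"response": "wind wore the rock down",
      "reasoning": "describes weathering but does not name erosion",
      "score": 1}],
    "water erosion carved the canyon over time",
)
```

In a human-in-the-loop workflow, the examples list would grow over time as teachers correct model scores, which is one way active learning can be layered on top of this prompt shape.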
-
This paper examines the processes middle school STEM teachers employ to interpret student learning and problem-solving activities during a problem-based learning unit and then design evidence-based lesson-plan customizations. Utilizing inductive and constant comparative analysis of teachers' think-aloud data, we identify catalyzing links that support the transition from interpretation to enactment. We provide a contrasting case between an experienced and a novice teacher, and discuss how the results can inform STEM PBL professional development and teacher-support technology development.
