This content will become publicly available on April 24, 2026

Title: MathFlowLens Dashboard: Co‐Designing Teacher Orchestration Tools to Engage in Discourse Around Students' Mathematical Strategies
ABSTRACT
Background: Educational technologies typically provide teachers with analytics regarding student proficiency, but few digital tools provide teachers with process‐based information about the variable problem‐solving strategies students use as they solve problems. Utilising design thinking and co‐designing with teachers can give researchers insight into what educators need in order to make instructional decisions based on student problem‐solving data.
Objectives: This case study presents a collaboration in which researchers and teachers co‐designed MathFlowLens, a teacher‐facing dashboard that provides analytics and visualisations of the diverse problem‐solving strategies and behaviours students use when solving online math problems in the classroom.
Methods: Over several sessions, teachers discussed, mocked up, and were provided with behavioural data and strategy visualisations from students' math problem‐solving that demonstrated the variability of strategic approaches. Throughout this process, the team documented, transcribed, and used these conversations and artefacts to inform the design and development of the teacher tool.
Results and Conclusions: Teachers discussed and designed prototypes of data dashboards and provided the research team with ongoing feedback to inform the iterative development of the tool.
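The abstract stops short of implementation details, but a minimal sketch can illustrate the kind of process‐based, class‐level analytic such a dashboard might surface. Everything below (the log format, the strategy labels, and the function name) is an assumption for illustration, not the MathFlowLens implementation.

```python
# Hypothetical sketch: aggregate per-student strategy labels from
# problem-solving logs into a per-problem class summary, the kind of
# process-based view a strategy dashboard might render. The log format
# and strategy labels are invented; nothing here comes from the paper.
from collections import Counter, defaultdict

# Each log row: (student_id, problem_id, strategy_label)
logs = [
    ("s1", "p1", "count-all"),
    ("s2", "p1", "decompose"),
    ("s3", "p1", "decompose"),
    ("s1", "p2", "standard-algorithm"),
]

def strategy_summary(rows):
    """Per-problem distribution of strategies across the class."""
    per_problem = defaultdict(Counter)
    for _student, problem, strategy in rows:
        per_problem[problem][strategy] += 1
    return per_problem

for problem, counts in strategy_summary(logs).items():
    total = sum(counts.values())
    breakdown = ", ".join(f"{s}: {n}/{total}" for s, n in counts.most_common())
    print(f"{problem}: {breakdown}")  # e.g. "p1: decompose: 2/3, count-all: 1/3"
```

A real dashboard would chart these distributions rather than print them; the point is that the unit of analysis is the strategy, not a correctness score.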
Award ID(s):
2142984
PAR ID:
10618116
Author(s) / Creator(s):
 ;  ;  ;  
Publisher / Repository:
Wiley-Blackwell
Date Published:
Journal Name:
Journal of Computer Assisted Learning
Volume:
41
Issue:
3
ISSN:
0266-4909
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
1. Abstract
This paper provides an experience report on a co‐design approach with teachers to co‐create learning analytics‐based technology to support problem‐based learning in middle school science classrooms. We have mapped out a workflow for such applications and developed design narratives to investigate the implementation, modifications and temporal roles of the participants in the design process. Our results provide precedent knowledge on co‐designing with experienced and novice teachers and co‐constructing actionable insight that can help teachers engage more effectively with their students' learning and problem‐solving processes during classroom PBL implementations.
Practitioner notes
What is already known about this topic:
- Success of educational technology depends in large part on the technology's alignment with teachers' goals for their students, teaching strategies and classroom context.
- Teacher and researcher co‐design of educational technology and supporting curricula has proven to be an effective way of integrating teacher insight and supporting their implementation needs.
- Co‐designing learning analytics and support technologies with teachers is difficult due to differences in design and development goals, workplace norms, and the AI‐literacy and learning analytics background of teachers.
What this paper adds:
- We provide a co‐design workflow for middle school teachers that centres on co‐designing and developing actionable insights to support problem‐based learning (PBL) by systematic development of responsive teaching practices using AI‐generated learning analytics.
- We adapt established human‐computer interaction (HCI) methods to tackle the complex task of classroom PBL implementation, working with experienced and novice teachers to create a learning analytics dashboard for a PBL curriculum.
- We demonstrate researcher and teacher roles and needs in ensuring co‐design collaboration and the co‐construction of actionable insight to support middle school PBL.
Implications for practice and/or policy:
- Learning analytics researchers will be able to use the workflow as a tool to support their PBL co‐design processes.
- Learning analytics researchers will be able to apply adapted HCI methods for effective co‐design processes.
- Co‐design teams will be able to pre‐emptively prepare for the difficulties and needs of teachers when integrating middle school teacher feedback during the co‐design process in support of PBL technologies.
2. Abstract
Background: Real‐world engineering problems are ill‐defined and complex, and solving them may arouse negative epistemic affect (feelings experienced within problem‐solving). These feelings fall into sequenced patterns (affective pathways). Over time, these patterns can alter students' attitudes toward engineering. Meta‐affect (affect or cognition about affect) can shape or reframe affective pathways, changing a student's problem‐solving experience.
Purpose/Hypothesis(es): This paper examines epistemic affect and meta‐affect in undergraduate students solving ill‐defined problems called open‐ended modeling problems (OEMPs), addressing two research questions: What epistemic affect and transitions between different affective states do students report? And how does meta‐affect shape students' affective experiences?
Design/Method: We examined 11 retrospective interviews with nine students, conducted across two semesters, in which students completed OEMPs. Using inductive and deductive coding with discourse analysis, we systematically searched for expressions conveying epistemic affect and for transitions in affect; we performed additional deductive coding of the transcripts for meta‐affect and synthesized these results to formulate narratives related to affect and meta‐affect.
Results: Together, the expressions, transitions, and meta‐affect suggest different types of student experiences. Depending on their meta‐affect, students either recounted experiences dominated by positive or negative affect, or else they experienced negative emotions as productive.
Conclusions: Ill‐defined, complex problems elicit a wide range of positive and negative emotions and provide opportunities to practice affective regulation and productive meta‐affect. Viewing the OEMPs as authentic disciplinary experiences and/or the ability to view negative emotions as productive can enable overall positive experiences. Our results provide insight into how instructors can foster positive affective pathways through problem scaffolding or their interactions with students.
3. Abstract
Recent advances in generative artificial intelligence (AI) and multimodal learning analytics (MMLA) have allowed for new and creative ways of leveraging AI to support K12 students' collaborative learning in STEM+C domains. To date, there is little evidence of AI methods supporting students' collaboration in complex, open‐ended environments. AI systems are known to underperform humans in (1) interpreting students' emotions in learning contexts, (2) grasping the nuances of social interactions and (3) understanding domain‐specific information that was not well‐represented in the training data. As such, combined human and AI (ie, hybrid) approaches are needed to overcome the current limitations of AI systems. In this paper, we take a first step towards investigating how a human‐AI collaboration between teachers and researchers using an AI‐generated multimodal timeline can guide and support teachers' feedback while addressing students' STEM+C difficulties as they work collaboratively to build computational models and solve problems. In doing so, we present a framework characterizing the human component of our human‐AI partnership as a collaboration between teachers and researchers. To evaluate our approach, we present our timeline to a high school teacher and discuss the key insights gleaned from our discussions. Our case study analysis reveals the effectiveness of an iterative approach to using human‐AI collaboration to address students' STEM+C challenges: the teacher can use the AI‐generated timeline to guide formative feedback for students, and the researchers can leverage the teacher's feedback to help improve the multimodal timeline. Additionally, we characterize our findings with respect to two events of interest to the teacher: (1) when the students cross a difficulty threshold, and (2) the point of intervention, that is, when the teacher (or system) should intervene to provide effective feedback. It is important to note that the teacher explained that there should be a lag between (1) and (2) to give students a chance to resolve their own difficulties. Typically, such a lag is not implemented in computer‐based learning environments that provide feedback. (A toy sketch of this timing logic appears after the practitioner notes below.)
Practitioner notes
What is already known about this topic:
- Collaborative, open‐ended learning environments enhance students' STEM+C conceptual understanding and practice, but they introduce additional complexities when students learn concepts spanning multiple domains.
- Recent advances in generative AI and MMLA allow for integrating multiple datastreams to derive holistic views of students' states, which can support more informed feedback mechanisms to address students' difficulties in complex STEM+C environments.
- Hybrid human‐AI approaches can help address collaborating students' STEM+C difficulties by combining the domain knowledge, emotional intelligence and social awareness of human experts with the general knowledge and efficiency of AI.
What this paper adds:
- We extend a previous human‐AI collaboration framework using a hybrid intelligence approach to characterize the human component of the partnership as a researcher‐teacher partnership and present our approach as a teacher‐researcher‐AI collaboration.
- We adapt an AI‐generated multimodal timeline to actualize our human‐AI collaboration by pairing the timeline with videos of students encountering difficulties, engaging in active discussions with a high school teacher while watching the videos to discern the timeline's utility in the classroom.
- From our discussions with the teacher, we define two types of inflection points (the difficulty threshold and the intervention point) to address students' STEM+C difficulties, and discuss how the feedback latency interval separating them can inform educator interventions.
- We discuss two ways in which our teacher‐researcher‐AI collaboration can help teachers support students encountering STEM+C difficulties: (1) teachers using the multimodal timeline to guide feedback for students, and (2) researchers using teachers' input to iteratively refine the multimodal timeline.
Implications for practice and/or policy:
- Our case study suggests that timeline gaps (ie, disengaged behaviour identified by off‐screen students, pauses in discourse and lulls in environment actions) are particularly important for identifying inflection points and formulating formative feedback.
- Human‐AI collaboration exists on a dynamic spectrum and requires varying degrees of human control and AI automation depending on the context of the learning task and students' work in the environment.
- Our analysis of this human‐AI collaboration using a multimodal timeline can be extended in the future to support students and teachers in additional ways, for example, designing pedagogical agents that interact directly with students, developing intervention and reflection tools for teachers, helping teachers craft daily lesson plans and aiding teachers and administrators in designing curricula.
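The timing behaviour the teacher asked for (intervene only after a lag that gives students a chance to resolve difficulties themselves) is easy to sketch. This is a hypothetical illustration, not the paper's system: the difficulty scores, the threshold value and the lag length are all invented.

```python
# Hypothetical sketch of the difficulty-threshold / intervention-point lag.
# A per-step score stream stands in for the timeline's difficulty signal.

THRESHOLD = 0.7  # difficulty level marking the "difficulty threshold" event
LAG = 3          # feedback latency interval, in timeline steps

def intervention_point(scores, threshold=THRESHOLD, lag=LAG):
    """Return the step at which to intervene, or None.

    Intervene only if difficulty first crosses the threshold and is still
    above it `lag` steps later, i.e. the students did not resolve the
    difficulty on their own during the latency interval.
    """
    for t, score in enumerate(scores):
        if score >= threshold:
            check = t + lag
            if check < len(scores) and scores[check] >= threshold:
                return check
            return None  # resolved (or timeline ended) during the lag
    return None

# Difficulty spikes at step 2 and persists, so intervene at step 5:
print(intervention_point([0.2, 0.4, 0.8, 0.9, 0.85, 0.8, 0.3]))  # -> 5
```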
4. Developers rarely build programming environments that help secondary teachers support student learning. We interviewed 11 K12 teachers to discover how they support students learning to program and how tools might assist their teaching practice. Based on thematic analysis, and by organizing teacher activities around student actions, we derived a new framework for designing programming learning systems that support teachers. Our results suggest that teachers structure their activities around their ideals about effective programming teaching and learning and around students' problem-solving and help-seeking processes. The framework therefore relates the themes we discovered about teacher activities to those ideals and to student problem solving over time, and it can inform the design of new programming learning systems.
5. Abstract
Background: Teachers often rely on open‐ended questions to assess students' conceptual understanding of assigned content. Particularly in the context of mathematics, teachers use these types of questions to gain insight into the processes and strategies students adopt in solving mathematical problems, beyond what is possible through more closed‐ended problem types. While these types of problems are valuable to teachers, the variation in student responses makes them difficult and time‐consuming to evaluate and to provide directed feedback on. It is well established that feedback, both as a numeric score and, more importantly, in the form of teacher‐authored comments, can help guide students in how to improve, leading to increased learning. It is for this reason that teachers need better support, not only for assessing students' work but also for providing meaningful and directed feedback to students.
Objectives: In this paper, we seek to develop, evaluate, and examine machine learning models that support automated open-response assessment and feedback.
Methods: We build upon prior research on the automatic assessment of student responses to open‐ended problems and introduce a novel approach that leverages student log data combined with machine learning and natural language processing methods. Utilizing sentence‐level semantic representations of student responses to open‐ended questions, we propose a collaborative filtering‐based approach to both predict student scores and recommend appropriate feedback messages for teachers to send to their students.
Results and Conclusion: We find that our method outperforms previously published benchmarks across three different metrics for the task of predicting student performance. Through an error analysis, we identify several areas where future work may improve upon our approach.
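As a rough illustration of the method this abstract describes, the sketch below does neighbour‐based, collaborative‐filtering‐style score prediction: a new response's score is the similarity‐weighted average of the most similar already‐graded responses. TF‐IDF vectors stand in for the paper's sentence‐level semantic representations, and the responses, scores, and function name are invented.

```python
# Hypothetical sketch, not the authors' model: predict a score for a new
# open response from the k most similar graded responses.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

graded = [  # (response text, teacher-assigned score) -- toy data
    ("I multiplied both sides by 2 to isolate x", 4),
    ("I guessed and checked until it worked", 2),
    ("Multiply each side by two, then x equals six", 4),
    ("I don't know", 0),
]
texts, scores = zip(*graded)

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(texts)  # one row per graded response

def predict_score(response, k=2):
    """Similarity-weighted mean score of the k nearest graded responses."""
    sims = cosine_similarity(vectorizer.transform([response]), matrix).ravel()
    top = np.argsort(sims)[-k:]          # indices of the k most similar rows
    weights = sims[top]
    if weights.sum() == 0:               # no overlap with any graded response
        return float(np.mean(scores))
    return float(np.dot(weights, np.array(scores)[top]) / weights.sum())

print(round(predict_score("I multiplied both sides by two"), 2))
```

The same neighbourhood could drive the feedback half of the task: reuse the teacher comment attached to the nearest graded response as a suggested message, which is the recommendation flavour the abstract's collaborative‐filtering framing implies.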