Title: Leveraging complexity frameworks to refine theories of engagement: Advancing self‐regulated learning in the age of artificial intelligence
Abstract Capturing evidence for dynamic changes in self‐regulated learning (SRL) behaviours resulting from interventions is challenging for researchers. In the current study, we identified students who were likely to do poorly in a biology course and those who were likely to do well. Then, we randomly assigned a portion of the students predicted to perform poorly to a science of learning to learn intervention where they were taught SRL study strategies. Learning outcome and log data (257K events) were collected from n = 226 students. We used a complex systems framework to model the differences in SRL, including the amount, interrelatedness, density and regularity of engagement captured in digital trace data (ie, logs). Differences were compared between students who were predicted to (1) perform poorly (control, n = 48), (2) perform poorly and received the intervention (treatment, n = 95) and (3) perform well (not flagged, n = 83). Results indicated that the regularity of students' engagement was predictive of course grade, and that the intervention group exhibited increased regularity in engagement over the control group immediately after the intervention and maintained that increase over the course of the semester. We discuss the implications of these findings in relation to the future of artificial intelligence and potential uses for monitoring student learning in online environments.
Practitioner notes
What is already known about this topic
- Self‐regulated learning (SRL) knowledge and skills are strong predictors of postsecondary STEM student success.
- SRL is a dynamic, temporal process that leads to purposeful student engagement.
- Methods and metrics for measuring dynamic SRL behaviours in learning contexts are needed.
What this paper adds
- A Markov process for measuring dynamic SRL processes using log data.
- Evidence that dynamic, interaction‐dominant aspects of SRL predict student achievement.
- Evidence that SRL processes can be meaningfully impacted through educational intervention.
Implications for theory and practice
- Complexity approaches inform theory and measurement of dynamic SRL processes.
- Static representations of dynamic SRL processes are promising learning analytics metrics.
- Engineered features of LMS usage are valuable contributions to AI models.
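The abstract describes modelling SRL engagement as a Markov process over LMS log events and using the regularity of that engagement as a predictor of course grade. As a rough illustration only (the event categories, the entropy‐based regularity score and the sample sequence below are assumptions for this sketch, not the authors' published method), a first‐order transition matrix and a simple regularity index could be derived from an ordered event log like this:

```python
# Rough sketch (not the authors' implementation): estimate a first-order
# Markov transition matrix from an ordered list of LMS event categories and
# summarize its regularity as 1 minus the normalized mean row entropy.
# Event labels below are invented examples.
from collections import Counter
import math

def transition_matrix(events):
    """First-order transition probabilities between consecutive events."""
    states = sorted(set(events))
    counts = Counter(zip(events, events[1:]))
    matrix = {}
    for src in states:
        row_total = sum(counts[(src, dst)] for dst in states)
        matrix[src] = {dst: (counts[(src, dst)] / row_total if row_total else 0.0)
                       for dst in states}
    return matrix

def regularity(matrix):
    """1.0 means every transition is fully predictable; 0.0 means uniform randomness."""
    n = len(matrix)
    if n < 2:
        return 1.0
    row_entropies = [-sum(p * math.log(p, n) for p in row.values() if p > 0)
                     for row in matrix.values()]
    return 1.0 - sum(row_entropies) / len(row_entropies)

log = ["view_page", "watch_video", "take_quiz",
       "view_page", "watch_video", "take_quiz", "view_page"]
print(regularity(transition_matrix(log)))  # 1.0 for this perfectly regular sequence
```

In this sketch, a score near 1 means a student's next LMS action is highly predictable from the current one; the published analysis may operationalize regularity differently.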
Award ID(s):
1821601
PAR ID:
10419812
Author(s) / Creator(s):
 ;  ;  
Publisher / Repository:
Wiley-Blackwell
Date Published:
Journal Name:
British Journal of Educational Technology
Volume:
54
Issue:
5
ISSN:
0007-1013
Page Range / eLocation ID:
p. 1204-1221
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. With increasing interest in computer‐assisted education, AI‐integrated systems become highly applicable with their ability to adapt based on user interactions. In this context, this paper focuses on understanding and analysing first‐year undergraduate student responses to an intelligent educational system that applies multi‐agent reinforcement learning as an AI tutor. With human–computer interaction at the centre, we discuss principles of interface design and educational gamification in the context of multiple years of student observations, student feedback surveys and focus group interviews. We show positive feedback from the design methodology we discuss as well as the overall process of providing automated tutoring in a gamified virtual environment. We also discuss students' thinking in the context of gamified educational systems, as well as unexpected issues that may arise when implementing such systems. Ultimately, our design iterations and analysis both offer new insights for practical implementation of computer‐assisted educational systems, focusing on how AI can augment, rather than replace, human intelligence in the classroom.
Practitioner notes
What is already known about this topic
- AI‐integrated systems show promise for personalizing learning and improving student education.
- Existing research has shown the value of personalized learner feedback.
- Engaged students learn more effectively.
What this paper adds
- Student opinions of and responses to an HCI‐based personalized educational system.
- New insights for practical implementation of AI‐integrated educational systems informed by years of student observations and system improvements.
- Qualitative insights into system design to improve human–computer interaction in educational systems.
Implications for practice and/or policy
- Actionable design principles for computer‐assisted tutoring systems derived from first‐hand student feedback and observations.
- Encourage new directions for human–computer interaction in educational systems.
  2. Abstract Preparing preservice teachers (PSTs) to be able to notice, interpret, respond to and orchestrate student ideas—the core practices of responsive teaching—is a key goal for contemporary science and mathematics teacher education. This mixed‐methods study, employing a virtual reality (VR)‐supported simulation integrated with artificial intelligence (AI)‐powered virtual students, explored the frequent patterns of PSTs' talk moves as they attempted to orchestrate a responsive discussion, as well as the affordances and challenges of leveraging AI‐supported virtual simulation to enhance PSTs' responsive teaching skills. Sequential analysis of the talk moves of both PSTs (n = 24) and virtual students indicated that although PSTs did employ responsive talk moves, they encountered difficulties in transitioning from the authoritative, teacher‐centred teaching approach to a responsive way of teaching. The qualitative analysis with triangulated dialogue transcripts, observational field notes and semi‐structured interviews revealed participants' engagement in (1) orchestrating discussion by leveraging the design features of AI‐supported simulation, (2) iterative rehearsals through naturalistic and contextualized interactions and (3) exploring realism and boundaries in AI‐powered virtual students. The study findings provide insights into the potential of leveraging AI‐supported virtual simulation to improve PSTs' responsive teaching skills. The study also underscores the need for PSTs to engage in well‐designed pedagogical practices with adaptive and in situ support.
Practitioner notes
What is already known about this topic
- Developing the teaching capacity of responsive teaching is an important goal for preservice teacher (PST) education. PSTs need systematic opportunities to build fluency in this approach.
- Virtual simulations can provide PSTs with the opportunities to practice interactive teaching and have been shown to improve their teaching skills.
- Artificial intelligence (AI)‐powered virtual students can be integrated into virtual simulations to enable interactive and authentic practice of teaching.
What this paper adds
- AI‐supported simulation has the potential to support PSTs' responsive teaching skills.
- While PSTs enact responsive teaching talk moves, they struggle to enact those talk moves in challenging teaching scenarios due to limited epistemic and pedagogical resources.
- AI‐supported simulation affords iterative and contextualized opportunities for PSTs to practice responsive teaching talk moves; it challenges teachers to analyse student discourse and respond in real time.
Implications for practice and/or policy
- PSTs should build a teaching repertoire with both basic and advanced responsive talk moves.
- The learning module should adapt to PSTs' prior experience and provide PSTs with in situ learning support to navigate challenging teaching scenarios.
- Integrating interaction features and AI‐based virtual students into the simulation can facilitate PSTs' active participation.
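For readers unfamiliar with the sequential analysis of coded talk moves mentioned in this record, the toy sketch below tallies which virtual‐student moves immediately follow each PST talk move. The coding scheme and transcript are invented for illustration; the paper's actual coding categories and sequential‐analysis statistics may differ.

```python
# Toy illustration of a lag-1 sequential tally: which virtual-student moves
# most often follow each preservice teacher (PST) talk move. The codes and
# transcript are hypothetical examples, not the study's data.
from collections import Counter

coded_transcript = [
    ("PST", "eliciting"),  ("student", "sharing_idea"),
    ("PST", "probing"),    ("student", "elaborating"),
    ("PST", "evaluating"), ("student", "short_answer"),
    ("PST", "probing"),    ("student", "elaborating"),
]

# Count PST move -> immediately following student move.
transitions = Counter(
    (prev_move, next_move)
    for (prev_role, prev_move), (next_role, next_move)
    in zip(coded_transcript, coded_transcript[1:])
    if prev_role == "PST" and next_role == "student"
)

for (pst_move, student_move), count in transitions.most_common():
    print(f"{pst_move:>10} -> {student_move:<13} {count}")
```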
  3. Abstract Much attention in constructionism has focused on designing tools and activities that support learners in designing fully finished and functional applications and artefacts to be shared with others. But helping students learn to debug their applications often takes on a surprisingly more instructionist stance by giving them checklists, teaching them strategies or providing them with test programmes. The idea of designing bugs for learning—or debugging by design—makes learners agents of their own learning and, more importantly, of making and solving mistakes. In this paper, we report on our implementation of ‘Debugging by Design’ activities in a high school classroom over a period of 8 hours as part of an electronic textiles unit. Students were tasked to craft electronic textile artefacts with problems or bugs for their peers to solve. Drawing on observations and interviews, we answer the following research questions: (1) How did students participate in making bugs for others? (2) What did students gain from designing and solving bugs for others? In the discussion, we address the opportunities and challenges that designing personally and socially meaningful failure artefacts provides for becoming objects‐to‐think‐with and objects‐to‐share‐with in student learning and promoting new directions in constructionism.
Practitioner notes
What is already known about this topic
- There is substantial evidence for the benefits of learning programming and debugging in the context of constructing personally relevant and complex artefacts, including electronic textiles.
- Relatedly, work on productive failure has demonstrated that providing learners with strategically difficult problems (in which they ‘fail’) equips them to better handle subsequent challenges.
What this paper adds
- We argue that designing bugs or ‘failure artefacts’ is as much a constructionist approach to learning as is designing fully functional artefacts.
- We consider how ‘failure artefacts’ can be both objects‐to‐learn‐with and objects‐to‐share‐with.
- We introduce the concept of ‘Debugging by Design’ (DbD) as a means to expand application of constructionism to the context of developing ‘failure artefacts’.
Implications for practice and/or policy
- We conceptualise a new way to enable and empower students in debugging—by designing creative, multimodal buggy projects for others to solve.
- The DbD approach may support students in near‐transfer of debugging and the beginning of a more systematic approach to debugging in later projects, and should be explored in other domains beyond e‐textiles.
- New studies should explore learning, design and teaching that empower students to design bugs in projects in mischievous and creative ways.
  4. Abstract Recent advances in generative artificial intelligence (AI) and multimodal learning analytics (MMLA) have allowed for new and creative ways of leveraging AI to support K12 students' collaborative learning in STEM+C domains. To date, there is little evidence of AI methods supporting students' collaboration in complex, open‐ended environments. AI systems are known to underperform humans in (1) interpreting students' emotions in learning contexts, (2) grasping the nuances of social interactions and (3) understanding domain‐specific information that was not well‐represented in the training data. As such, combined human and AI (ie, hybrid) approaches are needed to overcome the current limitations of AI systems. In this paper, we take a first step towards investigating how a human‐AI collaboration between teachers and researchers using an AI‐generated multimodal timeline can guide and support teachers' feedback while addressing students' STEM+C difficulties as they work collaboratively to build computational models and solve problems. In doing so, we present a framework characterizing the human component of our human‐AI partnership as a collaboration between teachers and researchers. To evaluate our approach, we present our timeline to a high school teacher and discuss the key insights gleaned from our discussions. Our case study analysis reveals the effectiveness of an iterative approach to using human‐AI collaboration to address students' STEM+C challenges: the teacher can use the AI‐generated timeline to guide formative feedback for students, and the researchers can leverage the teacher's feedback to help improve the multimodal timeline. Additionally, we characterize our findings with respect to two events of interest to the teacher: (1) when the students cross a difficulty threshold, and (2) the point of intervention, that is, when the teacher (or system) should intervene to provide effective feedback. It is important to note that the teacher explained that there should be a lag between (1) and (2) to give students a chance to resolve their own difficulties. Typically, such a lag is not implemented in computer‐based learning environments that provide feedback.
Practitioner notes
What is already known about this topic
- Collaborative, open‐ended learning environments enhance students' STEM+C conceptual understanding and practice, but they introduce additional complexities when students learn concepts spanning multiple domains.
- Recent advances in generative AI and MMLA allow for integrating multiple datastreams to derive holistic views of students' states, which can support more informed feedback mechanisms to address students' difficulties in complex STEM+C environments.
- Hybrid human‐AI approaches can help address collaborating students' STEM+C difficulties by combining the domain knowledge, emotional intelligence and social awareness of human experts with the general knowledge and efficiency of AI.
What this paper adds
- We extend a previous human‐AI collaboration framework using a hybrid intelligence approach to characterize the human component of the partnership as a researcher‐teacher partnership and present our approach as a teacher‐researcher‐AI collaboration.
- We adapt an AI‐generated multimodal timeline to actualize our human‐AI collaboration by pairing the timeline with videos of students encountering difficulties, engaging in active discussions with a high school teacher while watching the videos to discern the timeline's utility in the classroom.
- From our discussions with the teacher, we define two types of inflection points to address students' STEM+C difficulties—the difficulty threshold and the intervention point—and discuss how the feedback latency interval separating them can inform educator interventions.
- We discuss two ways in which our teacher‐researcher‐AI collaboration can help teachers support students encountering STEM+C difficulties: (1) teachers using the multimodal timeline to guide feedback for students, and (2) researchers using teachers' input to iteratively refine the multimodal timeline.
Implications for practice and/or policy
- Our case study suggests that timeline gaps (ie, disengaged behaviour identified by off‐screen students, pauses in discourse and lulls in environment actions) are particularly important for identifying inflection points and formulating formative feedback.
- Human‐AI collaboration exists on a dynamic spectrum and requires varying degrees of human control and AI automation depending on the context of the learning task and students' work in the environment.
- Our analysis of this human‐AI collaboration using a multimodal timeline can be extended in the future to support students and teachers in additional ways, for example, designing pedagogical agents that interact directly with students, developing intervention and reflection tools for teachers, helping teachers craft daily lesson plans and aiding teachers and administrators in designing curricula.
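To make the threshold‐and‐lag idea in this record concrete, here is a minimal illustrative sketch. The per‐minute difficulty scores, the 0.7 threshold and the 3‐minute latency are assumed values for illustration, not the system or parameters described in the paper.

```python
# Illustrative sketch only: detect when a stream of per-minute difficulty
# estimates first crosses a "difficulty threshold", then schedule the
# intervention point after a feedback latency interval so students get a
# chance to resolve the difficulty themselves. All values are assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class InflectionPoints:
    difficulty_threshold_at: int  # minute index of the threshold crossing
    intervention_point_at: int    # minute index when feedback should be offered

def find_inflection_points(scores, threshold=0.7, latency=3) -> Optional[InflectionPoints]:
    for minute, score in enumerate(scores):
        if score >= threshold:
            return InflectionPoints(minute, minute + latency)
    return None  # no difficulty detected

difficulty = [0.2, 0.4, 0.5, 0.8, 0.9, 0.6, 0.3]  # hypothetical per-minute estimates
print(find_inflection_points(difficulty))
# InflectionPoints(difficulty_threshold_at=3, intervention_point_at=6)
```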
  5. Abstract This paper provides an experience report on a co‐design approach with teachers to co‐create learning analytics‐based technology to support problem‐based learning in middle school science classrooms. We have mapped out a workflow for such applications and developed design narratives to investigate the implementation, modifications and temporal roles of the participants in the design process. Our results provide precedent knowledge on co‐designing with experienced and novice teachers and co‐constructing actionable insight that can help teachers engage more effectively with their students' learning and problem‐solving processes during classroom PBL implementations.
Practitioner notes
What is already known about this topic
- Success of educational technology depends in large part on the technology's alignment with teachers' goals for their students, teaching strategies and classroom context.
- Teacher and researcher co‐design of educational technology and supporting curricula has proven to be an effective way for integrating teacher insight and supporting their implementation needs.
- Co‐designing learning analytics and support technologies with teachers is difficult due to differences in design and development goals, workplace norms, and AI‐literacy and learning analytics background of teachers.
What this paper adds
- We provide a co‐design workflow for middle school teachers that centres on co‐designing and developing actionable insights to support problem‐based learning (PBL) by systematic development of responsive teaching practices using AI‐generated learning analytics.
- We adapt established human‐computer interaction (HCI) methods to tackle the complex task of classroom PBL implementation, working with experienced and novice teachers to create a learning analytics dashboard for a PBL curriculum.
- We demonstrate researcher and teacher roles and needs in ensuring co‐design collaboration and the co‐construction of actionable insight to support middle school PBL.
Implications for practice and/or policy
- Learning analytics researchers will be able to use the workflow as a tool to support their PBL co‐design processes.
- Learning analytics researchers will be able to apply adapted HCI methods for effective co‐design processes.
- Co‐design teams will be able to pre‐emptively prepare for the difficulties and needs of teachers when integrating middle school teacher feedback during the co‐design process in support of PBL technologies.