Abstract
In this study, we used Epistemic Network Analysis (ENA) to represent data generated by Natural Language Processing (NLP) analytics during an activity based on the Knowledge Integration (KI) framework. The activity features a web-based adaptive dialog about energy transfer in photosynthesis and cellular respiration. Students write an initial explanation, respond to two adaptive prompts in the dialog, and write a revised explanation. The NLP models score the KI level of the initial and revised explanations and detect the ideas in the explanations and the dialog responses. The dialog uses the detected ideas to prompt students to elaborate and refine their explanations. Participants were 196 8th-grade students at a public school in the Western United States. We used ENA to represent the idea networks at each KI score level for the revised explanations, and to analyze the idea trajectories across the initial explanation, the two dialog responses, and the final explanation. Higher KI levels were associated with more links and a higher frequency of mechanistic ideas in the ENA representations. The trajectory representations suggest that the NLP adaptive dialog helped students who started with descriptive and macroscopic ideas to add more microscopic ideas, and helped students who started with partially linked ideas to keep linking microscopic ideas to mechanistic ideas. We discuss implications for STEM teachers and researchers interested in how students build on and integrate their ideas.
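To make the representation concrete, the sketch below tallies how often pairs of detected ideas co-occur in explanations at each KI score level, which is the kind of adjacency data that ENA projects into network space. The idea labels, KI scores, and data layout are hypothetical and are not taken from the study.

```python
# A minimal sketch (not the authors' pipeline): count idea co-occurrence per KI level.
from collections import Counter
from itertools import combinations

# Hypothetical coded data: each revised explanation has a KI score and the
# set of idea codes an NLP model detected in it.
explanations = [
    {"ki": 2, "ideas": {"macroscopic", "descriptive"}},
    {"ki": 4, "ideas": {"microscopic", "mechanistic", "energy_transfer"}},
    {"ki": 4, "ideas": {"microscopic", "mechanistic"}},
]

cooccurrence = {}  # KI level -> Counter of idea pairs
for exp in explanations:
    pairs = combinations(sorted(exp["ideas"]), 2)
    cooccurrence.setdefault(exp["ki"], Counter()).update(pairs)

for ki, counter in sorted(cooccurrence.items()):
    print(f"KI level {ki}:")
    for (a, b), n in counter.most_common():
        print(f"  {a} -- {b}: {n}")
```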
Designing an Adaptive Dialogue to Promote Science Understanding
We used Natural Language Processing (NLP) to design an adaptive computer dialogue that engages students in a conversation to reflect on and revise their written explanations of a science dilemma. We study the accuracy of the NLP idea detection and analyze how 98 students aged 12-13 interacted with the dialogue as part of a Diagnostic Inventory, examining their initial and revised science explanations along with their logged responses to the dialogue. The dialogue led to a high rate of student revision compared to prior studies of adaptive guidance. The adaptive prompts encouraged students to reflect on prior experiences, consider new variables, and raise scientific questions, and students incorporated these new ideas when revising their initial explanations. We discuss how these adaptive dialogues can strengthen science instruction.
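As a minimal, hypothetical sketch of how detected ideas could drive adaptive prompting, the code below maps the set of idea categories found in a response to a follow-up prompt. The idea labels, routing rules, and prompt wording are illustrative assumptions rather than the deployed dialogue logic.

```python
# Toy routing from NLP-detected idea categories to an adaptive follow-up prompt.
PROMPTS = {
    "no_ideas": "What have you noticed about this situation in everyday life?",
    "descriptive_only": "What might be happening that you cannot see directly?",
    "partial_mechanism": "How does energy move between the parts you described?",
}

def choose_prompt(detected_ideas):
    """Pick an elaboration prompt from the ideas an NLP model detected (toy rules)."""
    if not detected_ideas:
        return PROMPTS["no_ideas"]
    if detected_ideas <= {"descriptive", "macroscopic"}:
        return PROMPTS["descriptive_only"]
    return PROMPTS["partial_mechanism"]

print(choose_prompt({"descriptive"}))                     # descriptive_only prompt
print(choose_prompt({"microscopic", "energy_transfer"}))  # partial_mechanism prompt
```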
- Award ID(s): 2101669
- PAR ID: 10330141
- Date Published: 2022
- Journal Name: Proceedings of the 16th International Conference of the Learning Sciences - ICLS 2022
- Page Range / eLocation ID: 1653-1656
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
Blikstein, P.; Van Aalst, J.; Kizito, R.; Brennan, K. (Eds.) We explored how Natural Language Processing (NLP) adaptive dialogs designed following Knowledge Integration (KI) pedagogy elicit rich student ideas about thermodynamics and contribute to productive revision. We analyzed how 619 students in grades 6-8 interacted with two rounds of adaptive dialog on an end-of-year inventory. The adaptive dialog significantly improved students' KI levels, and their revised explanations were more integrated across all grades, genders, and levels of prior thermodynamics experience. The dialog elicited many additional ideas, including normative ideas and vague reasoning. In the first round, students refined their explanations to focus on their normative ideas; in the second round, they began to elaborate their reasoning and add new normative ideas. After the dialog, students added more mechanistic ideas about conductivity, equilibrium, and the distinction between how an object feels and its temperature. Thus, adaptive dialogs are a promising tool for scaffolding science sense-making.
Natural language processing (NLP) tools can score students' written explanations, opening new opportunities for science education. Optimally, these scores give designers opportunities to align guidance with tested pedagogical frameworks and to investigate alternative ways to personalize instruction. We report on research, informed by the knowledge integration (KI) pedagogical framework, that uses online authorable and customizable environments (ACEs) to promote deep understanding of complex scientific topics. We study how to personalize guidance so that students make productive revisions to written explanations during instruction in which they conduct investigations with models, simulations, hands-on activities, and other materials. We describe how we iteratively refined our assessments and guidance to support students in revising their scientific explanations, and we report on recent investigations of hybrid models of personalized guidance that combine NLP scoring with opportunities for teachers to continue the conversation.
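A minimal sketch of the hybrid-guidance idea described above, assuming a 1-5 KI rubric: an automated score triggers immediate guidance, and lower-scoring responses are also flagged for the teacher to continue the conversation. The thresholds, messages, and flagging rule are illustrative, not the study's design.

```python
# Toy hybrid routing: automated guidance from a KI score plus a teacher flag.
from dataclasses import dataclass

@dataclass
class GuidanceDecision:
    automated_message: str
    flag_for_teacher: bool

def route_guidance(ki_score):
    # Hypothetical thresholds on a 1-5 KI rubric.
    if ki_score <= 2:
        return GuidanceDecision(
            "Add an idea about what causes the change you described.", True)
    if ki_score == 3:
        return GuidanceDecision(
            "Explain how your two ideas are connected to each other.", True)
    return GuidanceDecision(
        "Strong explanation. Can you add evidence for the link you made?", False)

decision = route_guidance(3)
print(decision.automated_message, "| teacher follow-up:", decision.flag_for_teacher)
```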
Blikstein, P.; Van Aalst, J.; Kizito, R.; Brennan, K. (Eds.) This study takes advantage of advances in Natural Language Processing (NLP) to build an idea detection model that can identify ideas grounded in students' linguistic experiences. We designed adaptive, interactive dialogs for four explanation items using the NLP idea detection model and investigated whether they similarly support students from distinct language backgrounds. The curriculum, assessments, and scoring rubrics were informed by the Knowledge Integration (KI) pedagogy. We analyzed responses from 1,036 students of different language backgrounds taught by 10 teachers in five schools in the western United States. The adaptive dialog engages students from both monolingual English and multilingual backgrounds in incorporating additional relevant ideas into their explanations, resulting in a significant improvement from initial to revised explanations. The guidance supports students in both language groups in progressing to integrate their scientific ideas.
Examining the effect of automated assessments and feedback on students' written science explanations
Writing scientific explanations is a core practice in science. However, students find it difficult to write coherent scientific explanations, and teachers find it challenging to provide real-time feedback on students' essays. In this study, we discuss how PyrEval, an NLP technology, was used to automatically assess students' essays and provide feedback. We found that students explained more key ideas in their essays after the automated assessment and feedback. However, there were issues with the automated assessments, as well as with students' understanding of the feedback and their subsequent revisions.
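For illustration only, the toy sketch below checks an essay against keyword sets for a few key ideas and reports which ideas seem to be missing. It is not the PyrEval system used in the study; the key ideas and phrases are hypothetical.

```python
# Toy key-idea coverage check (a stand-in heuristic, not PyrEval).
KEY_IDEAS = {
    "energy source": {"sunlight", "light energy", "the sun"},
    "energy transformation": {"chemical energy", "glucose", "converted"},
}

def missing_ideas(essay):
    """Return key ideas with no matching phrase in the essay (keyword heuristic)."""
    text = essay.lower()
    return [idea for idea, phrases in KEY_IDEAS.items()
            if not any(phrase in text for phrase in phrases)]

essay = "Plants take in sunlight and use it to grow."
for idea in missing_ideas(essay):
    print(f"Consider adding an idea about: {idea}")
```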