Abstract

Background: In college science laboratory and discussion sections, student-centered active learning strategies have been implemented to improve student learning outcomes and experiences. Research has shown that active learning activities can increase student anxiety if students fear being negatively evaluated by their peers. Error framing (i.e., framing errors as natural and beneficial to learning) is proposed in the literature as a pedagogical tool to reduce student anxiety. However, little research empirically explores how an instructor can operationalize error framing or how error framing is perceived by undergraduate students. To bridge this gap in the literature, we conducted a two-stage study involving science graduate teaching assistants (GTAs) and undergraduate students. In stage one, we introduced cold calling (i.e., calling on non-volunteering students) and error framing to 12 chemistry and 11 physics GTAs. Cold calling can increase student participation but may increase student anxiety; error framing has the potential to mitigate that anxiety when paired with cold calling. GTAs were then tasked with rehearsing cold calling paired with error framing in a mixed-reality classroom simulator, and we identified GTA statements that aligned with the definition of error framing. In stage two, we selected a few example GTA error framing statements and interviewed 13 undergraduate students about their perceptions of those statements.

Results: In the simulator, all the GTAs rehearsed cold calling multiple times, while only a few GTAs made error framing statements. A thematic analysis of GTAs' error framing statements identified ways of indicating errors (explicit and implicit) and of framing them (natural, beneficial, and positive acknowledgement). Undergraduate student interviews revealed specific framing and tone that are perceived as increasing or decreasing student comfort in participating in classroom discourse. Both undergraduate students and some GTAs expressed negative opinions toward responses that explicitly indicate student mistakes. Undergraduate students' perspectives also suggest that error framing should be implemented differently depending on whether errors have already occurred.

Conclusion: Error framing is challenging for science GTAs to implement. GTAs' operationalizations of error framing in the simulator, together with undergraduate students' perceptions, contribute to defining and operationalizing error framing for instructional practice. To increase undergraduate student comfort in science classroom discourse, GTAs can use implicit error indication. In response to students' incorrect answers, GTAs can positively frame students' specific ideas rather than discussing broadly how errors are natural or beneficial.
Individual variation in undergraduate student metacognitive monitoring and error detection during biology model evaluation
Introduction: Models are a primary mode of science communication, and preparing university students to evaluate models will allow them to better construct models and predict phenomena. Model evaluation relies on students' subject-specific knowledge, perception of model characteristics, and confidence in their knowledge structures.

Methods: Fifty first-year college biology students evaluated models of concepts from varying biology subject areas, with and without intentionally introduced errors. Students responded with 'error' or 'no error' and rated themselves 'confident' or 'not confident' in each response.

Results: Overall, students accurately evaluated 65% of models and were confident in 67% of their responses. Students were more likely to respond accurately when models were drawn or schematic (as opposed to a box-and-arrow format), when models had no intentional errors, and when they expressed confidence. Subject area did not affect the accuracy of responses.

Discussion: Variation in response patterns to specific models reflects variation in model evaluation abilities and suggests ways that pedagogy can support student metacognitive monitoring during model-based reasoning. Error detection is a necessary step towards modeling competence that will facilitate student evaluation of scientific models and support their transition from novice to expert scientists.
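The scoring logic behind this design is simple enough to sketch. Below is an illustrative Python snippet, not the authors' analysis code: it scores error/no-error judgments against the known ground truth and splits accuracy by confidence, which is the crude signature of metacognitive monitoring the study examines. The field names and the tiny inline dataset are hypothetical.

```python
# Illustrative sketch (not the study's code): score binary error-detection
# responses and check whether confident responses are more often accurate.
from collections import defaultdict

# Hypothetical trial records: model format, whether an error was introduced,
# and the student's judgment plus confidence rating.
trials = [
    {"format": "drawn",         "has_error": True,  "response": "error",    "confident": True},
    {"format": "box_and_arrow", "has_error": False, "response": "error",    "confident": False},
    {"format": "schematic",     "has_error": True,  "response": "no_error", "confident": True},
    {"format": "drawn",         "has_error": False, "response": "no_error", "confident": True},
]

def is_accurate(trial):
    # The judgment is accurate when it matches the ground truth.
    return (trial["response"] == "error") == trial["has_error"]

# Overall accuracy and confidence rate (the study reports 65% and 67%).
acc = sum(is_accurate(t) for t in trials) / len(trials)
conf = sum(t["confident"] for t in trials) / len(trials)
print(f"accuracy={acc:.2f}, confident={conf:.2f}")

# Accuracy split by confidence: accurate-when-confident suggests
# well-calibrated metacognitive monitoring.
by_conf = defaultdict(list)
for t in trials:
    by_conf[t["confident"]].append(is_accurate(t))
for level, results in by_conf.items():
    print(f"confident={level}: accuracy={sum(results)/len(results):.2f}")
```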
- Award ID(s): 2000549
- PAR ID: 10545868
- Publisher / Repository: Frontiers in Education
- Date Published:
- Journal Name: Frontiers in Education
- Volume: 9
- ISSN: 2504-284X
- Subject(s) / Keyword(s): model-based learning; modeling; conceptual models; metacognition; confidence
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Abstract. Background: Teachers often rely on open-ended questions to assess students' conceptual understanding of assigned content. Particularly in the context of mathematics, teachers use these types of questions to gain insight into the processes and strategies adopted by students in solving mathematical problems, beyond what is possible through more close-ended problem types. While these problems are valuable to teachers, the variation in student responses makes them difficult and time-consuming to evaluate and to provide directed feedback on. Feedback, both as a numeric score and, more importantly, as teacher-authored comments, is well documented to help guide students on how to improve, leading to increased learning. It is for this reason that teachers need better support not only for assessing students' work but also for providing meaningful and directed feedback to students. Objectives: In this paper, we seek to develop, evaluate, and examine machine learning models that support automated open response assessment and feedback. Methods: We build upon prior research in the automatic assessment of student responses to open-ended problems and introduce a novel approach that leverages student log data combined with machine learning and natural language processing methods. Utilizing sentence-level semantic representations of student responses to open-ended questions, we propose a collaborative filtering-based approach to both predict student scores and recommend appropriate feedback messages for teachers to send to their students. Results and Conclusion: We find that our method outperforms previously published benchmarks across three different metrics for the task of predicting student performance. Through an error analysis, we identify several areas where future work may improve upon our approach.
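As a rough sketch of the kind of pipeline this abstract describes, the following Python snippet predicts a score for a new open response as a similarity-weighted average over its nearest already-scored neighbors, and reuses the feedback message attached to the most similar one. The paper uses sentence-level semantic representations; TF-IDF stands in here so the sketch runs without a pretrained model, and all responses, scores, and feedback messages are invented.

```python
# Minimal neighborhood-based collaborative filtering sketch (not the
# authors' code): score new responses by similarity to scored ones.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical (response text, teacher score, teacher feedback) triples.
scored = [
    ("I divided both sides by 2 to isolate x", 4, "Good -- say why dividing keeps equality."),
    ("x is 3 because I guessed and checked",   2, "Guess-and-check works; show the check step."),
    ("subtract 5 then divide by 2, x = 3",     5, "Clear two-step solution."),
]
new_response = "first I subtracted 5 from both sides, then divided by 2"

# Embed all responses in a shared vector space (TF-IDF as a stand-in
# for the learned sentence embeddings used in the paper).
texts = [r for r, _, _ in scored] + [new_response]
vectors = TfidfVectorizer().fit_transform(texts)
sims = cosine_similarity(vectors[-1], vectors[:-1]).ravel()

# Predicted score: similarity-weighted average over the k nearest
# already-scored responses.
k = 2
nearest = np.argsort(sims)[::-1][:k]
weights = sims[nearest] / sims[nearest].sum()
pred_score = float(np.dot(weights, [scored[i][1] for i in nearest]))

# Recommended feedback: reuse the message from the single most
# similar scored response.
suggested = scored[int(nearest[0])][2]
print(f"predicted score ~ {pred_score:.1f}; suggested feedback: {suggested}")
```

In the paper's setting the similarity space would come from learned sentence embeddings and the neighborhood would span many students' responses to the same problem; the structure of the prediction, though, is the same nearest-neighbor aggregation shown above.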
- Background: Increasingly, college science courses are transitioning from a traditional lecture format to active learning, because students learn more and fail less frequently when they engage with their learning through in-class activities and discussions. Fear of negative evaluation (FNE), defined as a student's sense of dread associated with being unfavorably evaluated while participating in a social situation, discourages undergraduates from participating in small group discussions, whole class discussions, and one-on-one conversations with instructors. Objective: This study aims to evaluate the acceptability of a novel digital single-session intervention and to assess the feasibility of implementing it in a large-enrollment college science course taught with active learning. Methods: To equip undergraduates with skills to cope with FNE and bolster their confidence, clinical psychologists and biology education researchers developed Project Engage, a digital, self-guided single-session intervention that teaches college students strategies for coping with FNE. Project Engage provides biologically informed psychoeducation, uses interactive elements for engagement, and helps generate a personalized action plan. We conducted a 2-armed randomized controlled trial to evaluate the acceptability and preliminary effectiveness of Project Engage compared with an active control condition that provides information on available resources on the college campus. Results: In a study of 282 upper-level physiology students, participants randomized to complete Project Engage reported a greater increase in overall confidence in engaging in small group discussions (P=.01) and whole class discussions (P<.001), but not in one-on-one interactions with instructors (P=.05), from baseline to immediately after the intervention, compared with participants in the active control condition. Project Engage received a good acceptability rating (1.22 on a scale of –2 to +2) and had a high completion rate (>97%). Conclusions: This study provides a foundation for a freely available, easily accessible intervention to bolster student confidence for contributing in class. Trial Registration: OSF Registries osf.io/4ca68, http://osf.io/4ca68
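For readers wondering what the arm comparison behind those P values might look like, here is a purely illustrative Python sketch comparing baseline-to-post change in confidence between two arms with an independent-samples t test. The data are randomly generated toy values and do not reproduce the study's numbers or its actual analysis.

```python
# Illustrative two-arm comparison sketch (not the study's analysis):
# compare pre-to-post confidence change between intervention and control.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical Likert-style confidence ratings, pre and post, per arm.
pre_tx,  post_tx  = rng.integers(2, 6, 40), rng.integers(3, 8, 40)
pre_ctl, post_ctl = rng.integers(2, 6, 40), rng.integers(2, 7, 40)

change_tx  = post_tx - pre_tx    # intervention arm change scores
change_ctl = post_ctl - pre_ctl  # active control arm change scores

# Independent-samples t test on the change scores.
t, p = stats.ttest_ind(change_tx, change_ctl)
print(f"t={t:.2f}, p={p:.3f}")
```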
- Roll, I.; McNamara, D.; Sosnovsky, S.; Luckin, R.; Dimitrova, V. (Eds.) Knowledge tracing refers to a family of methods that estimate each student's knowledge component/skill mastery level from their past responses to questions. One key limitation of most existing knowledge tracing methods is that they can only estimate an overall knowledge level per knowledge component/skill, since they analyze only the (usually binary-valued) correctness of student responses; this makes it hard to use them to diagnose specific student errors. In this paper, we extend existing knowledge tracing methods beyond correctness prediction to the task of predicting the exact option students select in multiple choice questions. We quantitatively evaluate the performance of our option tracing methods on two large-scale student response datasets. We also qualitatively evaluate their ability to identify common student errors in the form of clusters of incorrect options, across different questions, that correspond to the same error.
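A toy version of the option-tracing idea can be written in a few lines: instead of a binary correct/incorrect label, the model predicts which option a student will select, so its output distribution can flag likely wrong answers and recurring wrong options can hint at a shared misconception. The Python sketch below uses a multinomial logistic regression over two hypothetical features (a student's running fraction correct and a question identifier); it is a stand-in for, not a reproduction of, the paper's models.

```python
# Toy option-tracing sketch (not the paper's model): predict the exact
# multiple-choice option a student selects, not just correctness.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [student's running fraction correct so far, question id].
# Label: index (0-3) of the option the student actually selected.
X = np.array([
    [0.90, 0], [0.30, 0], [0.25, 0],
    [0.85, 1], [0.40, 1], [0.35, 1],
])
y = np.array([2, 1, 1, 0, 3, 3])  # option picked on each attempt

clf = LogisticRegression(max_iter=1000).fit(X, y)

# A weak student (20% correct so far) attempting question 0: the
# predicted option distribution points to the likely wrong answer,
# and the same wrong option recurring across questions suggests a
# common underlying error.
probs = clf.predict_proba([[0.20, 0]])[0]
for opt, p in zip(clf.classes_, probs):
    print(f"option {opt}: p={p:.2f}")
```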
- Hughes, Lee (Ed.) Introduction: The benefits of actively engaging students are especially relevant when teaching undergraduates about evolutionary processes and content. Examining eco-immunological data can help students overcome the naïve conception that humans are not evolving or affected by evolutionary pressures. Methods: Here, we used graphical reasoning in two evolution courses (small/honors and large/regular) to teach students about eco-immunology in humans and non-human organisms during a unit on the evolution of life-history traits. The module challenged students to (i) distinguish between immunological and evolutionary fitness, (ii) evaluate graphical data from the primary scientific literature on energy allocation and trade-offs, and (iii) integrate these proximate and ultimate processes into a more holistic understanding of ongoing human evolution. Student performance and perceptions were measured through closed and open response items; open response items were thematically analyzed to identify salient themes. Results: Student performance in the large class increased significantly on items related to fitness, energy trade-offs, and graphical reasoning, while performance in the small class increased only for items related to energy trade-offs. Student confidence in graphical reasoning, perceptions of the importance of graphical reasoning, and perceptions of the value of interdisciplinary research were high for both classes. Student narrative examples regarding confidence, perceptions of graphical reasoning, and perceptions of interdisciplinary research are presented. Discussion: We conclude that students can improve their performance in, and perceptions of, eco-immunology and graphical reasoning through an active-learning graph-reading module. Furthermore, students can be introduced to the field of immunology through their evolution courses.