Title: Individual variation in undergraduate student metacognitive monitoring and error detection during biology model evaluation
Introduction: Models are a primary mode of science communication, and preparing university students to evaluate models will allow them to better construct models and predict phenomena. Model evaluation relies on students’ subject-specific knowledge, perception of model characteristics, and confidence in their knowledge structures.
Methods: Fifty first-year college biology students evaluated models of concepts from varying biology subject areas, with and without intentionally introduced errors. Students responded with ‘error’ or ‘no error’ and indicated whether they were ‘confident’ or ‘not confident’ in their response.
Results: Overall, students accurately evaluated 65% of models and were confident in 67% of their responses. Students were more likely to respond accurately when models were drawn or schematic (as opposed to a box-and-arrow format), when models had no intentional errors, and when they expressed confidence. Subject area did not affect the accuracy of responses.
Discussion: Variation in response patterns to specific models reflects variation in model evaluation abilities and suggests ways that pedagogy can support student metacognitive monitoring during model-based reasoning. Error detection is a necessary step toward modeling competence that will facilitate student evaluation of scientific models and support their transition from novice to expert scientists.
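To make the design concrete, the sketch below shows the general shape of a response-level analysis consistent with the reported findings. It is an illustration only, not the authors’ analysis code; the file name and column names (model_format, has_error, confident, accurate) are assumptions about a hypothetical long-format dataset with one row per student-by-model evaluation.

```python
# Minimal sketch (not the published analysis): relate response accuracy to model
# format, intentional-error status, and self-reported confidence.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical columns: student_id, model_format ('drawn'/'schematic'/'box_arrow'),
# has_error (0/1), confident (0/1), accurate (0/1)
df = pd.read_csv("model_evaluations.csv")  # assumed file name

# Descriptive rates of the kind reported in the abstract (~65% accurate, ~67% confident)
print("Accuracy rate:  ", df["accurate"].mean())
print("Confidence rate:", df["confident"].mean())

# Logistic regression of accuracy on model characteristics and confidence.
# The published analysis may differ (e.g., mixed-effects models with a random
# intercept per student); this is only an illustrative baseline.
fit = smf.logit("accurate ~ C(model_format) + has_error + confident", data=df).fit()
print(fit.summary())
```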
Award ID(s):
2000549
PAR ID:
10545868
Author(s) / Creator(s):
; ; ; ; ;
Publisher / Repository:
Frontiers in Education
Date Published:
Journal Name:
Frontiers in Education
Volume:
9
ISSN:
2504-284X
Subject(s) / Keyword(s):
model-based learning, modeling, conceptual models, metacognition, confidence
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract
Background: In college science laboratory and discussion sections, student-centered active learning strategies have been implemented to improve student learning outcomes and experiences. Research has shown that active learning activities can increase student anxiety if students fear being negatively evaluated by their peers. Error framing (i.e., framing errors as natural and beneficial to learning) is proposed in the literature as a pedagogical tool to reduce student anxiety. However, little research empirically explores how an instructor can operationalize error framing or how error framing is perceived by undergraduate students. To bridge this gap in the literature, we conducted a two-stage study that involved science graduate teaching assistants (GTAs) and undergraduate students. In stage one, we introduced cold calling (i.e., calling on non-volunteering students) and error framing to 12 chemistry and 11 physics GTAs. Cold calling can increase student participation but may increase student anxiety. Error framing has the potential to mitigate student anxiety when paired with cold calling. GTAs were then tasked with rehearsing cold calling paired with error framing in a mixed-reality classroom simulator. We identified GTA statements that aligned with the definition of error framing. In stage two, we selected a few example GTA error framing statements and interviewed 13 undergraduate students about their perceptions of those statements.
Results: In the simulator, all the GTAs rehearsed cold calling multiple times, while only a few GTAs made error framing statements. A thematic analysis of GTAs’ error framing statements identified ways of indicating errors (explicit and implicit) and of framing them (as natural, as beneficial, and with positive acknowledgement). Undergraduate student interviews revealed specific framing and tone that are perceived as increasing or decreasing student comfort in participating in classroom discourse. Both undergraduate students and some GTAs expressed negative opinions toward responses that explicitly indicate student mistakes. Undergraduate students’ perspectives also suggest that error framing should be implemented differently depending on whether errors have already occurred.
Conclusion: Error framing is challenging for science GTAs to implement. GTAs’ operationalizations of error framing in the simulator and undergraduate students’ perceptions contribute to defining and operationalizing error framing for instructional practice. To increase undergraduate student comfort in science classroom discourse, GTAs can use implicit error indication. In response to students’ incorrect answers, GTAs can positively frame students’ specific ideas rather than discussing broadly how errors are natural or beneficial.
  2. Abstract
Background: Teachers often rely on open-ended questions to assess students' conceptual understanding of assigned content. Particularly in mathematics, teachers use these questions to gain insight into the processes and strategies students adopt in solving problems, beyond what is possible with more close-ended problem types. While these problems are valuable to teachers, the variation in student responses makes them difficult and time-consuming to evaluate and to provide directed feedback on. It is well established that feedback, both as a numeric score and, more importantly, as teacher-authored comments, can guide students toward improvement, leading to increased learning. For this reason, teachers need better support not only for assessing students' work but also for providing meaningful and directed feedback to students.
Objectives: In this paper, we seek to develop, evaluate, and examine machine learning models that support automated open-response assessment and feedback.
Methods: We build upon prior research in the automatic assessment of student responses to open-ended problems and introduce a novel approach that leverages student log data combined with machine learning and natural language processing methods. Utilizing sentence-level semantic representations of student responses to open-ended questions, we propose a collaborative filtering-based approach to both predict student scores and recommend appropriate feedback messages for teachers to send to their students.
Results and Conclusion: We find that our method outperforms previously published benchmarks across three different metrics for the task of predicting student performance. Through an error analysis, we identify several areas where future work may improve upon our approach.
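The abstract does not spell out the collaborative-filtering model, so the sketch below is a simplified neighborhood-based stand-in for the general idea: embed responses at the sentence level and predict a new response's score from the scores of its most semantically similar, previously graded responses. The embedding model, library choices, and example data are all assumptions for illustration.

```python
# Illustrative sketch only: similarity-weighted score prediction over
# sentence-level embeddings of student responses to an open-ended question.
import numpy as np
from sentence_transformers import SentenceTransformer  # assumed embedding backend
from sklearn.metrics.pairwise import cosine_similarity

def predict_score(new_response, graded_responses, graded_scores, k=5):
    """Predict a score for new_response from its k nearest graded neighbors."""
    model = SentenceTransformer("all-MiniLM-L6-v2")     # assumed pretrained model
    ref_emb = model.encode(graded_responses)            # (n, d) embeddings
    new_emb = model.encode([new_response])               # (1, d) embedding
    sims = cosine_similarity(new_emb, ref_emb)[0]        # similarity to each graded answer
    top = np.argsort(sims)[::-1][:k]                     # indices of k most similar responses
    weights = sims[top] / sims[top].sum()                # similarity-weighted average
    return float(np.dot(weights, np.asarray(graded_scores)[top]))

# Toy usage: feedback recommendation could work analogously by returning the
# teacher comments attached to the nearest graded neighbors.
graded = ["The slope is rise over run.", "You multiply both sides by 2."]
scores = [4, 2]
print(predict_score("Slope means change in y over change in x.", graded, scores, k=2))
```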
  3. Background: Increasingly, college science courses are transitioning from a traditional lecture format to active learning because students learn more and fail less frequently when they engage in their learning through activities and discussions in class. Fear of negative evaluation (FNE), defined as a student’s sense of dread associated with being unfavorably evaluated while participating in a social situation, discourages undergraduates from participating in small group discussions, whole class discussions, and one-on-one conversations with instructors.
Objective: This study aims to evaluate the acceptability of a novel digital single-session intervention and to assess the feasibility of implementing it in a large-enrollment college science course taught with active learning.
Methods: To equip undergraduates with skills to cope with FNE and bolster their confidence, clinical psychologists and biology education researchers developed Project Engage, a digital, self-guided single-session intervention that teaches college students strategies for coping with FNE. Project Engage provides biologically informed psychoeducation, uses interactive elements for engagement, and helps generate a personalized action plan. We conducted a two-armed randomized controlled trial to evaluate the acceptability and preliminary effectiveness of Project Engage compared with an active control condition that provides information on resources available on the college campus.
Results: In a study of 282 upper-level physiology students, participants randomized to complete Project Engage reported a greater increase in overall confidence in engaging in small group discussions (P=.01) and whole class discussions (P<.001), but not in one-on-one interactions with instructors (P=.05), from baseline to immediately post-intervention, compared with participants in the active control condition. Project Engage received a good acceptability rating (1.22 on a scale of –2 to +2) and had a high completion rate (>97%).
Conclusions: This study provides a foundation for a freely available, easily accessible intervention to bolster student confidence for contributing in class.
Trial Registration: OSF Registries osf.io/4ca68 (http://osf.io/4ca68)
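For readers unfamiliar with this kind of two-arm comparison, the sketch below shows one minimal way to contrast baseline-to-post change in confidence between intervention and control groups. It is not the study's analysis code; the file name, column names, and test choice are assumptions, and the published analysis may use a different model (e.g., ANCOVA or mixed models).

```python
# Illustrative sketch: compare change in self-reported confidence between arms,
# assuming a hypothetical table with columns 'arm', 'confidence_pre', 'confidence_post'.
import pandas as pd
from scipy import stats

df = pd.read_csv("project_engage_outcomes.csv")        # assumed file name
df["change"] = df["confidence_post"] - df["confidence_pre"]

treated = df.loc[df["arm"] == "intervention", "change"]
control = df.loc[df["arm"] == "control", "change"]

# Welch's t-test on change scores (a minimal check, not the published model).
t, p = stats.ttest_ind(treated, control, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.3f}")
```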
  4. Abstract
Background: Metacognitive processes have been linked to the development of conceptual knowledge in STEM courses, but previous work has centered on the regulatory aspects of metacognition.
Purpose: We interrogated the relationship between epistemic metacognition and conceptual knowledge in engineering statics courses across six universities by asking students a difficult concept question with concurrent reflection prompts that elicited their metacognitive thinking.
Method: We used a mixed-methods design containing an embedded phase followed by an explanatory phase. This design allowed us to both prompt and measure student epistemic metacognition within the learning context. The embedded phase consisted of quantitative and qualitative analyses of student responses. The explanatory phase consisted of an analysis of six instructor interviews.
Results: Analysis of 267 student responses showed greater variation in students' epistemic metacognition than in their ability to answer correctly. Students used different kinds of epistemic metacognitive resources about the nature and origin of knowledge, epistemological forms, epistemological activities, and stances toward knowledge. These resources generally assembled into one of two frames: a constructed knowledge framing valuing conceptual knowledge and sense-making, and an authoritative knowledge framing foregrounding numerical, algorithmic problem-solving. All six instructors interviewed described resources that align with both frames, and none explicitly considered student epistemic metacognition.
Conclusions: Instructors' explicit attention to epistemic metacognition can potentially shift students to more productive frames for engineering learning. Findings here also inform two broader issues in STEM instruction: student resistance to active learning, and the direct instruction versus inquiry-based learning debate.
  5. Wright, L Kate (Ed.)
    ABSTRACT: Quantitative reasoning is a critical skill in biology and has been highlighted as a core competency by Vision and Change. Despite its importance, students often struggle to apply mathematical skills in new contexts in biology, a process called transfer of knowledge. However, the supports and barriers that students perceive for this process remain unclear. To explore this further, we interviewed undergraduate students in an introductory biology lab course about how they understand and report the transfer of quantitative skills in these courses. We then applied these themes to the Step Back, Translate, and Extend (SBTE) framework to examine student perceptions of the supports and barriers to their knowledge transfer. Students reported different supports and barriers at each level of the transfer process. At the first step of the framework, the recognition level, students reported reflecting on previous chemistry, statistics, and physics learning as helpful cues to indicate a transfer opportunity. Others, however, reported perceiving math and science as separate subjects without overlap, causing a disconnect in their recognition of transferable knowledge. In the second level of the framework, students recall previous learning. Students reported repetition and positive dispositions toward science and math as supportive factors. In contrast, gaps of time between initial learning and new contexts and negative dispositions hindered recall ability. The final level of the SBTE framework focuses on application. Students reported being better able to apply previous learning to new contexts in the biology lab when they could relate their applied skills to “real-world” applications, external motivating factors, and future career goals. These students also reported proactively seeking outside resources to fill gaps in their understanding. Generating data in a lab setting was also mentioned by students as both a supportive factor of application when they felt confident in their answers and a hindrance to application when they felt unsure about its accuracy.