Abstract

Critical thinking, which can be defined as the evidence-based ways in which people decide what to trust and what to do, is an important competency included in many undergraduate science, technology, engineering, and mathematics (STEM) courses. To help instructors effectively measure critical thinking, we developed the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC), a freely available, closed-response assessment of undergraduate students' critical thinking in ecology. The Eco-BLIC includes ecology-based experimental scenarios followed by questions that measure how students decide what to trust and what to do next. Here, we present the development of the Eco-BLIC using tests of validity and reliability. Using student responses to questions and think-aloud interviews, we demonstrate the effectiveness of the Eco-BLIC at measuring students' critical thinking skills. We find that although students generally think like experts when evaluating what to trust, their responses are less expert-like when deciding what to do next.
                            What influences students’ abilities to critically evaluate scientific investigations?
                        
                    
    
Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, with mixed recommendations for how to assess critical thinking as part of instruction. Here we evaluate the efficacy of assessment questions to probe students' critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments, the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and the Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they evaluate one study at a time versus comparing and contrasting two studies, and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Also, compare-and-contrast questions are sufficient for eliciting critical thinking, with students providing similar answers regardless of whether the individual evaluation questions are included. This research offers new insight into the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend that instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.
- Award ID(s): 1909602
- PAR ID: 10414236
- Editor(s): Pamucar, Dragan
- Date Published:
- Journal Name: PLOS ONE
- Volume: 17
- Issue: 8
- ISSN: 1932-6203
- Page Range / eLocation ID: e0273337
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
This study investigated whether and how Learning Assistant (LA) support is linked to student outcomes in physics courses nationwide. Paired student concept inventory scores were collected over three semesters from 3,753 students, representing 69 courses and 40 instructors from 17 LA Alliance member institutions. Each participating student completed an online concept inventory at the beginning (pre) and end (post) of each term. The physics concept inventories tested included the Force Concept Inventory (FCI), the Conceptual Survey of Electricity and Magnetism (CSEM), the Force and Motion Conceptual Evaluation (FMCE), and the Brief Electricity and Magnetism Assessment (BEMA). Across instruments, Cohen's d effect sizes were, on average, 1.4 times higher for LA-supported courses than for courses without LA support. Preliminary findings indicate that LA support may be most effective when used in laboratory settings (effect sizes 1.9 times those of courses without LA support), compared with lectures (1.4 times), recitations (1.5 times), or unknown uses (1.3 times). Additional research will inform LA-implementation best practices across disciplines.
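The headline numbers above are ratios of Cohen's d effect sizes computed from paired pre/post inventory scores. As a rough, hedged illustration of the metric only (invented scores, and one common convention of several; the study's actual analysis code is not given in the abstract):

```python
import numpy as np

def cohens_d_paired(pre: np.ndarray, post: np.ndarray) -> float:
    """Cohen's d for paired pre/post scores: mean gain divided by the
    standard deviation of the gains (one common convention; pooled
    pre/post standard deviations are another)."""
    gains = post - pre
    return gains.mean() / gains.std(ddof=1)

# Hypothetical FCI-style scores (fraction correct) for one small section.
pre = np.array([0.35, 0.40, 0.30, 0.45, 0.38])
post = np.array([0.55, 0.60, 0.42, 0.70, 0.50])
print(f"Cohen's d = {cohens_d_paired(pre, post):.2f}")
```

Comparing course formats then reduces to comparing these d values, e.g., the mean d of LA-supported courses divided by the mean d of unsupported ones.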
Learning standards for biology courses have called for increasing statistics content. Little is known, however, about biology students' attitudes toward statistics content and what students actually learn about statistics in these courses. This study aims to uncover changes in students' attitudes and statistics content knowledge in biology courses. One hundred thirty-four introductory biology students, taught by five different instructors, participated in a pre-post study of statistical thinking and attitudes toward statistics. Students performed better on the statistics concept inventory at the end of a biology course than at the beginning, while their attitudes showed no change. These preliminary results suggest it may be important to lay a conceptual foundation in statistics before students take biology courses that include little formal statistical instruction.
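The abstract does not specify how the pre/post difference was tested; a paired comparison on matched scores is one standard choice. A minimal sketch under that assumption, with invented scores:

```python
import numpy as np
from scipy import stats

# Hypothetical statistics concept inventory scores (percent correct)
# for the same eight students at the start and end of a biology course.
pre = np.array([52, 48, 61, 55, 47, 59, 50, 63])
post = np.array([58, 55, 64, 60, 51, 66, 54, 70])

# Paired t-test: each student serves as their own control.
t_stat, p_value = stats.ttest_rel(post, pre)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```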
With an increasing focus in STEM education on critical thinking skills, science writing plays an ever more important role. A recently published dataset of two sets of college-level lab reports from an inquiry-based physics curriculum relies on analytic assessment rubrics with multiple dimensions, specifying subject matter knowledge and general components of good explanations. Each analytic dimension is assessed on a 6-point scale to provide detailed feedback that can help students improve their science writing skills. Manual assessment can be slow and difficult to calibrate for consistency across all students in large-enrollment courses with many sections. While much work exists on automated assessment of open-ended questions in STEM subjects, there has been far less work on long-form writing such as lab reports. We present VerAs, an end-to-end neural architecture with separate verifier and assessment modules, inspired by approaches to Open Domain Question Answering (OpenQA). VerAs first verifies whether a report contains any content relevant to a given rubric dimension and, if so, assesses the relevant sentences. On the lab reports, VerAs outperforms multiple baselines based on OpenQA systems or Automated Essay Scoring (AES); it also performs well on an analytic rubric for middle school physics essays.
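To make the verify-then-assess design concrete, here is a schematic sketch of the two-stage pipeline. The keyword-based verifier and length-based scorer are crude placeholders for VerAs's trained neural modules (which the abstract does not detail), and all names and data are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class RubricDimension:
    name: str
    keywords: list[str]  # placeholder for a learned relevance model

def verify(sentences: list[str], dim: RubricDimension) -> list[str]:
    """Stage 1: keep only sentences relevant to the rubric dimension.
    VerAs uses a trained verifier; keyword matching stands in here."""
    return [s for s in sentences if any(k in s.lower() for k in dim.keywords)]

def assess(relevant: list[str]) -> int:
    """Stage 2: map the relevant sentences to a 1-6 rubric score.
    A real assessor would be a trained model; here the score simply
    grows with the amount of relevant content."""
    if not relevant:
        return 1  # verifier found nothing relevant to this dimension
    return min(6, 1 + len(relevant))

report = [
    "We measured the pendulum period for five string lengths.",
    "The timing uncertainty was estimated from repeated trials.",
    "The data support a square-root dependence of period on length.",
]
dim = RubricDimension("uncertainty analysis", ["uncertainty", "error", "trials"])
print(f"Score for '{dim.name}': {assess(verify(report, dim))}")
```

The verify stage matters because most sentences in a long report are irrelevant to any single rubric dimension; filtering first keeps the assessor focused, which is the intuition VerAs borrows from OpenQA retrieve-then-read pipelines.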
Wright, L. Kate (Ed.). Undergraduate genetics courses have historically focused on simple genetic models rather than taking a more multifactorial approach in which students explore how traits are influenced by a combination of genes, the environment, and gene-by-environment interactions. While simple genetic models provide straightforward examples that promote student learning, they do not match current scientific understanding and can foster deterministic thinking among students. In addition, undergraduates are often interested in complex human traits that are influenced by the environment, and national curriculum standards include learning objectives that focus on multifactorial concepts. This research aims to determine the extent to which multifactorial genetics is currently assessed in undergraduate genetics courses. To address this, we analyzed over 1,000 assessment questions from a commonly used undergraduate genetics textbook; published concept assessments; and open-source, peer-reviewed curriculum materials. Our findings show that current genetics assessment questions overwhelmingly emphasize the impact of genes on phenotypes and rarely address the effect of the environment. These results indicate a need to include more multifactorial genetics concepts, and we suggest ways to introduce them into undergraduate courses.