Title: What influences students’ abilities to critically evaluate scientific investigations?
Critical thinking is the process by which people make decisions about what to trust and what to do. Many undergraduate courses, such as those in biology and physics, include critical thinking as an important learning goal. Assessing critical thinking, however, is non-trivial, and recommendations for how to assess it as part of instruction are mixed. Here we evaluate the efficacy of assessment questions for probing students' critical thinking skills in the context of biology and physics. We use two research-based standardized critical thinking instruments, the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC) and the Physics Lab Inventory of Critical Thinking (PLIC). These instruments provide experimental scenarios and pose questions asking students to evaluate what to trust and what to do regarding the quality of experimental designs and data. Using more than 3000 student responses from over 20 institutions, we sought to understand what features of the assessment questions elicit student critical thinking. Specifically, we investigated (a) how students critically evaluate aspects of research studies in biology and physics when they evaluate one study at a time versus comparing and contrasting two, and (b) whether individual evaluation questions are needed to encourage students to engage in critical thinking when comparing and contrasting. We found that students are more critical when making comparisons between two studies than when evaluating each study individually. Compare-and-contrast questions are also sufficient for eliciting critical thinking, with students providing similar answers regardless of whether the individual evaluation questions are included. This research offers new insight into the types of assessment questions that elicit critical thinking at the introductory undergraduate level; specifically, we recommend that instructors incorporate more compare-and-contrast questions related to experimental design in their courses and assessments.
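As an illustration of the kind of format comparison the abstract describes, the sketch below contrasts hypothetical trust ratings collected under the two question formats. The column names, the 1-5 rating scale, and the data are placeholders for illustration, not the actual Eco-BLIC/PLIC data format or the authors' analysis.

```python
# Hedged sketch: do students rate a study's trustworthiness lower (i.e.,
# more critically) when comparing two studies than when rating one alone?
# All names and values here are illustrative placeholders.
import pandas as pd
from scipy.stats import mannwhitneyu

responses = pd.DataFrame({
    "format": ["individual"] * 4 + ["compare"] * 4,  # question format shown
    "trust_rating": [4, 5, 4, 5, 2, 3, 2, 3],        # 1 = low trust, 5 = high
})

individual = responses.loc[responses["format"] == "individual", "trust_rating"]
compare = responses.loc[responses["format"] == "compare", "trust_rating"]

# Lower ratings in the compare condition would be consistent with students
# being more critical when contrasting two studies.
stat, p = mannwhitneyu(individual, compare, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
```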
Award ID(s):
1909602
NSF-PAR ID:
10414236
Author(s) / Creator(s):
Editor(s):
Pamucar, Dragan
Date Published:
Journal Name:
PLOS ONE
Volume:
17
Issue:
8
ISSN:
1932-6203
Page Range / eLocation ID:
e0273337
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

Critical thinking, which can be defined as the evidence-based ways in which people decide what to trust and what to do, is an important competency included in many undergraduate science, technology, engineering, and mathematics (STEM) courses. To help instructors effectively measure critical thinking, we developed the Biology Lab Inventory of Critical Thinking in Ecology (Eco-BLIC), a freely available, closed-response assessment of undergraduate students' critical thinking in ecology. The Eco-BLIC includes ecology-based experimental scenarios followed by questions that measure how students decide what to trust and what to do next. Here, we present the development of the Eco-BLIC using tests of validity and reliability. Using student responses to questions and think-aloud interviews, we demonstrate the effectiveness of the Eco-BLIC at measuring students' critical thinking skills. We find that while students generally think like experts when evaluating what to trust, their responses are less expert-like when deciding what to do next.
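One common ingredient of the reliability testing mentioned above is an internal-consistency statistic such as Cronbach's alpha. The sketch below applies the standard formula to a fabricated item matrix; it is offered only as context and is not the Eco-BLIC's actual scoring or validation procedure.

```python
# Illustrative sketch of one common reliability check (Cronbach's alpha)
# for a closed-response instrument. The item matrix is fabricated.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_students, n_items) matrix of scored responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

scores = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [1, 1, 0, 0],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```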

     
2. There is a critical need for more students in engineering and computer science majors to enter into, persist in, and graduate from four-year postsecondary institutions. Increasing the diversity of the workforce through inclusive practices in engineering and science is also a well-documented need. According to national statistics, the largest groups of underrepresented minority students in engineering and science attend U.S. public higher education institutions. A large proportion of these students come to colleges and universities with unique challenges and needs, and are more likely to be the first in their family to attend college. In response to these needs, engineering education researchers and practitioners have developed, implemented, and assessed interventions to provide support and help students succeed in college, particularly in their first year. These interventions typically target relatively small cohorts of students and can be managed by a small number of faculty and staff. In this paper, we report on "work in progress" research on a large-scale, first-year engineering and computer science intervention program at a public, comprehensive university, using multivariate comparative statistical approaches. Large-scale intervention programs are especially relevant to minority-serving institutions that prepare growing numbers of students who are the first in their family to attend college and who are also financially under-resourced. These students most often encounter academic difficulties and come to higher education with challenging experiences and backgrounds. The first-year intervention program under study, first piloted in 2015, is now in its fifth year of implementation. Its components include: (a) first-year block schedules; (b) project-based introductory engineering and computer science courses; (c) an introduction-to-mechanics course, which provides students with the foundation needed to succeed in a traditional physics sequence; and (d) peer-led supplemental instruction workshops for calculus, physics, and chemistry courses. This intervention study responds to three research questions: (1) What role do the first-year intervention's components play in students' persistence in engineering and computer science majors across undergraduate program years? (2) What role do particular pedagogical and cocurricular support structures play in students' successes? (3) What role do various student socio-demographic and experiential factors play in the effectiveness of first-year interventions? To address these research questions and determine the formative impact of the first-year engineering and computer science program on which we are conducting research, we collected diverse student data, including grade point averages, concept inventory scores, responses to a multi-dimensional questionnaire that measures students' use of support practices across their four to five years in their degree program, and the background information necessary to determine the impact of such factors on students' persistence to degree. Background data include students' experiences prior to enrolling in college, their socio-demographic characteristics, and their college social capital throughout their higher education experience. For this research, we compared students who were enrolled in the first-year intervention program to those who were not.
We have engaged in cross-sectional data collection from students' freshman through senior years and employed multivariate statistical analytical techniques on the collected student data. Results of these analyses were varied. In terms of backgrounds, our research indicates that students' parental education is positively related to their success in engineering and computer science across program years. Likewise, longitudinally (across program years), students' college social capital predicted their academic success and persistence to degree. With regard to the study's comparative research on the first-year intervention, our results indicate that students who were enrolled in the first-year intervention program as freshmen continued to use more support practices to assist their academic success across their degree matriculation, compared to students who were not in the first-year program. This suggests that the students continued to recognize the value of such supports as a consequence of having been required to use them as first-year students. In terms of students' understanding of scientific or engineering-focused concepts, we found a significant impact of academically focused support practices. We also found that enrolling in the first-year intervention was a significant predictor of the time students spent preparing for classes and, ultimately, of their grade point average, especially in STEM subjects, across their years in college. In summary, we found that the studied first-year intervention program has longitudinal, positive impacts on students' success as they navigate their undergraduate experiences toward engineering and computer science degrees.
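A minimal sketch of the kind of multivariate model such an analysis might use, here a logistic regression predicting persistence from parental education, college social capital, and intervention enrollment. The variable names and data are hypothetical placeholders, not the study's actual dataset or model specification.

```python
# Hedged sketch: logistic regression of persistence on background and
# intervention variables. All columns and values are invented examples.
import pandas as pd
from sklearn.linear_model import LogisticRegression

students = pd.DataFrame({
    "parent_college":  [1, 0, 0, 1, 1, 0, 1, 0],  # parental education (binary)
    "social_capital":  [3.2, 2.1, 2.8, 3.9, 3.5, 1.8, 3.0, 2.4],  # survey scale
    "in_intervention": [1, 1, 0, 1, 0, 0, 1, 0],  # first-year program cohort
    "persisted":       [1, 1, 0, 1, 1, 0, 1, 0],  # retained in major
})

X = students[["parent_college", "social_capital", "in_intervention"]]
y = students["persisted"]

model = LogisticRegression().fit(X, y)
for name, coef in zip(X.columns, model.coef_[0]):
    print(f"{name}: {coef:+.2f}")  # sign suggests direction of association
```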
3.
Research-based assessments (RBAs; e.g., the Force Concept Inventory) that measure student content knowledge, attitudes, or identities have played a major role in transforming physics teaching practices. RBAs offer instructors a standardized method for empirically investigating the efficacy of their instructional practices and documenting the impacts of course transformations. Unlike course exams, the common usage of standardized RBAs across institutions uniquely supports instructors in comparing their student outcomes over time or against multi-institutional data sets. While the number of RBAs and RBA-using instructors has increased over the last three decades, barriers to administering RBAs keep many physics instructors from using them [1,2]. To mitigate these barriers, we have created full-service online RBA platforms (i.e., the Learning About STEM Student Outcomes [LASSO] [3], Colorado Learning Attitudes About Science Survey for Experimental Physics [E-CLASS] [4], and Physics Lab Inventory of Critical Thinking [PLIC] [5] platforms) that host, administer, score, and analyze RBAs. These web-based platforms can make it easier for instructors to use RBAs, especially as many courses have been forced to transition to online instruction. We hope that this editorial can serve as a guide for instructors considering administering RBAs online. In what follows, we examine common barriers to using RBAs, how online administration can remove those barriers, and the research into online administration of RBAs. In the supplementary material [6], we also include a practical how-to for administering RBAs online and sample student email wording.
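For context on what scoring a pre/post RBA typically involves, the sketch below computes Hake's normalized gain, g = (post − pre) / (100 − pre), a summary statistic commonly reported for instruments such as the FCI. The scores are invented, and this is not necessarily the exact analysis pipeline of the platforms named above.

```python
# Hedged sketch: Hake's normalized gain over invented pre/post scores.
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    """Fraction of the available improvement actually achieved."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

course = [(45.0, 70.0), (60.0, 75.0), (30.0, 55.0)]  # (pre %, post %) pairs

gains = [normalized_gain(pre, post) for pre, post in course]
print(f"mean normalized gain = {sum(gains) / len(gains):.2f}")
```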
4. Boone, E.; Thuecks, S. (Eds.)
Recent calls for increased inclusion in and access to authentic course-based research have been building on the momentum of support for Course-Based Undergraduate Research Experiences (CUREs). However, these courses can be very challenging to implement at scale or with limited resources. To equitably provide these critical science process skills to the largest possible cohort of students, we have developed a new student research project within our first-year biology lab. Our student team research project is integrated throughout the semester, building authentic science process skills from start to finish. Students start from a research idea, develop a multi-site experimental design, carry out hands-on data collection at home, analyze quantitative data, and present their findings in a conference-style format. We have also embedded structured time for building collaborative skills. This novel change to our lab curriculum runs online, hybrid, or face-to-face; it requires no lab budget; and it has been well received in multiple offerings of our course of ~200-600 students. It has also allowed us to improve our assessments: we evaluate writing (graphical abstracts) and/or oral presentation skills. Further, our lab exam can now be more cognitively challenging because the new curriculum better prepares students to analyze, evaluate, and synthesize. This article demonstrates that we can reduce barriers to doing authentic research at scale in introductory courses, and we include all materials needed to adapt this project to your own context.