Title: Development and validation of a scientific (formal) reasoning test for college students
Abstract

We present a multiple‐choice test, the Montana State University Formal Reasoning Test (FORT), to assess college students' scientific reasoning ability. The test defines scientific reasoning to be equivalent to formal operational reasoning. It contains 20 questions divided evenly among five types of problems: control of variables, hypothesis testing, correlational reasoning, proportional reasoning, and probability. The test development process included the drafting and psychometric analysis of 23 instruments related to formal operational reasoning. These instruments were administered to almost 10,000 students enrolled in introductory science courses at American universities. Questions with high discrimination were identified and assembled into an instrument that was intended to measure the reasoning ability of students across the entire spectrum of abilities in college science courses. We present four types of validity evidence for the FORT. (a) The test has a one‐dimensional psychometric structure consistent with its design. (b) Test scores in an introductory biology course had an empirical reliability of 0.82. (c) Student interviews confirmed responses to the FORT were accurate indications of student thinking. (d) A regression analysis of student learning in an introductory biology course showed that scores on the FORT predicted how well students learned one of the most challenging concepts in biology, natural selection.
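The psychometric quantities named in the abstract, reliability and item discrimination, are standard classical-test-theory computations. As an illustration only (the data below are simulated, not FORT responses, and Cronbach's alpha is a related but distinct estimate from the IRT-based empirical reliability the paper reports), a one-dimensional binary response matrix and its internal-consistency reliability can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 0/1 response matrix: 200 students x 20 items.
# A single latent ability drives every item, mimicking the
# one-dimensional structure the FORT was designed to have.
# (All names and numbers here are illustrative, not FORT data.)
ability = rng.normal(size=(200, 1))
difficulty = rng.normal(scale=0.5, size=(1, 20))
prob = 1.0 / (1.0 + np.exp(-(ability - difficulty)))
scores = (rng.random((200, 20)) < prob).astype(float)

def cronbach_alpha(x):
    """Internal-consistency reliability of a students-by-items matrix."""
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

alpha = cronbach_alpha(scores)
print(f"Cronbach's alpha: {alpha:.2f}")
```

With 20 items driven by one latent trait, the estimate typically lands near the 0.8 range the abstract reports for the FORT.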

 
NSF-PAR ID:
10459510
Author(s) / Creator(s):
Publisher / Repository:
Wiley Blackwell (John Wiley & Sons)
Date Published:
Journal Name:
Journal of Research in Science Teaching
Volume:
56
Issue:
9
ISSN:
0022-4308
Page Range / eLocation ID:
p. 1269-1284
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. There is a critical need for more students with engineering and computer science majors to enter into, persist in, and graduate from four-year postsecondary institutions. Increasing the diversity of the workforce through inclusive practices in engineering and science is also a widely recognized need. According to national statistics, the largest groups of underrepresented minority students in engineering and science attend U.S. public higher education institutions. A large proportion of these students come to colleges and universities with unique challenges and needs, and are more likely to be first in their family to attend college. In response to these needs, engineering education researchers and practitioners have developed, implemented, and assessed interventions to support students and help them succeed in college, particularly in their first year. These interventions typically target relatively small cohorts of students and can be managed by a small number of faculty and staff. In this paper, we report on “work in progress” research in a large-scale, first-year engineering and computer science intervention program at a public, comprehensive university, using multivariate comparative statistical approaches. Large-scale intervention programs are especially relevant to minority-serving institutions that prepare growing numbers of students who are first in their family to attend college and who are also financially under-resourced. These students often encounter academic difficulties and come to higher education with challenging experiences and backgrounds. The first-year intervention program we study, first piloted in 2015, is now in its 5th year of implementation. 
Its intervention components include: (a) first-year block schedules, (b) project-based introductory engineering and computer science courses, (c) an introduction to mechanics course, which provides students with the foundation needed to succeed in a traditional physics sequence, and (d) peer-led supplemental instruction workshops for calculus, physics, and chemistry courses. This intervention study responds to three research questions: (1) What role do the first-year intervention’s components play in students’ persistence in engineering and computer science majors across undergraduate program years? (2) What role do particular pedagogical and cocurricular support structures play in students’ successes? And (3) What role do various student socio-demographic and experiential factors play in the effectiveness of first-year interventions? To address these research questions, and therefore determine the formative impact of the first-year engineering and computer science program on which we are conducting research, we have collected diverse student data, including grade point averages, concept inventory scores, data from a multi-dimensional questionnaire that measures students’ use of support practices across their four to five years in their degree program, and diverse background information necessary to determine the impact of such factors on students’ persistence to degree. Background data include students’ experiences prior to enrolling in college, their socio-demographic characteristics, and their college social capital throughout their higher education experience. For this research, we compared students who were enrolled in the first-year intervention program to those who were not. We engaged in cross-sectional data collection from students’ freshman through senior years and employed multivariate statistical analytical techniques on the collected student data. Results of these analyses were interesting and diverse. 
Generally, in terms of backgrounds, our research indicates that students’ parental education is positively related to their success in engineering and computer science across program years. Likewise, longitudinally (across program years), students’ college social capital predicted their academic success and persistence to degree. With regard to the study’s comparative research of the first-year intervention, our results indicate that students who were enrolled in the first-year intervention program as freshmen continued to use more support practices to assist them in academic success across their degree matriculation compared to students who were not in the first-year program. This suggests that the students continued to recognize the value of such supports as a consequence of having supports required as first-year students. In terms of students’ understanding of scientific or engineering-focused concepts, we found significant impact resulting from student support practices that were academically focused. We also found that enrolling in the first-year intervention was a significant predictor of the time that students spent preparing for classes and ultimately their grade point average, especially in STEM subjects across students’ years in college. In summary, we found that the studied first-year intervention program has longitudinal, positive impacts on students’ success as they navigate through their undergraduate experiences toward engineering and computer science degrees. 
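Regression analyses of the kind described above can be illustrated with an ordinary least-squares sketch. Everything below is simulated and hypothetical (variable names, coefficients, and sample size are assumptions, not the study's data); it only shows the shape of a model relating intervention enrollment and background factors to GPA:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical predictors loosely mirroring the factors named above:
# first-year program enrollment, parental education, social capital.
intervention = rng.integers(0, 2, size=n).astype(float)
parent_ed = rng.normal(size=n)       # standardized
social_capital = rng.normal(size=n)  # standardized

# Simulated GPA with positive effects for all three predictors.
gpa = (2.8 + 0.20 * intervention + 0.15 * parent_ed
       + 0.10 * social_capital + rng.normal(scale=0.4, size=n))

# Ordinary least squares via numpy's least-squares solver.
X = np.column_stack([np.ones(n), intervention, parent_ed, social_capital])
beta, *_ = np.linalg.lstsq(X, gpa, rcond=None)
print("intercept, intervention, parent_ed, social_capital:", beta.round(2))
```

The fitted coefficient on the enrollment indicator recovers the simulated positive effect, which is the qualitative pattern the study reports for its first-year program.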
  2. Open-ended laboratory projects increase student success and retention in the sciences. However, developing organismal-based research projects is a challenge for students with restricted laboratory access, such as those attending courses remotely. Here I describe the use of image analysis of zebrafish neural development for authentic research projects in an introductory biology laboratory course. Zebrafish are a vertebrate model that produce large numbers of externally and rapidly developing embryos. Because zebrafish larvae are transparent, fluorescent reporters marking nervous system structures can be imaged over time and analyzed by undergraduate scientists. In the pilot of this project, remote first-year college students independently developed biological questions based on an image collection comparing zebrafish mutants and wild-type siblings. Students created and mastered techniques to analyze position, organization, and other morphological features of developing neurons and glia in the images to directly test their biological questions. At the end of the course, students communicated their project results in journal article format and oral presentations. Students were able to hone skills in organismal observation and data collection while studying remotely, and they reported excitement at applying lecture-based knowledge to their own independent questions. This module can be adapted by other instructors for both students on- and off-campus to teach principles of neural development, data collection, data analysis, and scientific communication. 
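The image measurements described, locating labeled cells in a fluorescence image and quantifying their positions, can be sketched with simple thresholding and connected-component labeling. The synthetic image below is a stand-in, not real zebrafish data, and the threshold and blob parameters are assumptions:

```python
import numpy as np
from collections import deque

def label_regions(mask):
    """4-connected component labeling of a boolean image (numpy only)."""
    labels = np.zeros(mask.shape, dtype=int)
    count = 0
    for y, x in zip(*np.nonzero(mask)):
        if labels[y, x]:
            continue
        count += 1
        labels[y, x] = count
        queue = deque([(y, x)])
        while queue:
            cy, cx = queue.popleft()
            for ny, nx in ((cy - 1, cx), (cy + 1, cx), (cy, cx - 1), (cy, cx + 1)):
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not labels[ny, nx]):
                    labels[ny, nx] = count
                    queue.append((ny, nx))
    return labels, count

# Synthetic "fluorescence image": dim background plus two bright
# cell bodies standing in for labeled neurons.
rng = np.random.default_rng(2)
img = rng.normal(loc=10.0, scale=1.0, size=(64, 64))
yy, xx = np.mgrid[0:64, 0:64]
for cy, cx in [(20, 15), (45, 50)]:
    img += 100.0 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 8.0)

# Threshold, label each bright region, and report weighted centroids.
labels, n_cells = label_regions(img > 50.0)
centroids = []
for i in range(1, n_cells + 1):
    ys, xs = np.nonzero(labels == i)
    w = img[ys, xs]
    centroids.append((np.average(ys, weights=w), np.average(xs, weights=w)))
    print(f"cell {i}: centroid = ({centroids[-1][0]:.1f}, {centroids[-1][1]:.1f})")
```

In a course setting, students would apply the same threshold-label-measure pipeline to exported microscope images rather than a synthetic array.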
  3. ABSTRACT Traditional lecture-centered approaches alone are inadequate for preparing students for the challenges of creative problem solving in the STEM disciplines. As an alternative, learner-centered and other high-impact pedagogies are gaining prominence. The Wabash College 3D Printing and Fabrication Center (3D-PFC) supports several initiatives on campus, but one of the most successful is a computer-aided design (CAD) and fabrication-based undergraduate research internship program. The first cohort of four students participated in an eight-week program during the summer of 2015. A second group of four students was successfully recruited to participate the following summer. This intensive materials science research experience challenged students to employ digital design and fabrication in the design, testing, and construction of inexpensive scientific instrumentation for use in introductory STEM courses at Wabash College. The student research interns ultimately produced a variety of successful new designs that could be produced for less than $25 per device and could detect analytes of interest down to concentrations in the parts per million (ppm) range. These student-produced instruments have enabled innovations in the way introductory instrumental analysis is taught on campus. Beyond summer work, the 3D-PFC employed student interns during the academic year, where they collaborated on various cross-disciplinary projects with students and faculty from departments such as mathematics, physics, biology, rhetoric, history, classics, and English. Thus far, the student work has led to three campus presentations, four presentations at national professional conferences, and three peer-reviewed publications. The following report highlights initial progress as well as preliminary assessment findings. 
  4. In teaching mechanics, we use multiple representations of vectors to develop concepts and analysis techniques. These representations include pictorials, diagrams, symbols, numbers and narrative language. Through years of study as students, researchers, and teachers, we develop a fluency rooted in a deep conceptual understanding of what each representation communicates. Many novice learners, however, struggle to gain such understanding and rely on superficial mimicry of the problem solving procedures we demonstrate in examples. The term representational competence refers to the ability to interpret, switch between, and use multiple representations of a concept as appropriate for learning, communication and analysis. In engineering statics, an understanding of what each vector representation communicates and how to use different representations in problem solving is important to the development of both conceptual and procedural knowledge. Science education literature identifies representational competence as a marker of true conceptual understanding. This paper presents development work for a new assessment instrument designed to measure representational competence with vectors in an engineering mechanics context. We developed the assessment over two successive terms in statics courses at a community college, a medium-sized regional university, and a large state university. We started with twelve multiple-choice questions that survey the vector representations commonly employed in statics. Each question requires the student to interpret and/or use two or more different representations of vectors and requires no calculation beyond single digit integer arithmetic. Distractor answer choices include common student mistakes and misconceptions drawn from the literature and from our teaching experience. 
We piloted these twelve questions as a timed section of the first exam in fall 2018 statics courses at both Whatcom Community College (WCC) and Western Washington University. Analysis of students’ unprompted use of vector representations on the open-ended problem-solving section of the same exam provides evidence of the assessment’s validity as a measurement instrument for representational competence. We found a positive correlation between students’ accurate and effective use of representations and their scores on the multiple-choice test. We gathered additional validity evidence by reviewing student responses on an exam wrapper reflection. We used item difficulty and item discrimination scores (point-biserial correlation) to eliminate two questions and revised the remaining questions to improve clarity and discriminatory power. We administered the revised version in two contexts: (1) again as part of the first exam in the winter 2019 statics course at WCC, and (2) as an extra credit opportunity for statics students at Utah State University. This paper includes sample questions from the assessment to illustrate the approach. The full assessment is available to interested instructors and researchers through an online tool. 
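The item statistics mentioned, difficulty and point-biserial discrimination used to cull weak questions, can be computed directly from a students-by-items score matrix. A minimal sketch on simulated responses (sample sizes, cutoffs, and data are assumptions, not the actual assessment):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated 0/1 responses: 150 students x 12 items driven by one
# latent skill (illustrative only, not the actual assessment data).
ability = rng.normal(size=(150, 1))
item_loc = rng.normal(scale=0.5, size=(1, 12))
prob = 1.0 / (1.0 + np.exp(-(ability - item_loc)))
scores = (rng.random((150, 12)) < prob).astype(float)

def item_stats(x):
    """Per-item difficulty (proportion correct) and corrected
    point-biserial discrimination (item vs. rest-of-test score)."""
    difficulty = x.mean(axis=0)
    total = x.sum(axis=1)
    discrimination = np.array([
        np.corrcoef(x[:, j], total - x[:, j])[0, 1]
        for j in range(x.shape[1])
    ])
    return difficulty, discrimination

difficulty, discrimination = item_stats(scores)
# Items below a conventional cutoff would be candidates for revision.
print("difficulty:    ", difficulty.round(2))
print("discrimination:", discrimination.round(2))
print("items with discrimination < 0.20:", np.nonzero(discrimination < 0.20)[0])
```

Correlating each item against the rest-of-test score (rather than the full total) avoids inflating discrimination by the item's correlation with itself, which is the usual corrected form of the statistic.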