Abstract: Critical thinking, which can be defined as the evidence‐based ways in which people decide what to trust and what to do, is an important competency included in many undergraduate science, technology, engineering, and mathematics (STEM) courses. To help instructors effectively measure critical thinking, we developed the Biology Lab Inventory of Critical Thinking in Ecology (Eco‐BLIC), a freely available, closed‐response assessment of undergraduate students' critical thinking in ecology. The Eco‐BLIC includes ecology‐based experimental scenarios followed by questions that measure how students decide what to trust and what to do next. Here, we present the development of the Eco‐BLIC using tests of validity and reliability. Using student responses to questions and think‐aloud interviews, we demonstrate the effectiveness of the Eco‐BLIC at measuring students' critical thinking skills. We find that while students generally think like experts when evaluating what to trust, their responses are less expert‐like when deciding what to do next.
This content will become publicly available on July 14, 2026.

Investigating dimensions of instructor trust using the words of undergraduate STEM students
Recent work has shown that student trust in their instructor is a key moderator of STEM student buy-in to evidence-based teaching practices (EBTs), enhancing positive student outcomes such as performance, engagement, and persistence. Although trust in the instructor has been operationalized previously in related settings, a systematic classification of how undergraduate STEM students perceive trustworthiness in their instructors remains to be developed. Moreover, previous operationalizations impose a structure that often includes distinct domains, such as cognitive and affective trust, that have yet to be empirically tested in the undergraduate STEM context.

Methods: To address this gap, we engage in a multi-step qualitative approach to unify existing definitions of trust from the literature and analyze structured interviews with 57 students enrolled in undergraduate STEM classes who were asked to describe a trusted instructor. Through thematic analysis, we propose that characteristics of a trustworthy instructor can be classified into three domains. We then assess the validity of the three-domain model both qualitatively and quantitatively. First, we examine student responses to determine how traits from different domains are mentioned together. Second, we use a process-model approach to instrument design that leverages our qualitative interview codebook to develop a survey that measures student trust. We performed an exploratory factor analysis on the survey responses to quantitatively test the construct validity of our proposed three-domain trust model.

Results and discussion: We identified 28 instructor traits that students perceived as trustworthy, categorized into cognitive, affective, and relational domains. Within student responses, we found a high degree of interconnectedness between traits in the cognitive and relational domains. When we assessed the construct validity of the three-factor model using survey responses, we found that a three-factor model did not adequately capture the underlying latent structure. Our findings align with recent calls to closely examine long-held assumptions about trust dimensionality and to develop context-specific trust measurements. The work presented here can inform the development of a reliable measure of student trust in undergraduate STEM environments and ultimately improve our understanding of how instructors can best leverage the effectiveness of EBTs for positive student learning outcomes.
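The quantitative step described above, running an exploratory factor analysis (EFA) to test whether survey responses support a three-domain (cognitive, affective, relational) trust structure, can be illustrated with a minimal sketch. The code below is not the authors' analysis: it assumes the Python factor_analyzer package and a hypothetical file trust_survey_responses.csv of Likert-scale responses, one column per trust item.

```python
# Minimal EFA sketch (hypothetical data and file name, not the study's code).
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Hypothetical survey data: rows = students, columns = trust items (Likert scale).
responses = pd.read_csv("trust_survey_responses.csv")  # placeholder file name

# Check whether the item correlations are suitable for factor analysis.
chi_square, p_value = calculate_bartlett_sphericity(responses)
kmo_per_item, kmo_overall = calculate_kmo(responses)
print(f"Bartlett p = {p_value:.3g}, overall KMO = {kmo_overall:.2f}")

# Fit a three-factor EFA with an oblique rotation, since the hypothesized
# cognitive, affective, and relational domains would plausibly correlate.
efa = FactorAnalyzer(n_factors=3, rotation="oblimin")
efa.fit(responses)

# Inspect loadings and variance explained to judge whether three factors
# adequately capture the latent structure underlying the responses.
loadings = pd.DataFrame(efa.loadings_, index=responses.columns)
print(loadings.round(2))
print(efa.get_factor_variance())  # (SS loadings, proportion, cumulative)
```

Under these assumptions, the judgment that "a three-factor model did not adequately capture the underlying latent structure" would rest on the pattern of loadings, cross-loadings, and cumulative variance explained rather than on any single statistic.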
- Award ID(s): 1323258
- PAR ID: 10621146
- Publisher / Repository: Frontiers in Education
- Date Published:
- Journal Name: Frontiers in Education
- Volume: 10
- ISSN: 2504-284X
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Background: Despite well‐documented benefits, instructor adoption of active learning has been limited in engineering education. Studies have identified barriers to instructors' adoption of active learning, but there is no well‐tested instrument to measure instructors' perceptions of these barriers. Purpose: We developed and tested an instrument to measure instructors' perceptions of barriers to adopting active learning and to identify the constructs that coherently categorize those barriers. Method: We used a five‐phase process to develop an instrument to measure instructors' perceived barriers to adopting active learning. In Phase 1, we built upon the Faculty Instructional Barriers and Identity Survey (FIBIS) to create a draft instrument. In Phases 2 and 3, we conducted exploratory factor analysis (EFA) on an initial 45‐item instrument and a refined 21‐item instrument, respectively. We conducted confirmatory factor analysis (CFA) in Phases 4 and 5 to test the factor structure identified in Phases 2 and 3. Results: Our final instrument consists of 17 items and four factors: (1) student preparation and engagement; (2) instructional support; (3) instructor comfort and confidence; and (4) institutional environment/rewards. Instructor responses indicated that time considerations do not emerge as a standalone factor. Conclusions: Our 17‐item instrument exhibits a sound factor structure and is reliable, enabling the assessment of perceived barriers to adopting active learning in different contexts. The four factors align with an existing model of instructional change in science, technology, engineering, and mathematics (STEM). Although time is a substantial instructor concern that did not comprise a standalone factor, it is closely related to multiple constructs in our final model. (A hedged CFA sketch illustrating this kind of factor-structure test appears after this list.)
- Despite many studies confirming that active learning in STEM classrooms improves student outcomes, instructors' adoption of active learning has been surprisingly slow. This work-in-progress paper describes our broader research study, in which we compare the efficacy of a traditional active learning workshop (AL) and an extended version of this workshop that also specifically highlights instructor strategies to reduce resistance (AL+) on instructors' beliefs about and actual adoption of active learning in undergraduate STEM classrooms. Through a randomized control trial (RCT), we aim to understand the ways in which these workshops influence instructors' motivation to adopt and actual use of active learning. This RCT involves instructors and students at a large number of institutions, including two-year colleges, four-year colleges, and large research institutions in three regions of the country, as well as strategies to reduce student resistance to active learning. We have developed and piloted three instruments, which allow for triangulation of classroom data: an instructor survey, a student survey, and a classroom observation protocol. This work-in-progress paper will cover the current progress of our research study and present our research instruments.
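The EFA-then-CFA workflow described in the first related record above can be illustrated with the sketch referenced there. It is a hypothetical example, not the authors' code: it assumes the Python semopy package, placeholder item names q1 through q17, and an invented assignment of items to the four reported factors.

```python
# Hypothetical CFA sketch for a four-factor "barriers to active learning" model.
# Item names q1-q17 and their factor assignments are placeholders only.
import pandas as pd
import semopy

# lavaan-style measurement model: each latent factor is measured by a block of items.
MODEL_DESC = """
student_prep_engagement =~ q1 + q2 + q3 + q4 + q5
instructional_support   =~ q6 + q7 + q8 + q9
instructor_confidence   =~ q10 + q11 + q12 + q13
institutional_rewards   =~ q14 + q15 + q16 + q17
"""

# Hypothetical responses: rows = instructors, columns = q1..q17.
data = pd.read_csv("barriers_survey_responses.csv")  # placeholder file name

model = semopy.Model(MODEL_DESC)
model.fit(data)

# Global fit indices (chi-square, CFI, TLI, RMSEA, ...) summarize how well the
# hypothesized four-factor structure reproduces the observed item covariances.
print(semopy.calc_stats(model).T)

# Parameter estimates: factor loadings and (co)variances for the fitted model.
print(model.inspect())
```

In such a workflow, the CFA fit indices and loadings, together with reliability estimates for each factor, would be the evidence for a "sound factor structure" like the one the related record reports.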