This project supports the success of undergraduate engineering students through coordinated curriculum design across STEM course sequences. The Analysis, Design, Development, Implementation, Evaluation (ADDIE) framework and backward design are being used to develop guides that help instructors align learning outcomes, assessments, and instructional materials in a physics – engineering mechanics course sequence. The approach relies on analyzing the student learning outcomes in each course, identifying interdependent learning outcomes, and developing skills hierarchies in the form of visual learning maps. The learning maps illustrate the knowledge required and built upon throughout the course sequence. This study will assess the effectiveness of a course redesign intervention that uses visual learning maps and backward design concepts to guide instructors within a common course sequence in aligning learning outcomes and assessments. If successful, the intervention is expected to improve students’ primary learning and knowledge retention, as well as their persistence and success in the degree. The study will compare academic performance among Mechanical Engineering B.S., Environmental Engineering B.S., and Civil Engineering B.S. students who begin the Physics for Engineers – Statics – Dynamics course sequence before the intervention (control) and after the intervention (treatment). During control and treatment terms, students’ primary learning in individual courses will be assessed using established concept inventories. Retention of knowledge from prerequisite courses will be tracked using pre-identified problem sets (quizzes, exams) specifically associated with interdependent learning outcomes in the Statics and Dynamics courses. Students’ primary learning and knowledge retention in the sequence will then be related to longer-term student success outcomes, including retention and graduation.
The poster will show the results of the research team’s first year of work, including an analysis of current course materials, learning maps for each course, identification of interdependent learning outcomes, example guiding materials and templates for instructors, and preliminary student performance data from the control cohort. 
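The skills hierarchies and interdependent learning outcomes described above can be pictured as a small directed graph, where an edge from one outcome to another means the first is prerequisite knowledge for the second. The sketch below is a minimal illustration only; all course and outcome names in it are invented, not drawn from the project's actual learning maps.

```python
# Minimal sketch of a visual learning map as a directed graph.
# An edge A -> B means outcome A is prerequisite knowledge for outcome B.
# Course and outcome names are illustrative assumptions, not project data.

from collections import defaultdict

class LearningMap:
    def __init__(self):
        self.edges = defaultdict(set)   # prerequisite -> dependent outcomes
        self.course = {}                # outcome -> course where it is taught

    def add_outcome(self, outcome, course):
        self.course[outcome] = course

    def add_dependency(self, prerequisite, dependent):
        self.edges[prerequisite].add(dependent)

    def interdependent_outcomes(self):
        """Outcomes with a dependent in a different course: candidate ILOs."""
        ilos = set()
        for pre, deps in self.edges.items():
            for dep in deps:
                if self.course[pre] != self.course[dep]:
                    ilos.add(pre)
        return ilos

lmap = LearningMap()
lmap.add_outcome("vector addition", "Physics")
lmap.add_outcome("free-body diagrams", "Physics")
lmap.add_outcome("equilibrium of rigid bodies", "Statics")
lmap.add_dependency("vector addition", "equilibrium of rigid bodies")
lmap.add_dependency("free-body diagrams", "equilibrium of rigid bodies")
print(sorted(lmap.interdependent_outcomes()))
# -> ['free-body diagrams', 'vector addition']
```

Outcomes flagged this way are exactly those whose retention must be tracked across course boundaries rather than within a single course.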
                    This content will become publicly available on June 1, 2026
                            
Using Learning Maps and Bloom’s Taxonomy to Develop a New Instrument to Assess Knowledge Transfer from Physics to Statics Courses
This paper presents the design and analysis of a pilot problem set deployed to engineering students to assess their retention of physics knowledge at the start of a statics course. The problem set was developed using the NSF-IUSE (grant #2315492) Learning Map project (LMap) and piloted in the spring and fall of 2024. The LMap process is rooted in the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) model [1] and Backward Design [2,3], extending these principles to course sequences to align learning outcomes, assessments, and instructional practices. The primary motivation for this problem set (Statics Knowledge Inventory, SKI) was to evaluate students' understanding and retention of physics concepts at the beginning of a statics course. The SKI includes a combination of multiple-choice questions (MCQ) and procedural problems, filling a gap in widely used concept inventories for physics and statics, such as the Force Concept Inventory (FCI) and Statics Concept Inventory (SCI), which evaluate learning gains within a course rather than knowledge retention across courses. Using the LMap analysis and instructor consultations, we identified overlapping concepts and topics between Physics and Statics courses, referred to here as “interdependent learning outcomes” (ILOs). The problem set includes 15 questions: eight MCQs and seven procedural problems. Unlike most concept inventories, procedural problems were added to provide insight into students’ problem-solving approach and conceptual understanding. These problems require students to perform calculations and demonstrate their work, revealing their conceptual understanding of key topics and allowing instructors to assess essential prerequisite skills such as drawing free-body diagrams (FBDs), computing forces and moments, and performing basic vector calculations and unit conversions.
Problems were selected and adapted from physics and statics textbooks, supplemented by instructor-designed questions to ensure full coverage of the ILOs. We used the revised 2D Bloom’s Taxonomy [4] and a 3D representation of it [5] to classify each problem within a 6×4 matrix (six cognitive processes × four knowledge dimensions). This classification provided instructors and students with a clear understanding of the cognitive level required for each problem. Additionally, we measured students’ perceived confidence and difficulty for each problem using two questions on a 3-point Likert scale. The first iteration of the problem set was administered to 19 students in the spring 2024 statics course. After analyzing their performance, we identified areas for improvement and revised the problem set, removing repetitive MCQs and restructuring the procedural problems into scaffolded, multi-part questions with associated rubrics for evaluation. The revised version, consisting of five MCQs and six procedural problems, was deployed to 136 students in the fall 2024 statics course. A randomly selected subset of student answers from the second iteration was graded and analyzed for comparison with the first. This analysis will inform future efforts to evaluate knowledge retention and transfer in key skills across sequential courses. In collaboration with research teams developing concept inventories for mechanics courses, we aim to integrate these procedural problems into future inventories.
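The 6×4 classification amounts to a tally over (cognitive process, knowledge dimension) pairs. The sketch below illustrates this; the problem identifiers and their Bloom tags are invented examples, not the SKI's actual classification.

```python
# Hypothetical sketch of classifying problems into the revised Bloom's
# Taxonomy 6x4 matrix (cognitive process x knowledge dimension).

COGNITIVE = ["remember", "understand", "apply", "analyze", "evaluate", "create"]
KNOWLEDGE = ["factual", "conceptual", "procedural", "metacognitive"]

def classify(problems):
    """Return a 6x4 count matrix; `problems` maps an id to (process, knowledge)."""
    matrix = [[0] * len(KNOWLEDGE) for _ in COGNITIVE]
    for pid, (process, knowledge) in problems.items():
        matrix[COGNITIVE.index(process)][KNOWLEDGE.index(knowledge)] += 1
    return matrix

# Example tags (illustrative only):
sample = {
    "mcq_1": ("remember", "factual"),
    "mcq_2": ("understand", "conceptual"),
    "proc_1": ("apply", "procedural"),
    "proc_2": ("apply", "procedural"),
}
m = classify(sample)
print(m[COGNITIVE.index("apply")][KNOWLEDGE.index("procedural")])  # -> 2
```

A tally like this makes it easy to see whether an instrument over-samples some cells of the matrix (e.g., apply × procedural) and leaves others empty.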
- Award ID(s): 2315492
- PAR ID: 10635877
- Publisher / Repository: ASEE Conferences
- Date Published:
- Format(s): Medium: X
- Location: Montreal, Quebec, Canada
- Sponsoring Org: National Science Foundation
More Like this
- 
Students achieve functional knowledge retention through active, spaced repetition of concepts in homework, quizzes, and lectures. True knowledge retention is best achieved through proper comprehension of the concept. In the engineering curriculum, courses are sequenced into prerequisite chains of three to five courses per subfield, a design aimed at developing and reinforcing core concepts over time. Knowledge retention of these prerequisite concepts is important for the next course. In this project, concept review quizzes were used to identify gaps and deficiencies in students' prerequisite knowledge and to measure improvement after a concept review intervention. Two quizzes (pre-intervention and post-intervention) drew inspiration from the standard concept inventories for fundamental concepts and included concepts such as Free Body Diagrams, Contact and Reaction Forces, Equilibrium Equations, and Calculation of the Moment. Concept inventories are typically multiple-choice; in this evaluation, the concept questions were open-ended. A clear rubric was created to identify the missing prerequisite concepts in the students' knowledge. These quizzes were deployed in Mechanics of Materials, a second-level course in the engineering mechanics curriculum (the second in a sequence of four courses: Statics, Mechanics of Materials, Mechanical Design, and Kinematic Design). The pre-quiz was administered (unannounced) at the beginning of the class. The class then actively participated in a 30-minute concept review, and a different post-quiz was administered in the same class period after the review. Quizzes were graded with a rubric to measure the effect of the concept review intervention on the students’ knowledge demonstration and calculations. The study evaluated four major concepts: free body diagrams, boundary reaction forces (fixed, pin, and contact), equilibrium, and moment calculation.
Students showed improvements of up to 39% in drawing a free body diagram with a fixed boundary condition, but continued to struggle with free body diagrams involving contact forces. This study was performed at a large public institution in a class of 240 students; a total of 224 students consented to the use of their data for this study (and attended class on the day of the intervention). The pre-quiz is used to determine the gaps (or deficiencies) in conceptual understanding among students. The post-quiz measures the response to the review and is used to determine which concept deficiencies were significantly improved by the review and which were not. This study presents a concept quiz and associated rubric for measuring student improvement resulting from an in-class intervention (concept review). It quantifies a significant improvement in the students’ retrieval of prerequisite knowledge after a concept review session. This approach, therefore, has utility for improving knowledge retention in programs with a similar, sequenced course design.
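The pre/post improvement measurement described above amounts to comparing mean rubric scores per concept before and after the review. A minimal sketch follows; the concept labels and rubric scores (on a 0–1 scale) are invented for illustration, not the study's data.

```python
# Illustrative per-concept improvement between an unannounced pre-quiz and a
# post-review quiz. Scores are invented rubric values on a 0-1 scale.

def improvement(pre, post):
    """Percentage-point change in mean rubric score for each concept."""
    return {c: round((sum(post[c]) / len(post[c]) -
                      sum(pre[c]) / len(pre[c])) * 100, 1)
            for c in pre}

pre  = {"FBD (fixed support)": [0.4, 0.5, 0.3],
        "FBD (contact forces)": [0.3, 0.4, 0.2]}
post = {"FBD (fixed support)": [0.8, 0.9, 0.7],
        "FBD (contact forces)": [0.4, 0.4, 0.3]}
print(improvement(pre, post))
# -> {'FBD (fixed support)': 40.0, 'FBD (contact forces)': 6.7}
```

In this invented example, the fixed-support concept responds strongly to the review while the contact-force concept barely moves, mirroring the kind of uneven response the study reports.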
- 
This study investigates how Learning Assistants (LAs) and related course features are associated with inequities in student learning in introductory university physics courses. Paired pre- and post-test scores on concept inventories from 2,868 physics students in 67 classes at 16 LA Alliance member institutions are examined in this investigation. The concept inventories included the Force Concept Inventory, the Force and Motion Conceptual Evaluation, and the Conceptual Survey of Electricity and Magnetism. Our analyses include a multiple linear regression model that examines the impact of student-level (e.g., gender and race) and course-level variables (e.g., presence of LAs and concept inventory used) on student learning outcomes (Cohen’s d effect size) across classroom contexts. The presence of LAs was found to either remove or invert the traditional learning gaps between students from dominant and non-dominant populations. Significant differences in student performance were also found across the concept inventories.
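Cohen's d, the effect-size measure used as the learning outcome above, standardizes the pre/post score difference by a pooled standard deviation. A minimal sketch with invented scores (not the study's data):

```python
# Minimal sketch of Cohen's d as a pre/post learning effect size.
# The score lists below are invented for illustration.

import statistics

def cohens_d(pre, post):
    """Standardized mean difference using the pooled sample standard deviation."""
    n1, n2 = len(pre), len(post)
    s1, s2 = statistics.variance(pre), statistics.variance(post)  # sample variances
    pooled_sd = (((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(post) - statistics.mean(pre)) / pooled_sd

pre  = [10, 12, 11, 13, 9, 12]   # hypothetical pre-test scores
post = [15, 17, 14, 18, 13, 16]  # hypothetical post-test scores
print(round(cohens_d(pre, post), 2))
```

Because d is unitless, it allows learning gains to be compared across classes that used different concept inventories, which is what makes it usable as the outcome variable in the regression model described above.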
- 
We reflect on our ongoing journey in the educational Cybersecurity Assessment Tools (CATS) Project to create two concept inventories for cybersecurity. We identify key steps in this journey and important questions we faced. We explain the decisions we made and discuss the consequences of those decisions, highlighting what worked well and what might have gone better. The CATS Project is creating and validating two concept inventories (conceptual tests of understanding) that can be used to measure the effectiveness of various approaches to teaching and learning cybersecurity. The Cybersecurity Concept Inventory (CCI) is for students who have recently completed any first course in cybersecurity; the Cybersecurity Curriculum Assessment (CCA) is for students who have recently completed an undergraduate major or track in cybersecurity. Each assessment tool comprises 25 multiple-choice questions (MCQs) of various difficulties that target the same five core concepts, but the CCA assumes greater technical background.
- 
We reflect on our ongoing journey in the educational Cybersecurity Assessment Tools (CATS) Project to create two concept inventories for cybersecurity. We identify key steps in this journey and important questions we faced. We explain the decisions we made and discuss the consequences of those decisions, highlighting what worked well and what might have gone better. The CATS Project is creating and validating two concept inventories (conceptual tests of understanding) that can be used to measure the effectiveness of various approaches to teaching and learning cybersecurity. The Cybersecurity Concept Inventory (CCI) is for students who have recently completed any first course in cybersecurity; the Cybersecurity Curriculum Assessment (CCA) is for students who have recently completed an undergraduate major or track in cybersecurity. Each assessment tool comprises 25 multiple-choice questions (MCQs) of various difficulties that target the same five core concepts, but the CCA assumes greater technical background. Key steps include defining project scope, identifying the core concepts, uncovering student misconceptions, creating scenarios, drafting question stems, developing distractor answer choices, generating educational materials, performing expert reviews, recruiting student subjects, organizing workshops, building community acceptance, forming a team and nurturing collaboration, adopting tools, and obtaining and using funding. Creating effective MCQs is difficult and time-consuming, and cybersecurity presents special challenges. Because cybersecurity issues are often subtle, where the adversarial model and details matter greatly, it is challenging to construct MCQs for which there is exactly one best but non-obvious answer. We hope that our experiences and lessons learned may help others create more effective concept inventories and assessments in STEM.
 An official website of the United States government