Title: Student Metacognitive Reflection on a Conceptual Statics Question
Many engineering problems assigned in undergraduate classes are numerical and can be solved using equations and algorithms; for example, truss problems in statics are often solved using the method of joints or the method of sections. Concept questions, which can be administered in class using active learning pedagogies, aid in the development of conceptual understanding as opposed to the procedural skill often emphasized in numerical problems. We administered a concept question about a truss to 241 statics students at six diverse institutions and found no statistically significant differences in answer correctness or confidence between institutions. Across institutions, students report that they are not accustomed to such non-numerical concept questions, but they grapple with the experience in different ways. Some frame engineering as inherently numerical and thus do not value the conceptual understanding assessed by the question, while others recognize that developing conceptual knowledge is useful and will translate to their future engineering work.
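For readers unfamiliar with the procedural methods the abstract contrasts with conceptual questions, the sketch below applies the method of joints to a minimal two-member truss. The geometry, load value, and variable names are illustrative assumptions and are unrelated to the concept question used in the study.

```python
# Method of joints for a minimal two-member truss (illustrative sketch only).
# Geometry, load, and names are assumptions chosen for demonstration.
import numpy as np

# Node coordinates (m): two pin supports A, B and a loaded joint C at the apex.
A, B, C = np.array([0.0, 0.0]), np.array([4.0, 0.0]), np.array([2.0, 3.0])
load_at_C = np.array([0.0, -10.0])  # external load on joint C, kN (downward)

def unit(p, q):
    """Unit vector pointing from point p toward point q."""
    d = q - p
    return d / np.linalg.norm(d)

# Equilibrium at joint C with tension-positive member forces N_AC, N_BC:
#   N_AC * unit(C->A) + N_BC * unit(C->B) + load_at_C = 0
coeffs = np.column_stack([unit(C, A), unit(C, B)])  # 2x2 coefficient matrix
forces = np.linalg.solve(coeffs, -load_at_C)

for name, n in zip(["AC", "BC"], forces):
    state = "tension" if n > 0 else "compression"
    print(f"Member {name}: {n:+.2f} kN ({state})")
# Both members carry about -6.01 kN (compression), as expected for an
# A-frame-like pair of members supporting a downward load at the apex.
```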
Award ID(s):
2135190
PAR ID:
10444683
Author(s) / Creator(s):
Date Published:
Journal Name:
2023 ASEE Annual Conference & Exposition
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. This paper presents the design and analysis of a pilot problem set deployed to engineering students to assess their retention of physics knowledge at the start of a statics course. The problem set was developed using the NSF-IUSE (grant #2315492) Learning Map project (LMap) and piloted in the spring and fall of 2024. The LMap process is rooted in the Analysis, Design, Development, Implementation, and Evaluation (ADDIE) model [1] and Backward Design [2,3], extending these principles to course sequences to align learning outcomes, assessments, and instructional practices. The primary motivation for this problem set (Statics Knowledge Inventory, SKI) was to evaluate students' understanding and retention of physics concepts at the beginning of a statics course. The SKI includes a combination of multiple-choice questions (MCQs) and procedural problems, filling a gap left by widely used concept inventories for physics and statics, such as the Force Concept Inventory (FCI) and Statics Concept Inventory (SCI), which evaluate learning gains within a course rather than knowledge retention across courses. Using the LMap analysis and instructor consultations, we identified overlapping concepts and topics between physics and statics courses, referred to here as "interdependent learning outcomes" (ILOs). The problem set includes 15 questions: eight MCQs and seven procedural problems. Unlike most concept inventories, the SKI includes procedural problems to provide insight into students' problem-solving approaches and conceptual understanding. These problems require students to perform calculations, show their work, and demonstrate their conceptual understanding of key topics, and they allow instructors to assess essential prerequisite skills such as drawing free-body diagrams (FBDs), computing forces and moments, and performing basic vector calculations and unit conversions. Problems were selected and adapted from physics and statics textbooks, supplemented by instructor-designed questions to ensure full coverage of the ILOs. We used the revised 2D Bloom's Taxonomy [4] and a 3D representation of it [5] to classify each problem within a 6×4 matrix (six cognitive processes × four knowledge dimensions). This classification gave instructors and students a clear picture of the cognitive level required for each problem. Additionally, we measured students' perceived confidence and difficulty for each problem using two questions on a 3-point Likert scale. The first iteration of the problem set was administered to 19 students in the spring 2024 statics course. After analyzing their performance, we identified areas for improvement and revised the problem set, removing repetitive MCQs and restructuring the procedural problems into scaffolded, multi-part questions with associated rubrics for evaluation. The revised version, consisting of five MCQs and six procedural problems, was deployed to 136 students in the fall 2024 statics course. A randomly selected subset of student answers from the second iteration was graded and analyzed for comparison with the first. This analysis will inform future efforts to evaluate knowledge retention and transfer in key skills across sequential courses. In collaboration with research teams developing concept inventories for mechanics courses, we aim to integrate these procedural problems into future inventories.
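As a rough illustration of the classification step described above, the sketch below tallies a handful of hypothetical item classifications into the 6×4 revised Bloom's taxonomy matrix so that empty cells are easy to spot. The item labels and their placements are invented for demonstration and are not the actual SKI mapping.

```python
# Tally problem-set items into a 6x4 matrix (cognitive process x knowledge
# dimension) from the revised Bloom's taxonomy. Classifications are hypothetical.
from collections import Counter

PROCESSES = ["Remember", "Understand", "Apply", "Analyze", "Evaluate", "Create"]
KNOWLEDGE = ["Factual", "Conceptual", "Procedural", "Metacognitive"]

# item id -> (cognitive process, knowledge dimension); illustrative values only
classification = {
    "MCQ1": ("Remember", "Factual"),
    "MCQ2": ("Understand", "Conceptual"),
    "P1_fbd": ("Apply", "Procedural"),
    "P2_moment": ("Apply", "Procedural"),
    "P3_units": ("Analyze", "Procedural"),
}

counts = Counter(classification.values())

# Print the coverage matrix so instructors can spot empty cells at a glance.
print(f"{'':<12}" + "".join(f"{k:<15}" for k in KNOWLEDGE))
for proc in PROCESSES:
    row = f"{proc:<12}" + "".join(f"{counts.get((proc, k), 0):<15}" for k in KNOWLEDGE)
    print(row)
```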
  2. In teaching mechanics, we use multiple representations of vectors to develop concepts and analysis techniques. These representations include pictorials, diagrams, symbols, numbers, and narrative language. Through years of study as students, researchers, and teachers, we develop a fluency rooted in a deep conceptual understanding of what each representation communicates. Many novice learners, however, struggle to gain such understanding and rely on superficial mimicry of the problem-solving procedures we demonstrate in examples. The term representational competence refers to the ability to interpret, switch between, and use multiple representations of a concept as appropriate for learning, communication, and analysis. In engineering statics, an understanding of what each vector representation communicates and how to use different representations in problem solving is important to the development of both conceptual and procedural knowledge. Science education literature identifies representational competence as a marker of true conceptual understanding. This paper presents development work for a new assessment instrument designed to measure representational competence with vectors in an engineering mechanics context. We developed the assessment over two successive terms in statics courses at a community college, a medium-sized regional university, and a large state university. We started with twelve multiple-choice questions that survey the vector representations commonly employed in statics. Each question requires the student to interpret and/or use two or more different representations of vectors and requires no calculation beyond single-digit integer arithmetic. Distractor answer choices include common student mistakes and misconceptions drawn from the literature and from our teaching experience. We piloted these twelve questions as a timed section of the first exam in fall 2018 statics courses at both Whatcom Community College (WCC) and Western Washington University. Analysis of students’ unprompted use of vector representations on the open-ended problem-solving section of the same exam provides evidence of the assessment’s validity as a measurement instrument for representational competence. We found a positive correlation between students’ accurate and effective use of representations and their scores on the multiple-choice test. We gathered additional validity evidence by reviewing student responses on an exam wrapper reflection. We used item difficulty and item discrimination scores (point-biserial correlation) to eliminate two questions and revised the remaining questions to improve clarity and discriminatory power. We administered the revised version in two contexts: (1) again as part of the first exam in the winter 2019 statics course at WCC, and (2) as an extra-credit opportunity for statics students at Utah State University. This paper includes sample questions from the assessment to illustrate the approach. The full assessment is available to interested instructors and researchers through an online tool.
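As a rough illustration of the item statistics mentioned above, the sketch below computes item difficulty (proportion correct) and a point-biserial discrimination index for dichotomously scored items. The response matrix is fabricated, and for simplicity the total score here includes the item being analyzed, which differs from the corrected variant sometimes used in practice.

```python
# Item difficulty and point-biserial discrimination for 0/1-scored items.
# The response matrix is fabricated for illustration only.
import numpy as np

# rows = students, columns = items; 1 = correct, 0 = incorrect (made-up data)
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 0, 1, 0],
])

total = responses.sum(axis=1)  # each student's total score

for j in range(responses.shape[1]):
    item = responses[:, j]
    difficulty = item.mean()  # proportion of students answering correctly
    # Point-biserial index: Pearson correlation between the 0/1 item score
    # and the total score (uncorrected, i.e., the item is included in the total).
    r_pb = np.corrcoef(item, total)[0, 1]
    print(f"Item {j + 1}: difficulty = {difficulty:.2f}, point-biserial = {r_pb:.2f}")
```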
  3. It is challenging to educate effectively in large classes with students from a multitude of backgrounds. Many introductory engineering courses at universities have hundreds of students, and some online classes are even larger. Instructors in these circumstances often turn to online homework systems, which greatly reduce the grading burden; however, this comes at the cost of reducing the quality of feedback that students receive. Since online systems typically can only automatically grade multiple-choice or numeric-answer questions, students predominantly do not receive feedback on the critical skill of sketching free-body diagrams (FBDs). Mechanix, an online sketch-recognition-based tutoring system for introductory statics courses, requires students to draw free-body diagrams and grades both the diagrams and the final answers. Students receive feedback about their diagrams that would otherwise be difficult for instructors to provide in large classes. Additionally, Mechanix can grade open-ended truss design problems with an indeterminate number of solutions. Mechanix has been used for over six semesters by over 1,000 students at five different universities to study its effectiveness. Students used Mechanix for one to three homework assignments covering free-body diagrams, static truss analysis, and truss design for an open-ended problem. Preliminary results suggest the system increases homework engagement and effort for students who are struggling and is as effective as other homework systems for teaching statics. Focus groups showed students enjoyed using Mechanix and that it helped their learning process.
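One objective check an auto-grader could run on an arbitrary student-designed planar truss is the static determinacy count m + r = 2j (members, reaction components, joints). The sketch below is a hypothetical illustration of that check, not Mechanix's actual grading logic.

```python
# A rough sketch of one check an auto-grader might run on a submitted planar
# truss design: the static determinacy count m + r = 2j. Illustrative only;
# this is not Mechanix's implementation.

def determinacy(num_members: int, num_reactions: int, num_joints: int) -> str:
    """Classify a planar truss by comparing m + r with 2j."""
    lhs, rhs = num_members + num_reactions, 2 * num_joints
    if lhs == rhs:
        return "statically determinate (solvable by the method of joints)"
    if lhs > rhs:
        return f"statically indeterminate to degree {lhs - rhs}"
    return f"mechanism / unstable (short by {rhs - lhs} equilibrium equations)"

# Example: a simple triangular truss with a pin (2 reactions) and a roller (1).
print(determinacy(num_members=3, num_reactions=3, num_joints=3))
```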