

This content will become publicly available on June 1, 2024

Title: Student Metacognitive Reflection on a Conceptual Statics Question
Many engineering problems assigned in undergraduate classes are numerical and can be solved using equations and algorithms—for example, truss problems in statics are often solved using the method of joints or the method of sections. Concept questions, which can be administered in class using active-learning pedagogies, aid in the development of conceptual understanding, as opposed to the procedural skill often emphasized in numerical problems. We administered a concept question about a truss to 241 statics students at six diverse institutions and found no statistically significant differences in answer correctness or confidence between institutions. Across institutions, students reported that they are not accustomed to such non-numerical concept questions, but they grappled with the experience in different ways. Some framed engineering as inherently numerical, and thus did not value the conceptual understanding assessed by the question, while others recognized that developing conceptual knowledge is useful and will translate to their future engineering work.
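For contrast with the conceptual question studied here, the numerical method of joints the abstract mentions reduces each truss pin to two equilibrium equations. A minimal sketch, assuming an invented two-member joint geometry and load (none of these values come from the paper):

```python
import math

def solve_joint(th1_deg, th2_deg, P):
    """Equilibrium of a single truss pin (method of joints).

    Two unknown member forces F1, F2 (tension positive) act along
    members at angles th1, th2 from the horizontal; a load P pulls
    the pin straight down.  Sum Fx = 0 and Sum Fy = 0 give a 2x2
    linear system, solved here by Cramer's rule.
    """
    th1, th2 = math.radians(th1_deg), math.radians(th2_deg)
    a11, a12 = math.cos(th1), math.cos(th2)  # x-components of member unit vectors
    a21, a22 = math.sin(th1), math.sin(th2)  # y-components
    det = a11 * a22 - a12 * a21
    F1 = -a12 * P / det  # Cramer's rule with right-hand side (0, P)
    F2 = a11 * P / det
    return F1, F2

# Symmetric two-member joint: members at 45 and 135 degrees, 10 kN load.
F1, F2 = solve_joint(45.0, 135.0, 10.0)
```

For this symmetric joint, each member carries P/√2 ≈ 7.07 kN in tension; the conceptual question studied in the paper asks about such behavior without requiring the computation.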
Award ID(s):
2135190
NSF-PAR ID:
10444683
Author(s) / Creator(s):
;
Date Published:
Journal Name:
2023 ASEE Annual Conference & Exposition
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. Abstract

    Background

    Conceptual understanding is critical to both engineering education and practice. In fact, many undergraduate courses focus on developing students' knowledge and understanding of core engineering concepts. At the same time, a growing body of literature points to substantial gaps across educational and professional practice contexts, including how problems are embodied and solved.

    Purpose

    The purpose of this study was to explore potential differences in conceptual understanding across engineering students and professional engineers. To do so, we compared the responses of civil engineering practitioners to the Statics Concept Inventory (SCI) to those of engineering students enrolled in statics courses.

    Design/Method

    We administered the SCI to 95 practicing civil engineers and compared their responses to an existing dataset from 1,372 engineering students. We conducted three comparisons: overall SCI score, concept subscores, and item‐by‐item.

    Results

    Students generally outperformed engineers on the SCI in terms of overall performance. However, on closer inspection, students' superior performance appears to be driven by differences in knowledge or understanding of specific statics concepts rather than a stronger understanding in general.

    Conclusions

    We caution against interpretations that imply students have a better understanding of statics concepts. Instead, our results suggest that differences in the way concepts are situated and applied across school and workplace contexts might account for the differences in the performance observed. These findings raise critical questions regarding the nature of concepts and the immutability of common academic representations, and point to the need for further qualitative and exploratory work investigating concepts in practice.
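The overall-score comparison described under Design/Method could be sketched as a two-sample test. Welch's t statistic is one standard choice when group variances may differ (not necessarily the statistic the authors used), and the scores below are invented toy values, not the study's dataset:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent groups with
    possibly unequal variances."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se

# Invented toy overall scores -- NOT the study's data.
students = [20, 22, 19, 24, 21, 23]
engineers = [17, 19, 16, 20, 18, 17]
t = welch_t(students, engineers)
```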

     
  2. In teaching mechanics, we use multiple representations of vectors to develop concepts and analysis techniques. These representations include pictorials, diagrams, symbols, numbers, and narrative language. Through years of study as students, researchers, and teachers, we develop a fluency rooted in a deep conceptual understanding of what each representation communicates. Many novice learners, however, struggle to gain such understanding and rely on superficial mimicry of the problem-solving procedures we demonstrate in examples. The term representational competence refers to the ability to interpret, switch between, and use multiple representations of a concept as appropriate for learning, communication, and analysis. In engineering statics, an understanding of what each vector representation communicates and how to use different representations in problem solving is important to the development of both conceptual and procedural knowledge. Science education literature identifies representational competence as a marker of true conceptual understanding. This paper presents development work for a new assessment instrument designed to measure representational competence with vectors in an engineering mechanics context. We developed the assessment over two successive terms in statics courses at a community college, a medium-sized regional university, and a large state university. We started with twelve multiple-choice questions that survey the vector representations commonly employed in statics. Each question requires the student to interpret and/or use two or more different representations of vectors and requires no calculation beyond single-digit integer arithmetic. Distractor answer choices include common student mistakes and misconceptions drawn from the literature and from our teaching experience.
We piloted these twelve questions as a timed section of the first exam in fall 2018 statics courses at both Whatcom Community College (WCC) and Western Washington University. Analysis of students' unprompted use of vector representations on the open-ended problem-solving section of the same exam provides evidence of the assessment's validity as a measurement instrument for representational competence. We found a positive correlation between students' accurate and effective use of representations and their score on the multiple-choice test. We gathered additional validity evidence by reviewing student responses on an exam-wrapper reflection. We used item difficulty and item discrimination scores (point-biserial correlation) to eliminate two questions, and we revised the remaining questions to improve clarity and discriminatory power. We administered the revised version in two contexts: (1) again as part of the first exam in the winter 2019 statics course at WCC, and (2) as an extra-credit opportunity for statics students at Utah State University. This paper includes sample questions from the assessment to illustrate the approach. The full assessment is available to interested instructors and researchers through an online tool.
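The item-discrimination index named above, the point-biserial correlation, relates each dichotomous (right/wrong) item score to the total test score. A minimal pure-Python sketch with invented response data (not the pilot data):

```python
from statistics import mean, pstdev

def point_biserial(item, totals):
    """Point-biserial correlation between a 0/1 item score and the
    total test score -- a standard item-discrimination index:
    r_pb = (M1 - M0) / s * sqrt(p * q), where M1 and M0 are mean
    totals for correct and incorrect responders, s is the population
    std. dev. of totals, and p is the proportion answering correctly."""
    ones = [t for i, t in zip(item, totals) if i == 1]
    zeros = [t for i, t in zip(item, totals) if i == 0]
    p = len(ones) / len(item)
    return (mean(ones) - mean(zeros)) / pstdev(totals) * (p * (1 - p)) ** 0.5

# Invented responses for one item across eight students -- not real data.
item = [1, 1, 0, 1, 0, 0, 1, 1]      # 1 = answered this item correctly
totals = [9, 8, 4, 10, 5, 3, 7, 9]   # total test scores
r = point_biserial(item, totals)     # higher r -> item discriminates better
```

Items with low (or negative) r_pb fail to separate strong from weak test-takers, which is the basis the authors describe for eliminating questions.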
  3. Several consensus reports cite a critical need to dramatically increase the number and diversity of STEM graduates over the next decade. They conclude that a change to evidence-based instructional practices, such as concept-based active learning, is needed. Concept-based active learning involves the use of activity-based pedagogies whose primary objectives are to make students value deep conceptual understanding (instead of only factual knowledge) and then to facilitate their development of that understanding. Concept-based active learning has been shown to increase academic engagement and student achievement, to significantly improve student retention in academic programs, and to reduce the performance gap of underrepresented students. Fostering students' mastery of fundamental concepts is central to real-world problem solving, including several elements of engineering practice. Unfortunately, simply proving that these instructional practices are more effective than traditional methods for promoting student learning, for increasing retention in academic programs, and for improving ability in professional practice is not enough to ensure widespread pedagogical change. In fact, the biggest challenge to improving STEM education is not the need to develop more effective instructional practices, but to find ways to get faculty to adopt the evidence-based pedagogies that already exist. In this project we seek to propagate the Concept Warehouse, a technological innovation designed to foster concept-based active learning, into Mechanical Engineering (ME) and to study student learning with this tool in five diverse institutional settings. The Concept Warehouse (CW) is a web-based instructional tool that we developed for Chemical Engineering (ChE) faculty.
It houses over 3,500 ConcepTests, which are short questions that can rapidly be deployed to engage students in concept-oriented thinking and/or to assess students' conceptual knowledge, along with more extensive concept-based active learning tools. The CW has grown rapidly during this project and now has over 1,600 faculty accounts and over 37,000 student users. New ConcepTests were created during the current reporting period; the current numbers of questions for Statics, Dynamics, and Mechanics of Materials are 342, 410, and 41, respectively. A detailed review process is in progress, and will continue through the no-cost extension year, to refine question clarity and to identify types of new questions to fill gaps in content coverage. Since June 30, 2018, 497 new faculty accounts have been created, and 3,035 unique students have answered these mechanics questions in the CW. We continue to analyze instructor interviews, focusing on 11 cases, all of whom participated in the CW Community of Practice (CoP). For six participants, we were able to compare use of the CW both before and after participating in professional development activities (workshops and/or a community of practice). Interview results have been coded and are currently being analyzed. To examine student learning, we recruited faculty to participate in deploying four common questions in both statics and dynamics. In statics, each instructor agreed to deploy the same four questions (one each for Rigid Body Equilibrium, Trusses, Frames, and Friction) among their overall deployments of the CW. In addition to answering the question, students were also asked to provide a written explanation of their reasoning, to rate the confidence of their answers, and to rate the degree to which the questions were clear and promoted deep thinking.
The analysis to date has resulted in a Work-in-Progress paper presented at ASEE 2022, reporting a cross-case comparison of two instructors, and a Work-in-Progress paper to be presented at ASEE 2023 analyzing students' metacognitive reflections on concept questions.
  4. Introductory engineering courses within large universities often have annual enrollments exceeding several hundred students, while online classes have even larger enrollments. It is challenging to achieve differentiated instruction in classes of such size and diversity. In such classes, professors typically assess whether students have mastered a concept through multiple-choice questions or online text-only systems, marking answers as right or wrong; the resulting feedback is mostly binary, with little of the constructive detail needed to scaffold learning. A growing concern among engineering educators is that students are losing both the critical skill of sketching diagrams and the ability to take a real system and reduce it to an accurate but simplified free-body diagram (FBD). A sketch-recognition-based tutoring system, called Mechanix, allows students to hand-draw solutions just as they would with pencil and paper, while also providing iterative, real-time, personalized feedback. Sketch-recognition algorithms use artificial intelligence to identify the shapes, their relationships, and other features of the student's sketched drawing. Other AI algorithms then determine if and why a student's work is incorrect, enabling the tutoring system to return immediate, iterative, personalized feedback that facilitates student learning in ways otherwise not possible in large classes. To observe the effectiveness of this system, it has been implemented in various courses at three universities, with two additional universities planning to use the system within the next year. Student knowledge is measured using concept inventories based in both physics and statics, common exam questions, and assignments turned in for class.
Preliminary results using Mechanix, a sketch-based statics tutoring system built at Texas A&M University, suggest that a sketch-based tutoring system increases homework motivation in struggling students and is as effective as paper-and-pencil-based homework for teaching method-of-joints truss analysis. In focus groups, students reported that the system enhanced their learning and increased engagement. Keywords: sketch recognition; intelligent user interfaces; physics education; engineering education