Title: Toward Benchmarking Student Progress in Mechanics: Assessing Learning Cycles through Mastery Learning and Concept Questions
In mechanics, the standard 3-credit, 45-hour course is sufficient to deliver conventional lectures with prepared examples and questions. Moreover, it is not only feasible but preferable to employ any of a variety of active learning and teaching techniques. Nevertheless, even when active learning is strategically used, students and instructors alike experience pressure to accomplish their respective learning and teaching goals under the constraints of the academic calendar, raising questions as to whether the allocated time is sufficient to enable authentic learning. One way to assess learning progress is to examine the learning cycles through which students attempt, re-think, and re-attempt their work. This article provides data to benchmark the time required to learn key Statics concepts, based on instruction of approximately 50 students in a Statics class at a public research university during the Fall 2020 semester. Two parallel techniques are employed to foster and understand student learning cycles.
• Through a Mastery Based Learning model, 15 weekly pass/fail “Mastery Tests” are given. Students who do not pass may re-test with a different but similar test on the same topic each week until the semester’s conclusion. The tests are highly structured in that they are well posed and highly focused: some focus only on drawing Free Body Diagrams, with no equations or calculations, while others focus on writing equilibrium equations from a given Free Body Diagram. Passing the first six tests is required to earn the grade of D; passing the next three earns a C, the next three a B, and the final three an A (see the sketch after this abstract). Evaluations include coding of student responses to infer student reasoning. Learning cycles occur as students repeat the same topics, and their progress is assessed by passing rates and by comparing evolving responses to the same test topics.
• Concept Questions that elicit qualitative responses and written explanations are deployed at least weekly. The learning cycle here consists of students answering a question, seeing the overall class results (but not the correct answer), having a chance to explore the question with other students and the instructor, and finally re-answering the same question, anywhere from a few minutes to a couple of days later. Sometimes the same question is given a third time to encourage further effort or progress.
To date, results from both cycles agree on one important conclusion: the rate of demonstrated learning is quite low. For example, each Mastery Test has a passing rate of 20-30%, even for students with several repeats. With the Concept Questions, typically no more than half of the students who answered incorrectly change to the correct answer by the time of the final poll. The final article will provide quantitative and qualitative results from each type of cycle, including tracking of coded responses on Mastery Tests, written responses on Concept Questions, and cross-comparisons thereof. Additional results will be presented from student surveys. Since the Mastery Tests and Concept Questions follow typical Statics topics, this work has the potential to lead to a standardized set of benchmarks for measuring student learning, and its rate, in Statics.
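The grade bands and the Concept Question re-polling metric described above lend themselves to a compact illustration. The following is a minimal sketch, not the authors' code; the function names and example data are hypothetical, and the "F" band below is an assumption not stated in the abstract.

```python
# Minimal sketch (not the authors' code). mastery_grade encodes the
# 6/3/3/3 grade bands described above; switch_rate computes the fraction
# of initially incorrect students who answer correctly on the final poll.

def mastery_grade(tests_passed: int) -> str:
    """Map the number of Mastery Tests passed (0-15) to a course grade."""
    if tests_passed >= 15:
        return "A"   # all 15: first 6 + 3 + 3 + final 3
    if tests_passed >= 12:
        return "B"
    if tests_passed >= 9:
        return "C"
    if tests_passed >= 6:
        return "D"
    return "F"       # assumption: fewer than 6 passes earns no credit

def switch_rate(first_poll: list[bool], final_poll: list[bool]) -> float:
    """Fraction of students wrong on the first poll who are right on the last."""
    finals = [final for first, final in zip(first_poll, final_poll) if not first]
    return sum(finals) / len(finals) if finals else 0.0

# Example: of 4 initially incorrect students, 2 switch -> rate of 0.5,
# matching the "no more than half" observation reported above.
print(mastery_grade(9))                                     # "C"
print(switch_rate([True, False, False, False, False, True],
                  [True, True, False, True, False, True]))  # 0.5
```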
Award ID(s): 1821445
PAR ID: 10286226
Author(s) / Creator(s):
Date Published:
Journal Name: ASEE Virtual Annual Conference
Format(s): Medium: X
Sponsoring Org: National Science Foundation
More Like this
  1. Several consensus reports cite a critical need to dramatically increase the number and diversity of STEM graduates over the next decade. They conclude that a change to evidence-based instructional practices, such as concept-based active learning, is needed. Concept-based active learning involves the use of activity-based pedagogies whose primary objectives are to make students value deep conceptual understanding (instead of only factual knowledge) and then to facilitate their development of that understanding. Concept-based active learning has been shown to increase academic engagement and student achievement, to significantly improve student retention in academic programs, and to reduce the performance gap of underrepresented students. Fostering students' mastery of fundamental concepts is central to real-world problem solving, including several elements of engineering practice. Unfortunately, simply proving that these instructional practices are more effective than traditional methods for promoting student learning, for increasing retention in academic programs, and for improving ability in professional practice is not enough to ensure widespread pedagogical change. In fact, the biggest challenge to improving STEM education is not the need to develop more effective instructional practices, but to find ways to get faculty to adopt the evidence-based pedagogies that already exist.
In this project we seek to propagate the Concept Warehouse, a technological innovation designed to foster concept-based active learning, into Mechanical Engineering (ME) and to study student learning with this tool in five diverse institutional settings. The Concept Warehouse (CW) is a web-based instructional tool that we developed for Chemical Engineering (ChE) faculty. It houses over 3,500 ConcepTests, which are short questions that can rapidly be deployed to engage students in concept-oriented thinking and/or to assess students' conceptual knowledge, along with more extensive concept-based active learning tools. The CW has grown rapidly during this project and now has over 1,600 faculty accounts and over 37,000 student users. New ConcepTests were created during the current reporting period; the current numbers of questions for Statics, Dynamics, and Mechanics of Materials are 342, 410, and 41, respectively. A detailed review process is in progress, and will continue through the no-cost extension year, to refine question clarity and to identify types of new questions to fill gaps in content coverage. There have been 497 new faculty accounts created after June 30, 2018, and 3,035 unique students have answered these mechanics questions in the CW.
We continue to analyze instructor interviews, focusing on 11 cases, all of whom participated in the CW Community of Practice (CoP). For six participants, we were able to compare use of the CW both before and after participation in professional development activities (workshops and/or a community of practice). Interview results have been coded and are currently being analyzed. To examine student learning, we recruited faculty to deploy four common questions in both statics and dynamics. In statics, each instructor agreed to deploy the same four questions (one each for Rigid Body Equilibrium, Trusses, Frames, and Friction) among their overall deployments of the CW. In addition to answering each question, students were asked to provide a written explanation of their reasoning, to rate the confidence of their answers, and to rate the degree to which the questions were clear and promoted deep thinking. The analysis to date has resulted in a Work-In-Progress paper presented at ASEE 2022 reporting a cross-case comparison of two instructors, and a Work-In-Progress paper to be presented at ASEE 2023 analyzing students' metacognitive reflections on concept questions.
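One analysis such confidence ratings enable is a check of metacognitive calibration. The sketch below is a hedged illustration only; the data and column names are hypothetical, not drawn from the Concept Warehouse.

```python
# Hypothetical sketch: relate students' self-rated confidence to answer
# correctness for one concept question. Data are illustrative placeholders.
import pandas as pd

responses = pd.DataFrame({
    "correct":    [True, False, True, True, False, False, True, False],
    "confidence": [5, 4, 3, 5, 2, 4, 4, 1],  # e.g., 1 (guessing) to 5 (certain)
})

# Mean correctness at each confidence level; a flat profile would suggest
# poorly calibrated metacognition, a rising one good calibration.
print(responses.groupby("confidence")["correct"].mean())
```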
  2. Mechanics instructors frequently employ hands-on learning with goals such as demonstrating physical phenomena, aiding visualization, addressing misconceptions, exposing students to “real-world” problems, and promoting an engaging classroom environment. This paper presents results from a study exploring the importance of the “hands-on” aspect of a hands-on modeling curriculum we have been developing that spans several topics in statics. The curriculum integrates deep conceptual exploration with analysis procedure tutorials and aims to scaffold students’ development of representational competence, the ability to use multiple representations of a concept as appropriate for learning, problem solving, and communication. We conducted this study over two successive terms in an online statics course taught in the context of remote learning amidst the COVID-19 pandemic. The intervention section used a take-home adaptation of the original classroom curriculum, consisting of eight activity worksheets with a supplied kit of manipulatives and model-building supplies that students could use to construct and explore concrete representations of the figures and diagrams used in the worksheets. In contrast, the control section used activity worksheets nearly identical to those of the hands-on curriculum, but without the associated modeling parts kit; we made only minor revisions to the worksheets to remove references to the models. The control and intervention sections were otherwise identical in how they were taught by the same instructor. We compare learning outcomes between the two sections as measured via pre-post administration of a test of 3D vector concepts and representations called the Test of Representational Competence with Vectors (TRCV). We also compare end-of-course scores on the Concept Assessment Test in Statics (CATS) and final exam scores. In addition, we analyze student responses on two “multiple choice plus explain” concept questions paired with each of five activities covering 3D moments, 3D particle equilibrium, rigid body equilibrium (2D and 3D), and frame analysis (2D). The mean pre/post gain across all ten questions was higher for the intervention section, with the largest differences observed on questions relating to 3D rigid body equilibrium. Students in the intervention section also made larger gains on the TRCV and scored better on the final exam than the control section, but these results are not statistically significant, perhaps due to the small study population. There were no appreciable differences in end-of-course CATS scores. Student feedback on the activity worksheets was slightly more positive for the versions with the models.
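The section-level pre/post comparison described above can be sketched in a few lines. The following is a minimal illustration, not the study's analysis code, using raw gain scores and a Welch two-sample t-test on made-up placeholder data.

```python
# Minimal sketch (placeholder data, not the study's code): compare
# pre/post gains between an intervention and a control section.
import numpy as np
from scipy import stats

def gains(pre, post):
    """Raw gain per student: post-test score minus pre-test score."""
    return np.asarray(post) - np.asarray(pre)

intervention = gains([4, 5, 6, 3, 5], [7, 8, 7, 6, 8])
control      = gains([4, 6, 5, 4, 5], [5, 7, 6, 5, 6])

# Welch's t-test does not assume equal variances; with small sections,
# a non-significant p-value is plausible even when a real effect exists.
t, p = stats.ttest_ind(intervention, control, equal_var=False)
print(f"mean gains: {intervention.mean():.2f} vs {control.mean():.2f}, "
      f"t = {t:.2f}, p = {p:.3f}")
```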
  3. We have investigated the temporal patterns of algebra (N = 606) and calculus (N = 507) introductory physics students practicing multiple basic physics topics several times throughout the semester using an online mastery homework application called science, technology, engineering, and mathematics (STEM) fluency, aimed at improving basic physics skills. For all skill practice categories, we observed improvements in measures of student accuracy, such as a decrease in the number of questions attempted to reach mastery, as well as a decrease in response time per question, resulting in an overall decrease in the total time spent on the assignments. The findings in this study show that several factors impact a student's performance and evolution on the mastery assignments throughout the semester. For example, using linear mixed modeling, we report that students with lower math preparation for the physics class start with lower accuracy and slower response times on the mastery assignments than students with higher math preparation; however, by the end of the semester, the less prepared students reach performance levels similar to those of their more prepared classmates. This suggests that STEM fluency is a useful tool for instructors to implement to refresh students' basic math skills. Additionally, gender and procrastination habits affect the effectiveness and progression of students' response time and accuracy on the STEM fluency assignments throughout the semester. We find that women initially answer more questions in the same amount of time as men before reaching mastery; as the semester progresses and students practice the categories more, this performance gap diminishes. In addition, we find that students who procrastinate (those who wait until the final few hours to complete the assignments) spend more time on the assignments despite answering a similar number of questions compared to students who do not procrastinate. We also find that student mindset (growth vs. fixed) was not related to a student's progress on the online mastery assignments. Finally, we find that STEM fluency practice improves performance beyond the effects of other components of instruction, such as lectures, group-work recitations, and homework assignments.
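The linear mixed modeling mentioned above can be sketched with statsmodels. The code below is illustrative only: the data are synthetic and the variable names (response_time, math_prep, week, student_id) are hypothetical, not the paper's actual model.

```python
# Illustrative sketch (synthetic data; variable names are hypothetical).
# A random-intercept-per-student mixed model with fixed effects for math
# preparation, week of semester, and their interaction, echoing the kind
# of question the abstract describes (does the preparation gap close?).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_students, n_weeks = 30, 10
df = pd.DataFrame({
    "student_id": np.repeat(np.arange(n_students), n_weeks),
    "week": np.tile(np.arange(n_weeks), n_students),
    "math_prep": np.repeat(rng.integers(0, 2, n_students), n_weeks),
})
# Synthetic response times: less prepared students (math_prep = 0) start
# slower but converge toward their classmates by semester's end.
df["response_time"] = (
    60 - 2 * df["week"]
    + 10 * (1 - df["math_prep"]) * (1 - df["week"] / (n_weeks - 1))
    + rng.normal(0, 5, len(df))
)

model = smf.mixedlm("response_time ~ math_prep * week", df,
                    groups=df["student_id"])
print(model.fit().summary())
```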
  4. In teaching mechanics, we use multiple representations of vectors to develop concepts and analysis techniques. These representations include pictorials, diagrams, symbols, numbers, and narrative language. Through years of study as students, researchers, and teachers, we develop a fluency rooted in a deep conceptual understanding of what each representation communicates. Many novice learners, however, struggle to gain such understanding and rely on superficial mimicry of the problem-solving procedures we demonstrate in examples. The term representational competence refers to the ability to interpret, switch between, and use multiple representations of a concept as appropriate for learning, communication, and analysis. In engineering statics, an understanding of what each vector representation communicates, and of how to use different representations in problem solving, is important to the development of both conceptual and procedural knowledge. The science education literature identifies representational competence as a marker of true conceptual understanding. This paper presents development work for a new assessment instrument designed to measure representational competence with vectors in an engineering mechanics context. We developed the assessment over two successive terms in statics courses at a community college, a medium-sized regional university, and a large state university. We started with twelve multiple-choice questions that survey the vector representations commonly employed in statics. Each question requires the student to interpret and/or use two or more different representations of vectors and requires no calculation beyond single-digit integer arithmetic. Distractor answer choices include common student mistakes and misconceptions drawn from the literature and from our teaching experience. We piloted these twelve questions as a timed section of the first exam in fall 2018 statics courses at both Whatcom Community College (WCC) and Western Washington University. Analysis of students' unprompted use of vector representations on the open-ended problem-solving section of the same exam provides evidence of the assessment's validity as a measurement instrument for representational competence: we found a positive correlation between students' accurate and effective use of representations and their scores on the multiple-choice test. We gathered additional validity evidence by reviewing student responses on an exam wrapper reflection. We used item difficulty and item discrimination scores (point-biserial correlation) to eliminate two questions, and we revised the remaining questions to improve clarity and discriminatory power (a sketch of these item statistics follows this abstract). We administered the revised version in two contexts: (1) again as part of the first exam in the winter 2019 statics course at WCC, and (2) as an extra-credit opportunity for statics students at Utah State University. This paper includes sample questions from the assessment to illustrate the approach. The full assessment is available to interested instructors and researchers through an online tool.
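The item statistics named above are standard psychometric quantities: difficulty is the proportion of students answering an item correctly, and discrimination here is the point-biserial correlation between the item score and the total score. The following is a minimal sketch with placeholder data, not the authors' analysis code.

```python
# Minimal sketch (placeholder data): item difficulty and point-biserial
# discrimination for a bank of dichotomously scored questions.
import numpy as np
from scipy.stats import pointbiserialr

# Rows are students, columns are the 12 pilot items (1 = correct).
scores = np.random.default_rng(1).integers(0, 2, size=(40, 12))
totals = scores.sum(axis=1)

for item in range(scores.shape[1]):
    difficulty = scores[:, item].mean()
    # Note: a corrected item-total version would exclude the item itself
    # from the total before correlating.
    r_pb, _ = pointbiserialr(scores[:, item], totals)
    print(f"item {item + 1:2d}: difficulty = {difficulty:.2f}, "
          f"discrimination = {r_pb:.2f}")
```

Items with very high or very low difficulty, or with low discrimination, are the usual candidates for elimination or revision, consistent with the two questions dropped in the study above.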