
Title: Toward Benchmarking Student Progress in Mechanics: Assessing Learning Cycles through Mastery Learning and Concept Questions
In mechanics, the standard 3-credit, 45-hour course is sufficient to deliver conventional lectures with prepared examples and questions. Moreover, it is not only feasible but preferable to employ any of a variety of active learning and teaching techniques. Nevertheless, even when active learning is used strategically, students and instructors alike feel pressure to accomplish their respective learning and teaching goals within the constraints of the academic calendar, raising the question of whether the allocated time is sufficient to enable authentic learning. One way to assess learning progress is to examine the learning cycles through which students attempt, re-think, and re-attempt their work. This article provides data to benchmark the time required to learn key Statics concepts, based on results from instruction of approximately 50 students in a Statics class at a public research university during the Fall 2020 semester. Two parallel techniques are employed to foster and understand student learning cycles.

• Through a Mastery Based Learning model, 15 weekly pass/fail “Mastery Tests” are given. Students who do not pass may re-test with a different but similar test on the same topic each week until the semester’s conclusion. The tests are highly structured in that they are well posed and tightly focused. For example, some tests focus only on drawing Free Body Diagrams, with no equations or calculations; other tests focus on writing equilibrium equations from a given Free Body Diagram. Passing the first six tests is required to earn the grade of D; passing the next three earns a C; the next three, a B; and the final three, an A (a minimal sketch of this grade mapping follows the abstract). Evaluations include coding of student responses to infer student reasoning. Learning cycles occur as students repeat the same topics, and progress is assessed by passing rates and by comparing evolving responses to the same test topics.

• Concept Questions that elicit qualitative responses and written explanations are deployed at least weekly. The learning cycle here consists of students answering a question, seeing the overall class results (but without the correct answer), having a chance to explore the question with other students and the instructor, and finally having an opportunity to re-answer the same question, anywhere from a few minutes to a couple of days later. Sometimes the same question is given a third time to encourage further effort or progress.

To date, results from both cycles appear to agree on one important conclusion: the rate of demonstrated learning is quite low. For example, each Mastery Test has a passing rate of 20%-30%, including for students with several repeats. With the Concept Questions, typically no more than half of the students who answered incorrectly change to the correct answer by the time of the final poll. The final article will provide quantitative and qualitative results from each type of cycle, including tracking of coded responses on Mastery Tests, written responses on Concept Questions, and cross-comparisons thereof. Additional results will be presented from student surveys. Since the Mastery Tests and Concept Questions follow typical Statics topics, this work has the potential to lead to a standardized set of benchmarks and standards for measuring student learning, and its rate, in Statics.
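As an illustration of the grade rule described above, here is a minimal Python sketch, assuming grades depend only on the cumulative count of Mastery Tests passed in sequence (the function name, the F default for fewer than six passes, and the in-order assumption are ours, not the paper's):

    def mastery_grade(tests_passed: int) -> str:
        """Map the number of Mastery Tests passed (0-15) to a letter grade.

        Thresholds follow the abstract: the first 6 tests earn a D,
        the next 3 a C, the next 3 a B, and the final 3 an A.
        Assumes tests are passed in order; fewer than 6 passes -> F (our assumption).
        """
        if tests_passed >= 15:
            return "A"
        if tests_passed >= 12:
            return "B"
        if tests_passed >= 9:
            return "C"
        if tests_passed >= 6:
            return "D"
        return "F"

    # Example: a student who has passed 10 of the 15 weekly tests so far.
    print(mastery_grade(10))  # -> "C"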
Award ID(s):
1821445
NSF-PAR ID:
10286226
Journal Name:
ASEE Virtual Annual Conference
Sponsoring Org:
National Science Foundation
More Like this
  1. Mechanics instructors frequently employ hands-on learning with goals such as demonstrating physical phenomena, aiding visualization, addressing misconceptions, exposing students to “real-world” problems, and promoting an engaging classroom environment. This paper presents results from a study exploring the importance of the “hands-on” aspect of a hands-on modeling curriculum we have been developing that spans several topics in statics. The curriculum integrates deep conceptual exploration with analysis procedure tutorials and aims to scaffold students’ development of representational competence, the ability to use multiple representations of a concept as appropriate for learning, problem solving, and communication. We conducted this study over two successive terms in an online statics course taught in the context of remote learning amidst the COVID-19 pandemic. The intervention section used a take-home adaptation of the original classroom curriculum. This adaptation consisted of eight activity worksheets with a supplied kit of manipulatives and model-building supplies students could use to construct and explore concrete representations of figures and diagrams used in the worksheets. In contrast, the control section used activity worksheets nearly identical to those used in the hands-on curriculum, but without the associated modeling parts kit. We made only minor revisions to the worksheets to remove references to the models. The control and intervention sections were otherwise identical in how they were taught by the same instructor. We compare learning outcomes between the two sections as measured via pre-post administration of a test of 3D vector concepts and representations called the Test of Representational Competence with Vectors (TRCV). We also compare end-of-course scores on the Concept Assessment Test in Statics (CATS) and final exam scores. In addition, we analyze student responses on two “multiple choice plus explain” concept questions paired with each of five activities covering the topics of 3D moments, 3D particle equilibrium, rigid body equilibrium (2D and 3D), and frame analysis (2D). The mean pre/post gain across all ten questions was higher for the intervention section, with the largest differences observed on questions relating to 3D rigid body equilibrium. Students in the intervention section also made larger gains on the TRCV and scored better on the final exam compared to the control section, but these results are not statistically significant, perhaps due to the small study population. There were no appreciable differences in end-of-course CATS scores. We also present student feedback on the activity worksheets, which was slightly more positive for the versions with the models.
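  The “mean pre/post gain” reported above is not defined in the abstract. One common convention in the engineering education literature, offered here only as a plausible reading rather than the authors’ stated method, is the normalized gain: the fraction of the available improvement that students actually realize.

    % Normalized gain, where S_pre and S_post are mean pre- and post-test
    % scores expressed as percentages of the maximum possible score.
    \[
      \langle g \rangle = \frac{S_{\mathrm{post}} - S_{\mathrm{pre}}}{100\% - S_{\mathrm{pre}}}
    \]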
  2. The landscapes of many elementary, middle, and high school math classrooms have undergone major transformations over the last half-century, moving from drill-and-skill work to more conceptual reasoning and hands-on manipulative work. However, if you look at a college-level calculus class, you are likely to find that the main difference is that the professor now holds a whiteboard marker rather than a piece of chalk. Some student work may be done on the computer, but much of it consists of the same type of repetitive skill-building problems. This should seem strange given the advancements in technology that allow more freedom than ever to build connections between different representations of a concept. Several class activities have been developed using a combination of approaches, depending on the topic. Topics covered in the activities include Riemann Sums, Accumulation, Center of Mass, Volumes of Revolution (Discs, Washers, and Shells), and Volumes of Similar Cross-section. All activities use student note outlines that are completed either in a whole-group interactive-lecture approach or in a group-work inquiry-based approach. Some of the activities use interactive graphs designed on desmos.com, and others use physical models that have been designed in OpenSCAD and 3D-printed for students to use in class. Tactile objects were developed because they should give students an advantage by enabling them to physically interact with the concepts being taught, deepening their involvement with the material and providing more stimuli for the brain to encode the learning experience. Web-based activities were developed because the topics involved needed substantial changes in graphical representations (e.g., limits with Riemann Sums). Assessment techniques for each topic include online homework, exams, and online concept questions with an explanation response area. These concept questions are intended to measure students’ ability to use multiple representations to answer the question, and are generally not computational in nature. Students are also given surveys to rate the overall activities, as well as finer-grained survey questions to elicit student thoughts on certain aspects of the models, websites, and activity sheets. We will report on student responses to the activity surveys, looking for common themes in students’ thoughts toward specific attributes of the activities. We will also compare relevant exam question responses and online concept question results, including common themes present or absent in student reasoning.
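  For readers outside calculus instruction, the standard volume-of-revolution formulas behind the Disc, Washer, and Shell activities (textbook results, not taken from the paper itself) are, for a region under y = f(x) ≥ 0 on [a, b]:

    % Disc method, rotating the region about the x-axis:
    \[ V = \pi \int_a^b [f(x)]^2 \, dx \]
    % Washer method, for the region between f(x) >= g(x) >= 0:
    \[ V = \pi \int_a^b \left( [f(x)]^2 - [g(x)]^2 \right) dx \]
    % Shell method, rotating about the y-axis (with 0 <= a < b):
    \[ V = 2\pi \int_a^b x \, f(x) \, dx \]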
  3. In teaching mechanics, we use multiple representations of vectors to develop concepts and analysis techniques. These representations include pictorials, diagrams, symbols, numbers, and narrative language. Through years of study as students, researchers, and teachers, we develop a fluency rooted in a deep conceptual understanding of what each representation communicates. Many novice learners, however, struggle to gain such understanding and rely on superficial mimicry of the problem-solving procedures we demonstrate in examples. The term representational competence refers to the ability to interpret, switch between, and use multiple representations of a concept as appropriate for learning, communication, and analysis. In engineering statics, an understanding of what each vector representation communicates and how to use different representations in problem solving is important to the development of both conceptual and procedural knowledge. Science education literature identifies representational competence as a marker of true conceptual understanding. This paper presents development work for a new assessment instrument designed to measure representational competence with vectors in an engineering mechanics context. We developed the assessment over two successive terms in statics courses at a community college, a medium-sized regional university, and a large state university. We started with twelve multiple-choice questions that survey the vector representations commonly employed in statics. Each question requires the student to interpret and/or use two or more different representations of vectors and requires no calculation beyond single-digit integer arithmetic. Distractor answer choices include common student mistakes and misconceptions drawn from the literature and from our teaching experience. We piloted these twelve questions as a timed section of the first exam in fall 2018 statics courses at both Whatcom Community College (WCC) and Western Washington University. Analysis of students’ unprompted use of vector representations on the open-ended problem-solving section of the same exam provides evidence of the assessment’s validity as a measurement instrument for representational competence. We found a positive correlation between students’ accurate and effective use of representations and their score on the multiple-choice test. We gathered additional validity evidence by reviewing student responses on an exam wrapper reflection. We used item difficulty and item discrimination scores (point-biserial correlation) to eliminate two questions and revised the remaining questions to improve clarity and discriminatory power. We administered the revised version in two contexts: (1) again as part of the first exam in the winter 2019 statics course at WCC, and (2) as an extra-credit opportunity for statics students at Utah State University. This paper includes sample questions from the assessment to illustrate the approach. The full assessment is available to interested instructors and researchers through an online tool.
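  The item discrimination statistic mentioned above, the point-biserial correlation, is the Pearson correlation between a dichotomous item score (correct/incorrect) and the total test score. A minimal sketch using SciPy with hypothetical data (the authors’ actual analysis pipeline is not described in the abstract):

    import numpy as np
    from scipy import stats

    # Hypothetical data: one entry per student.
    item_correct = np.array([1, 0, 1, 1, 0, 1, 0, 1])         # 0/1 score on one question
    total_score = np.array([92, 55, 78, 85, 60, 88, 49, 95])  # overall exam score

    # Higher r_pb means the question better separates stronger students
    # (who tend to answer it correctly) from weaker ones (who tend to miss it).
    r_pb, p_value = stats.pointbiserialr(item_correct, total_score)
    print(f"item discrimination r_pb = {r_pb:.2f} (p = {p_value:.3f})")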