Several consensus reports cite a critical need to dramatically increase the number and diversity of STEM graduates over the next decade. They conclude that a shift to evidence-based instructional practices, such as concept-based active learning, is needed. Concept-based active learning uses activity-based pedagogies whose primary objectives are to lead students to value deep conceptual understanding (instead of factual knowledge alone) and then to facilitate their development of that understanding. Concept-based active learning has been shown to increase academic engagement and student achievement, to significantly improve student retention in academic programs, and to reduce the performance gap for underrepresented students. Fostering students' mastery of fundamental concepts is central to real-world problem solving, including several elements of engineering practice.

Unfortunately, simply demonstrating that these instructional practices are more effective than traditional methods for promoting student learning, for increasing retention in academic programs, and for improving ability in professional practice is not enough to ensure widespread pedagogical change. In fact, the biggest challenge to improving STEM education is not developing more effective instructional practices, but finding ways to get faculty to adopt the evidence-based pedagogies that already exist. In this project we seek to propagate the Concept Warehouse (CW), a technological innovation designed to foster concept-based active learning, into Mechanical Engineering (ME) and to study student learning with this tool in five diverse institutional settings. The CW is a web-based instructional tool that we originally developed for Chemical Engineering (ChE) faculty. It houses over 3,500 ConcepTests, which are short questions that can rapidly be deployed to engage students in concept-oriented thinking and/or to assess students' conceptual knowledge, along with more extensive concept-based active learning tools. The CW has grown rapidly during this project and now has over 1,600 faculty accounts and over 37,000 student users.

New ConcepTests were created during the current reporting period; the current numbers of questions for Statics, Dynamics, and Mechanics of Materials are 342, 410, and 41, respectively. A detailed review process is in progress, and will continue through the no-cost extension year, to refine question clarity and to identify the types of new questions needed to fill gaps in content coverage. Since June 30, 2018, 497 new faculty accounts have been created, and 3,035 unique students have answered these mechanics questions in the CW. We continue to analyze instructor interviews, focusing on 11 cases, all of whom participated in the CW Community of Practice (CoP). For six participants, we were able to compare use of the CW both before and after participation in professional development activities (workshops and/or a community of practice). Interview results have been coded and are currently being analyzed.

To examine student learning, we recruited faculty to deploy four common questions in both statics and dynamics. In statics, each instructor agreed to deploy the same four questions (one each for Rigid Body Equilibrium, Trusses, Frames, and Friction) among their overall deployments of the CW. In addition to answering each question, students were asked to provide a written explanation of their reasoning, to rate their confidence in their answers, and to rate the degree to which the questions were clear and promoted deep thinking. The analysis to date has resulted in a Work-in-Progress paper presented at ASEE 2022, reporting a cross-case comparison of two instructors, and a Work-in-Progress paper to be presented at ASEE 2023, analyzing students' metacognitive reflections on concept questions.
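To make the common-question deployment concrete, here is a minimal sketch, assuming a pandas-style table of responses with hypothetical column names (instructor, question, correct, confidence), of the kind of cross-instructor summary this data supports:

```python
# Minimal sketch (not the project's actual pipeline): summarize responses to the
# four common statics ConcepTests by instructor. All column names are hypothetical.
import pandas as pd

# Each row: one student's answer to one of the common questions.
responses = pd.DataFrame({
    "instructor": ["A", "A", "B", "B"],
    "question":   ["equilibrium", "trusses", "equilibrium", "friction"],
    "correct":    [True, False, True, True],
    "confidence": [4, 2, 5, 3],  # assumed 1-5 self-rated confidence scale
})

summary = (responses
           .groupby(["instructor", "question"])
           .agg(pct_correct=("correct", "mean"),
                mean_confidence=("confidence", "mean")))
print(summary)
```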
Toward Benchmarking Student Progress in Mechanics: Assessing Learning Cycles through Mastery Learning and Concept Questions
In mechanics, the standard 3-credit, 45-hour course is sufficient to deliver conventional lectures with prepared examples and questions. Moreover, it is not only feasible, but preferable, to employ any of a variety of active learning and teaching techniques. Nevertheless, even when active learning is strategically used, students and instructors alike experience pressure to accomplish their respective learning and teaching goals under the constraints of the academic calendar, raising questions as to whether the allocated time is sufficient to enable authentic learning. One way to assess learning progress is to examine the learning cycles through which students attempt, re-think, and re-attempt their work. This article provides data to benchmark the time required to learn key Statics concepts, based on results from instruction of approximately 50 students in a Statics class at a public research university during the Fall 2020 semester. Two parallel techniques are employed to foster and understand student learning cycles.
• Through a Mastery Based Learning model, 15 weekly pass/fail “Mastery Tests” are given. Students who do not pass may re-test with a different but similar test on the same topic each week until the semester's conclusion. The tests are highly structured: each is well posed and highly focused. For example, some tests focus only on drawing Free Body Diagrams, with no equations or calculations; others focus on writing equilibrium equations from a given Free Body Diagram. Passing the first six tests is required to earn the grade of D; passing the next three earns a C; the next three, a B; and the final three, an A (this cumulative scheme is sketched in code after this list). Evaluations include coding of student responses to infer student reasoning. Learning cycles occur as students repeat the same topics, and their progress is assessed by passing rates and by comparing evolving responses to the same test topics.
• Concept Questions that elicit qualitative responses and written explanations are deployed at least weekly. The learning cycle here consists of students answering a question, seeing the overall class results (but not the correct answer), having a chance to explore the question with other students and the instructor, and finally having an opportunity to re-answer the same question, anywhere from a few minutes to a couple of days later. Sometimes the same question is given a third time to encourage further effort or progress.
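A minimal sketch of the cumulative grade-band scheme described in the first bullet, under our reading that letter grades require passing the first 6, 9, 12, or all 15 of the ordered Mastery Tests:

```python
# Minimal sketch of the grade-band scheme (our reading of the description above):
# letter grades require passing cumulative blocks of the 15 ordered Mastery Tests.
def mastery_grade(tests_passed: set) -> str:
    """Map the set of passed test numbers (1-15) to a letter grade."""
    bands = [(15, "A"), (12, "B"), (9, "C"), (6, "D")]
    for cutoff, grade in bands:
        if all(n in tests_passed for n in range(1, cutoff + 1)):
            return grade
    return "F"

print(mastery_grade(set(range(1, 10))))  # passed tests 1-9 -> "C"
```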
To date, results from both cycles appear to agree on one important conclusion: the rate of demonstrated learning is quite low. For example, each Mastery Test has a passing rate of 20%-30%, even for students with several repeats. With the Concept Questions, typically no more than half of the students who answered incorrectly change to the correct answer by the time of the final poll. The final article will provide quantitative and qualitative results from each type of cycle, including tracking of coded responses on Mastery Tests, written responses on Concept Questions, and cross-comparisons between the two. Additional results will be presented from student surveys. Since the Mastery Tests and Concept Questions follow typical Statics topics, this work has the potential to lead to a standardized set of benchmarks for measuring student learning, and its rate, in Statics.
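As an illustration of how the Concept Question cycle can be quantified, here is a minimal sketch, with a hypothetical data layout, of the first-to-final-poll transition rate cited above:

```python
# Minimal sketch (hypothetical data layout): for a Concept Question, compute the
# fraction of students who answered incorrectly on the first poll but correctly
# on the final poll -- the "demonstrated learning" rate discussed above.
import pandas as pd

polls = pd.DataFrame({
    "student": [1, 1, 2, 2, 3, 3],
    "poll":    ["first", "final"] * 3,
    "correct": [False, True, False, False, True, True],
})

wide = polls.pivot(index="student", columns="poll", values="correct")
initially_wrong = wide[~wide["first"]]          # students wrong on first poll
rate = initially_wrong["final"].mean()          # fraction now correct
print(f"{rate:.0%} of initially incorrect students corrected their answer")
```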
- Award ID(s): 1821445
- PAR ID: 10286226
- Journal Name: ASEE Virtual Annual Conference
- Sponsoring Org: National Science Foundation
More Like this
-
Mechanics instructors frequently employ hands-on learning with goals such as demonstrating physical phenomena, aiding visualization, addressing misconceptions, exposing students to “real-world” problems, and promoting an engaging classroom environment. This paper presents results from a study exploring the importance of the “hands-on” aspect of a hands-on modeling curriculum we have been developing that spans several topics in statics. The curriculum integrates deep conceptual exploration with analysis procedure tutorials and aims to scaffold students' development of representational competence, the ability to use multiple representations of a concept as appropriate for learning, problem solving, and communication. We conducted this study over two successive terms in an online statics course taught in the context of remote learning amidst the COVID-19 pandemic. The intervention section used a take-home adaptation of the original classroom curriculum. This adaptation consisted of eight activity worksheets with a supplied kit of manipulatives and model-building supplies students could use to construct and explore concrete representations of figures and diagrams used in the worksheets. In contrast, the control section used activity worksheets nearly identical to those used in the hands-on curriculum, but without the associated modeling parts kit. We made only minor revisions to the worksheets to remove references to the models. The control and intervention sections were otherwise identical in how they were taught by the same instructor. We compare learning outcomes between the two sections as measured via pre-post administration of a test of 3D vector concepts and representations called the Test of Representational Competence with Vectors (TRCV). We also compare end-of-course scores on the Concept Assessment Test in Statics (CATS) and final exam scores. In addition, we analyze student responses on two “multiple choice plus explain” concept questions paired with each of five activities covering the topics of 3D moments, 3D particle equilibrium, rigid body equilibrium (2D and 3D), and frame analysis (2D). The mean pre/post gain across all ten questions was higher for the intervention section, with the largest differences observed on questions relating to 3D rigid body equilibrium. Students in the intervention section also made larger gains on the TRCV and scored better on the final exam than the control section, but these results are not statistically significant, perhaps due to the small study population. There were no appreciable differences in end-of-course CATS scores. We also present student feedback on the activity worksheets that was slightly more positive for the versions with the models.
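For readers who want to reproduce this kind of section comparison, a minimal sketch follows, using synthetic gain scores and Welch's two-sample t-test from scipy; the small-population caveat noted above applies equally here:

```python
# Minimal sketch (synthetic data): compare pre/post gains between control and
# intervention sections with Welch's t-test, as one might for the TRCV.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
gain_control      = rng.normal(0.8, 1.0, size=25)   # post - pre, synthetic
gain_intervention = rng.normal(1.4, 1.0, size=25)

t, p = ttest_ind(gain_intervention, gain_control, equal_var=False)  # Welch
print(f"t = {t:.2f}, p = {p:.3f}")
```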
-
We have investigated the temporal patterns of algebra (N = 606) and calculus (N = 507) introductory physics students practicing multiple basic physics topics several times throughout the semester using an online mastery homework application called science, technology, engineering, and mathematics (STEM) fluency, aimed at improving basic physics skills. For all skill practice categories, we observed an increase in measures of student accuracy, such as a decrease in the number of questions attempted to reach mastery, and a decrease in response time per question, resulting in an overall decrease in the total time spent on the assignments. The findings in this study show that there are several factors that impact a student's performance and evolution on the mastery assignments throughout the semester. For example, using linear mixed modeling, we report that students with lower math preparation for the physics class start with lower accuracy and slower response times on the mastery assignments than students with higher math preparation. However, by the end of the semester, the less prepared students reach similar performance levels to their more prepared classmates on the mastery assignments. This suggests that STEM fluency is a useful tool for instructors to implement to refresh students' basic math skills. Additionally, gender and procrastination habits impact the effectiveness and progression of students' response time and accuracy on the STEM fluency assignments throughout the semester. We find that women initially answer more questions in the same amount of time as men before reaching mastery. As the semester progresses and students practice the categories more, this performance gap between men and women diminishes. In addition, we find that students who procrastinate (those who wait until the final few hours to complete the assignments) spend more time on the assignments despite answering a similar number of questions compared to students who do not procrastinate. We also find that student mindset (growth vs fixed mindset) was not related to a student's progress on the online mastery assignments. Finally, we find that STEM fluency practice improves performance beyond the effects of other components of instruction, such as lectures, group-work recitations, and homework assignments.
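A minimal sketch of a linear mixed model in the spirit of the analysis above, assuming hypothetical column names and a random intercept per student; statsmodels provides mixedlm for models of this form:

```python
# Minimal sketch (hypothetical column names and file): mixed model of mastery
# accuracy over the semester with a random intercept per student.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: student_id, accuracy, week, math_prep (e.g., placement score)
df = pd.read_csv("mastery_assignments.csv")  # hypothetical file

model = smf.mixedlm("accuracy ~ week * math_prep",
                    data=df,
                    groups=df["student_id"])  # random intercept per student
result = model.fit()
print(result.summary())
```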
-
This work-in-progress paper describes a collaborative effort between engineering education and machine learning researchers to automate analysis of written responses to conceptually challenging questions in mechanics. These qualitative questions are often used in large STEM classes to support active learning pedagogies; they require minimal calculation and focus on the application of underlying physical phenomena to various situations. Active learning pedagogies using this type of question have been demonstrated to increase student achievement (Freeman et al., 2014; Hake, 1998) and engagement (Deslauriers, et al., 2011) of all students (Haak et al., 2011). To emphasize reasoning and sense-making, we use the Concept Warehouse (Koretsky et al., 2014), an audience response system where students provide written justifications to concept questions. Written justifications better prepare students for discussions with peers and in the whole class and can also improve students' answer choices (Koretsky et al., 2016a, 2016b). In addition to their use as a tool to foster learning, written explanations can also provide valuable information to concurrently assess that learning (Koretsky and Magana, 2019). However, in practice, there has been limited deployment of written justifications with concept questions, in part because they provide a daunting amount of information for instructors to process and for researchers to analyze. In this study, we describe the initial evaluation of large pre-trained generative sequence-to-sequence language models (Raffel et al., 2019; Brown et al., 2020) to automate the laborious coding process of student written responses. Adaptation of machine learning algorithms in this context is challenging, since each question targets specific concepts that elicit their own unique reasoning processes. This exploratory project seeks to utilize responses collected through the Concept Warehouse to identify viable strategies for adapting machine learning to support instructors and researchers in identifying salient aspects of student thinking and understanding with these conceptually challenging questions.
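As a rough illustration, not the project's actual system, the following sketch sends a student explanation to a pre-trained T5 model via Hugging Face transformers; the prompt format and label set are assumptions, and in practice a model fine-tuned on coded responses would be needed for reliable labels:

```python
# Minimal sketch (illustrative only): label a student's written explanation with
# a pre-trained seq-to-seq model. Prompt wording and labels are assumptions.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

explanation = ("The block stays put because friction balances the applied "
               "force until it exceeds the maximum static friction.")
prompt = ("Label the reasoning in this statics explanation as "
          "'correct concept', 'partial concept', or 'misconception': "
          + explanation)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```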