As the use of computers in education increases, adaptive learning platforms are becoming more common. However, these adaptive systems are typically designed to support acquisition of declarative knowledge and/or procedural fluency but rarely address conceptual learning. In this work, we developed the Crystallography Adaptive Learning Module (CALM) for materials science to provide students a tool for individualized conceptual learning. We used a randomized quasi-experimental design comparing two instructional designs with different levels of computer-provided direction and student agency. Undergraduate students were randomly assigned to one of two instructional designs: one had students complete an individualized, adaptive path using the CALM (N = 80), and the other gave students the freedom to explore CALM's learning resources but with limited guidance (N = 85). Within these two designs, we also investigated students in different cumulative grade point average (GPA) groups. While there was no statistically significant difference in the measure of conceptual understanding between instructional designs or among groups with the same GPA, there is evidence to suggest that the CALM improves conceptual understanding of students in the middle GPA group. Students using CALM also showed increased participation with the interactive learning videos compared to the other design. The number of videos watched in each instructional condition aligned with overall academic performance: the low GPA group received the most assigned supplements but watched the fewest videos by choice. This study provides insight for technology developers on how to build educational adaptive technology systems that provide an appropriate level of student agency to promote conceptual understanding in challenging STEM topics.
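As a rough illustration of the adaptivity described above, the sketch below shows one plausible rule for routing a student to a next question based on recent performance and question difficulty level. The data structures, thresholds, and question pool are hypothetical assumptions for illustration and are not CALM's actual implementation.

    # Hypothetical sketch of difficulty-based question routing; thresholds,
    # fields, and the question pool are assumptions, not CALM's real logic.
    from dataclasses import dataclass

    @dataclass
    class Question:
        qid: str
        difficulty: int  # difficulty level (DL), e.g. 1 (easy) to 5 (hard)

    def next_question(pool: list[Question], recent_scores: list[float]) -> Question:
        """Step difficulty up after strong recent performance, down after weak."""
        avg = sum(recent_scores) / len(recent_scores) if recent_scores else 0.5
        target = 5 if avg >= 0.8 else 3 if avg >= 0.5 else 1
        # Pick the pooled question whose DL is closest to the target level
        return min(pool, key=lambda q: abs(q.difficulty - target))

    pool = [Question("q1", 1), Question("q2", 3), Question("q3", 5)]
    print(next_question(pool, recent_scores=[1.0, 0.0, 1.0]).qid)  # -> q2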
-
Many engineering problems assigned in undergraduate classes are numerical and can be solved using equations and algorithms; for example, truss problems in statics are often solved using the method of joints or the method of sections. Concept questions, which can be administered in class using active learning pedagogies, aid in the development of conceptual understanding as opposed to the procedural skill often emphasized in numerical problems. We administered a concept question about a truss to 241 statics students at six diverse institutions and found no statistically significant differences in answer correctness or confidence between institutions. Across institutions, students report that they are not accustomed to such non-numerical concept questions, but they grapple in different ways with the experience. Some frame engineering as inherently numerical, and thus do not value the conceptual understanding assessed by the question, while others recognize that developing conceptual knowledge is useful and will translate to their future engineering work.
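For contrast with the conceptual question studied here, the sketch below illustrates the numerical routine mentioned above: the method of joints reduces each pin of a planar truss to two equilibrium equations that can be solved for the unknown member forces. The member angles and applied load are made-up values for illustration only.

    # Minimal method-of-joints sketch for a single pin joint with two unknown
    # member forces; geometry and loading are hypothetical.
    import numpy as np

    theta1, theta2 = np.radians(0.0), np.radians(60.0)  # member directions at the joint
    P = 10.0  # downward applied load at the joint, kN (assumed)

    # Equilibrium at the joint (tension positive):
    #   sum Fx = 0:  F1*cos(theta1) + F2*cos(theta2) = 0
    #   sum Fy = 0:  F1*sin(theta1) + F2*sin(theta2) = P
    A = np.array([[np.cos(theta1), np.cos(theta2)],
                  [np.sin(theta1), np.sin(theta2)]])
    b = np.array([0.0, P])

    F1, F2 = np.linalg.solve(A, b)
    print(f"F1 = {F1:.2f} kN, F2 = {F2:.2f} kN")  # negative values indicate compression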
-
This work-in-progress paper expands on a collaboration between engineering education researchers and machine learning researchers to automate the analysis of written responses to conceptually challenging questions in statics and dynamics courses (Authors, 2022). Using the Concept Warehouse (Koretsky et al., 2014), written justifications of ConcepTests (CTs) were gathered from statics and dynamics courses in a diverse set of two- and four-year institutions. Written justifications for CTs have been used to support active learning pedagogies, which makes them important sites for investigating how students put together their problem-solving narratives of understanding. However, despite the large benefit that analysis of student written responses may provide to instructors and researchers, manual review of responses is cumbersome, limits analysis, and can be prone to human bias. To improve the analysis of student written responses, machine learning has been used in various educational contexts to analyze short and long texts (Burstein et al., 2020; Burstein et al., 2021). Natural Language Processing (NLP) uses transformer-based machine learning models (Brown et al., 2020; Raffel et al., 2019), which can be applied through fine-tuning or in-context learning methods. NLP can be used to train algorithms that automate the coding of written responses. Only a few studies in educational applications have leveraged transformer-based machine learning models, prompting an investigation into their use in STEM education. However, work in NLP has been criticized for heightening the possibility of perpetuating and even amplifying harmful stereotypes and implicit biases (Chang et al., 2019; Mayfield et al., 2019). In this study, we detail our aim to use NLP for linguistic justice. Using methods such as text summarization, topic modeling, and text classification, we identify key aspects of student narratives of understanding in written responses to mechanics and statics CTs. Through this process, we seek to use machine learning to identify the different ways students talk about a problem and their understanding at any point in their narrative formation process. We thus hope to help reduce human bias in the classroom and in technology by giving instructors and researchers a diverse set of narratives that include insight into their students' histories, identities, and understanding. These can then be used to connect technological knowledge to students' everyday lives.
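One of the methods named above, topic modeling, can be sketched in a few lines; the sample responses, preprocessing, and number of topics below are assumptions for demonstration and do not reflect the study's data or pipeline.

    # Illustrative topic-modeling sketch over hypothetical student justifications.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    responses = [
        "The member is in tension because the joint pulls outward along it.",
        "I summed moments about the support to find the reaction force.",
        "Friction keeps the block from sliding, so the system stays static.",
    ]

    # Bag-of-words representation of the written responses
    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(responses)

    # Fit a small LDA model; the number of topics is an assumption
    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    lda.fit(X)

    # Print the top words associated with each inferred topic
    terms = vectorizer.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[::-1][:5]]
        print(f"Topic {k}: {', '.join(top)}")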
-
Several consensus reports cite a critical need to dramatically increase the number and diversity of STEM graduates over the next decade. They conclude that a change to evidence-based instructional practices, such as concept-based active learning, is needed. Concept-based active learning involves the use of activity-based pedagogies whose primary objectives are to make students value deep conceptual understanding (instead of only factual knowledge) and then to facilitate their development of that understanding. Concept-based active learning has been shown to increase academic engagement and student achievement, to significantly improve student retention in academic programs, and to reduce the performance gap of underrepresented students. Fostering students' mastery of fundamental concepts is central to real world problem solving, including several elements of engineering practice. Unfortunately, simply proving that these instructional practices are more effective than traditional methods for promoting student learning, for increasing retention in academic programs, and for improving ability in professional practice is not enough to ensure widespread pedagogical change. In fact, the biggest challenge to improving STEM education is not the need to develop more effective instructional practices, but to find ways to get faculty to adopt the evidence-based pedagogies that already exist. In this project we seek to propagate the Concept Warehouse, a technological innovation designed to foster concept-based active learning, into Mechanical Engineering (ME) and to study student learning with this tool in five diverse institutional settings. The Concept Warehouse (CW) is a web-based instructional tool that we developed for Chemical Engineering (ChE) faculty. It houses over 3,500 ConcepTests, which are short questions that can rapidly be deployed to engage students in concept-oriented thinking and/or to assess students' conceptual knowledge, along with more extensive concept-based active learning tools. The CW has grown rapidly during this project and now has over 1,600 faculty accounts and over 37,000 student users. New ConcepTests were created during the current reporting period; the current numbers of questions for Statics, Dynamics, and Mechanics of Materials are 342, 410, and 41, respectively. A detailed review process is in progress, and will continue through the no-cost extension year, to refine question clarity and to identify types of new questions to fill gaps in content coverage. There have been 497 new faculty accounts created after June 30, 2018, and 3,035 unique students have answered these mechanics questions in the CW. We continue to analyze instructor interviews, focusing on 11 cases, all of whom participated in the CW Community of Practice (CoP). For six participants, we were able to compare use of the CW both before and after participating in professional development activities (workshops and/or a community of practice). Interview results have been coded and are currently being analyzed. To examine student learning, we recruited faculty to participate in deploying four common questions in both statics and dynamics. In statics, each instructor agreed to deploy the same four questions (one each for Rigid Body Equilibrium, Trusses, Frames, and Friction) among their overall deployments of the CW.
In addition to answering the question, students were also asked to provide a written explanation of their reasoning, to rate the confidence of their answers, and to rate the degree to which the questions were clear and promoted deep thinking. The analysis to date has resulted in a Work-In-Progress paper presented at ASEE 2022, reporting a cross-case comparison of two instructors, and a Work-In-Progress paper to be presented at ASEE 2023 analyzing students' metacognitive reflections on concept questions.
-
This work-in-progress paper describes a collaborative effort between engineering education and machine learning researchers to automate analysis of written responses to conceptually challenging questions in mechanics. These qualitative questions are often used in large STEM classes to support active learning pedagogies; they require minimal calculation and focus on the application of underlying physical phenomena to various situations. Active learning pedagogies using this type of question have been demonstrated to increase student achievement (Freeman et al., 2014; Hake, 1998) and engagement (Deslauriers et al., 2011) of all students (Haak et al., 2011). To emphasize reasoning and sense-making, we use the Concept Warehouse (Koretsky et al., 2014), an audience response system where students provide written justifications to concept questions. Written justifications better prepare students for discussions with peers and in the whole class and can also improve students' answer choices (Koretsky et al., 2016a, 2016b). In addition to their use as a tool to foster learning, written explanations can also provide valuable information to concurrently assess that learning (Koretsky and Magana, 2019). In practice, however, there has been limited deployment of written justifications with concept questions, in part because they provide a daunting amount of information for instructors to process and for researchers to analyze. In this study, we describe an initial evaluation of large pre-trained generative sequence-to-sequence language models (Raffel et al., 2019; Brown et al., 2020) to automate the laborious coding process of student written responses. Adapting machine learning algorithms in this context is challenging since each question targets specific concepts that elicit their own unique reasoning processes. This exploratory project seeks to utilize responses collected through the Concept Warehouse to identify viable strategies for adapting machine learning to support instructors and researchers in identifying salient aspects of student thinking and understanding with these conceptually challenging questions.
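As a rough sketch of how a pre-trained sequence-to-sequence model could be pointed at this coding task, the example below prompts a small off-the-shelf T5 checkpoint to label one hypothetical justification. The prompt, label set, and model choice are assumptions; in practice, fine-tuning or in-context learning with a larger model would be needed to produce usable codes.

    # Hypothetical sketch: prompting a small seq2seq model to code a response.
    from transformers import pipeline

    coder = pipeline("text2text-generation", model="t5-small")

    response = ("The net force on the block is zero, so the friction force "
                "must balance the applied force.")
    prompt = ("Classify the reasoning in this statics answer as one of: "
              "force balance, moment balance, kinematics. Answer: " + response)

    # An untuned t5-small will not code reliably; this only shows the interface.
    print(coder(prompt, max_new_tokens=8)[0]["generated_text"])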
-
The emphasis on conceptual learning and the development of adaptive instructional design are both emerging areas in science and engineering education. Instructors are writing their own conceptual questions to promote active learning during class and utilizing pools of these questions in assessments. For adaptive assessment strategies, these questions need to be rated based on difficulty level (DL). Historically, DL has been determined from the performance of a suitable number of students. The research study reported here investigates whether instructors can save time by predicting the DL of newly written conceptual questions without the need for student data. In this paper, we report on the development of one component in an adaptive learning module for materials science, specifically on the topic of crystallography. The summative assessment element consists of five DL scales and 15 conceptual questions. This adaptive assessment directs students based on their previous performance and the DL of the questions. Our five expert participants are faculty members who have taught the introductory Materials Science course multiple times. They provided predictions for how many students would answer each question correctly during a two-step process. First, predictions were made individually without an answer key. Second, experts had the opportunity to revise their predictions after being provided an answer key in a group discussion. We compared expert predictions with actual student performance using results from over 400 students spanning multiple courses and terms. We found no clear correlation between expert predictions of the DL and the DL measured from student performance. Some evidence shows that discussion during the second step brought expert predictions closer to student performance. We suggest that using predictions by experts who have taught the course is not a valid route to determining the DL of conceptual questions. The findings in this paper can be applied to assessments in in-person, hybrid, and online settings and are applicable to subject matter beyond materials science.
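The comparison between expert-predicted and measured difficulty can be expressed as a rank correlation over questions; the sketch below uses made-up placeholder numbers, not the study's data.

    # Illustrative comparison of predicted vs. measured difficulty (hypothetical data).
    import numpy as np
    from scipy.stats import spearmanr

    # Expert prediction: fraction of students expected to answer each question correctly
    expert_pred = np.array([0.80, 0.65, 0.50, 0.40, 0.30])

    # Simulated student answers: rows = students, cols = questions (1 = correct)
    answers = np.random.default_rng(0).integers(0, 2, size=(400, 5))
    measured = answers.mean(axis=0)  # observed fraction correct per question

    # Rank correlation between predicted and measured difficulty
    rho, p = spearmanr(expert_pred, measured)
    print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")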
-
In this work-in-progress paper, we continue investigating the propagation of the Concept Warehouse within mechanical engineering (Friedrichsen et al., 2017; Koretsky et al., 2019a). Even before the pandemic forced most instruction online, educational technology was a growing element in classroom culture (Koretsky & Magana, 2019b). However, adoption of technology tools for widespread use is often conceived from a turn-key lens, with professional development focused on procedural competencies and fidelity of implementation as the goal (Mills & Ragan, 2000; O'Donnell, 2008). Educators are given the tool with initial operating instructions, then left on their own to implement it in particular instructional contexts. There is little emphasis on the inevitable instructional decisions around incorporating the tool (Hodge, 2019) or on sustainable incorporation of technologies into existing instructional practice (Forkosh-Baruch et al., 2021). We consider the take-up of a technology tool as an emergent, rather than a prescribed, process (Henderson et al., 2011). In this WIP paper, we examine how two instructors, who we call Al and Joe, reason through their adoption of a technology tool, focusing on interactions among instructors, tool, and students within and across contexts. The Concept Warehouse (CW) is a widely available, web-based, open educational technology tool used to facilitate concept-based active learning in different contexts (Friedrichsen et al., 2017; Koretsky et al., 2014). Development of the CW is ongoing and collaboration-driven: user-instructors from different institutions and disciplines can develop conceptual questions (called ConcepTests) and other learning and assessment tools that can be shared with other users. Currently there are around 3,500 ConcepTests, 1,500 faculty users, and 36,000 student users. About 700 ConcepTests have been developed for mechanics (statics and dynamics). The tool's spectrum of affordances allows different entry points for instructor engagement, but also allows their use to grow and change as instructors become familiar with the tool and take up ideas from the contexts around them. As part of a larger study of propagation and use across five diverse institutions (Nolen & Koretsky, 2020), instructors were introduced to the tool, offered an introductory workshop and the opportunity to participate in a community of practice (CoP), then interviewed early and later in their adoption. For this paper, we explore a bounded case study of the two instructors, Al and Joe, who took up the CW to teach Introductory Statics. Al and Joe were experienced instructors, committed to active learning, who presented examples from their ongoing adaptation of the tool for discussion in the community of practice. However, their decisions about how to integrate the tool fundamentally differed, including in the aspects of the tool they took up and the ways they made sense of their use. In analyzing these two cases, we begin to uncover how these instructors navigated the dynamic nature of pedagogical decision making in and across contexts.