Search for: All records

Award ID contains: 2135190



  1. Many engineering problems assigned in undergraduate classes are numerical and can be solved using equations and algorithms—for example, truss problems in statics are often solved using the method of joints or the method of sections. Concept questions, which can be administered in class using active learning pedagogies, aid in the development of conceptual understanding as opposed to the procedural skill often emphasized in numerical problems. We administered a concept question about a truss to 241 statics students at six diverse institutions and find no statistically significant differences in answer correctness or confidence between institutions. Across institutions, students report that they are not accustomed to such non-numerical concept questions, but they grapple in different ways with the experience. Some frame engineering as inherently numerical, and thus do not value the conceptual understanding assessed by the question, while others recognize that developing conceptual knowledge is useful and will translate to their future engineering work. 
    Free, publicly-accessible full text available June 1, 2024
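The entry above names the method of joints as the standard numerical route for truss problems. Purely as an illustration of that procedural approach (not material from the paper), here is a minimal sketch that solves equilibrium at a single hypothetical two-member joint; the geometry and load value are made up.

```python
# Illustrative sketch only: the method of joints at a single truss joint,
# i.e., the kind of numerical/procedural solution the concept question above
# is contrasted with. The geometry and load are hypothetical.
import numpy as np

P = 10.0                  # downward load at the joint, kN (made up)
theta = np.radians(45.0)  # inclination of the diagonal member

# Unknown member forces F1 (horizontal member) and F2 (diagonal member),
# taken positive in tension. Joint equilibrium: sum Fx = 0 and sum Fy = 0.
A = np.array([
    [1.0, np.cos(theta)],  # x-components of F1 and F2
    [0.0, np.sin(theta)],  # y-components of F1 and F2
])
b = np.array([0.0, P])     # the applied load appears in the y-equation

F1, F2 = np.linalg.solve(A, b)
print(f"F1 = {F1:.2f} kN (negative means compression), F2 = {F2:.2f} kN (tension)")
```

The concept question discussed above targets the understanding behind such a calculation rather than the calculation itself.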
  2. Several consensus reports cite a critical need to dramatically increase the number and diversity of STEM graduates over the next decade. They conclude that a change to evidence-based instructional practices, such as concept-based active learning, is needed. Concept-based active learning involves the use of activity-based pedagogies whose primary objectives are to make students value deep conceptual understanding (instead of only factual knowledge) and then to facilitate their development of that understanding. Concept-based active learning has been shown to increase academic engagement and student achievement, to significantly improve student retention in academic programs, and to reduce the performance gap of underrepresented students. Fostering students' mastery of fundamental concepts is central to real-world problem solving, including several elements of engineering practice. Unfortunately, simply proving that these instructional practices are more effective than traditional methods for promoting student learning, for increasing retention in academic programs, and for improving ability in professional practice is not enough to ensure widespread pedagogical change. In fact, the biggest challenge to improving STEM education is not the need to develop more effective instructional practices, but to find ways to get faculty to adopt the evidence-based pedagogies that already exist.
In this project we seek to propagate the Concept Warehouse, a technological innovation designed to foster concept-based active learning, into Mechanical Engineering (ME) and to study student learning with this tool in five diverse institutional settings. The Concept Warehouse (CW) is a web-based instructional tool that we developed for Chemical Engineering (ChE) faculty. It houses over 3,500 ConcepTests, which are short questions that can rapidly be deployed to engage students in concept-oriented thinking and/or to assess students' conceptual knowledge, along with more extensive concept-based active learning tools. The CW has grown rapidly during this project and now has over 1,600 faculty accounts and over 37,000 student users. New ConcepTests were created during the current reporting period; the current numbers of questions for Statics, Dynamics, and Mechanics of Materials are 342, 410, and 41, respectively. A detailed review process is in progress, and will continue through the no-cost extension year, to refine question clarity and to identify types of new questions to fill gaps in content coverage. There have been 497 new faculty accounts created since June 30, 2018, and 3,035 unique students have answered these mechanics questions in the CW.
We continue to analyze instructor interviews, focusing on 11 cases, all of whom participated in the CW Community of Practice (CoP). For six participants, we were able to compare use of the CW both before and after participating in professional development activities (workshops and/or a community of practice). Interview results have been coded and are currently being analyzed. To examine student learning, we recruited faculty to participate in deploying four common questions in both statics and dynamics. In statics, each instructor agreed to deploy the same four questions (one each for Rigid Body Equilibrium, Trusses, Frames, and Friction) among their overall deployments of the CW. In addition to answering each question, students were also asked to provide a written explanation of their reasoning, to rate their confidence in their answers, and to rate the degree to which the questions were clear and promoted deep thinking. The analysis to date has resulted in a Work-In-Progress paper presented at ASEE 2022, reporting a cross-case comparison of two instructors, and a Work-In-Progress paper to be presented at ASEE 2023 analyzing students' metacognitive reflections on concept questions.
    Free, publicly-accessible full text available June 1, 2024
  3. This work-in-progress paper expands on a collaboration between engineering education researchers and machine learning researchers to automate the analysis of written responses to conceptually challenging questions in statics and dynamics courses (Authors, 2022). Using the Concept Warehouse (Koretsky et al., 2014), written justifications of ConcepTests (CTs) were gathered from statics and dynamics courses in a diverse set of two- and four-year institutions. Written justifications for CTs have been used to support active learning pedagogies, which makes it important to investigate how students put together their problem-solving narratives of understanding. However, despite the large benefit that analysis of student written responses may provide to instructors and researchers, manual review of responses is cumbersome, limits analysis, and can be prone to human bias. In efforts to improve the analysis of student written responses, machine learning has been used in various educational contexts to analyze short and long texts (Burstein et al., 2020; Burstein et al., 2021). Natural Language Processing (NLP) uses transformer-based machine learning models (Brown et al., 2020; Raffel et al., 2019), which can be applied through fine-tuning or in-context learning methods. NLP can be used to train algorithms that automate the coding of written responses. Only a few studies in educational applications have leveraged transformer-based machine learning models, prompting further investigation into their use in STEM education. However, work in NLP has been criticized for heightening the possibility of perpetuating and even amplifying harmful stereotypes and implicit biases (Chang et al., 2019; Mayfield et al., 2019). In this study, we detail our aim to use NLP for linguistic justice. Using methods like text summarization, topic modeling, and text classification, we identify key aspects of student narratives of understanding in written responses to mechanics and statics CTs. Through this process, we seek to use machine learning to identify different ways students talk about a problem and their understanding at any point in their narrative formation process. Thus, we hope to help reduce human bias in the classroom and through technology by giving instructors and researchers a diverse set of narratives that include insight into their students' histories, identities, and understanding. These can then be used toward connecting technological knowledge to students' everyday lives.
    Free, publicly-accessible full text available June 1, 2024
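The abstract above names text summarization, topic modeling, and text classification as the NLP methods being applied to written justifications. As a minimal, hypothetical sketch of one of these (topic modeling), the snippet below clusters a few invented justifications with scikit-learn's latent Dirichlet allocation; the responses, topic count, and settings are placeholders, not the authors' pipeline.

```python
# Illustrative sketch only: topic modeling over student written justifications.
# The responses, number of topics, and vocabulary settings are hypothetical,
# not taken from the Concept Warehouse data described above.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

responses = [
    "the truss member is in tension because the joint pulls outward",
    "I summed moments about the pin so the friction force drops out",
    "zero force member since only two members meet at the joint",
    "the normal force changes so the friction force must change too",
]

# Bag-of-words representation of the written justifications.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(responses)

# Fit a small LDA model; the number of topics would be tuned on real data.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(counts)

# Print the top words that characterize each discovered topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"Topic {k}: {', '.join(top)}")
```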
  4. In this work-in-progress paper, we continue investigation into the propagation of the Concept Warehouse within mechanical engineering (Friedrichsen et al., 2017; Koretsky et al., 2019a). Even before the pandemic forced most instruction online, educational technology was a growing element in classroom culture (Koretsky & Magana, 2019b). However, adoption of technology tools for widespread use is often conceived from a turn-key lens, with professional development focused on procedural competencies and fidelity of implementation as the goal (Mills & Ragan, 2000; O'Donnell, 2008). Educators are given the tool with initial operating instructions, then left on their own to implement it in particular instructional contexts. There is little emphasis on the inevitable instructional decisions around incorporating the tool (Hodge, 2019) or on sustainable incorporation of technologies into existing instructional practice (Forkosh-Baruch et al., 2021). We consider the take-up of a technology tool as an emergent, rather than a prescribed, process (Henderson et al., 2011). In this WIP paper, we examine how two instructors, whom we call Al and Joe, reason through their adoption of a technology tool, focusing on interactions among instructors, tool, and students within and across contexts. The Concept Warehouse (CW) is a widely available, web-based, open educational technology tool used to facilitate concept-based active learning in different contexts (Friedrichsen et al., 2017; Koretsky et al., 2014). Development of the CW is ongoing and collaboration-driven: user-instructors from different institutions and disciplines can develop conceptual questions (called ConcepTests) and other learning and assessment tools that can be shared with other users. Currently there are around 3,500 ConcepTests, 1,500 faculty users, and 36,000 student users. About 700 ConcepTests have been developed for mechanics (statics and dynamics). The tool's spectrum of affordances allows different entry points for instructor engagement, and it also allows instructors' use to grow and change as they become familiar with the tool and take up ideas from the contexts around them. As part of a larger study of propagation and use across five diverse institutions (Nolen & Koretsky, 2020), instructors were introduced to the tool, offered an introductory workshop and the opportunity to participate in a community of practice (CoP), and then interviewed early and later in their adoption. For this paper, we explore a bounded case study of the two instructors, Al and Joe, who took up the CW to teach Introductory Statics. Al and Joe were experienced instructors, committed to active learning, who presented examples from their ongoing adaptation of the tool for discussion in the community of practice. However, their decisions about how to integrate the tool fundamentally differed, including in the aspects of the tool they took up and the ways they made sense of their use. In analyzing these two cases, we begin to uncover how these instructors navigated the dynamic nature of pedagogical decision making in and across contexts.
  5. The emphasis on conceptual learning and the development of adaptive instructional design are both emerging areas in science and engineering education. Instructors are writing their own conceptual questions to promote active learning during class and utilizing pools of these questions in assessments. For adaptive assessment strategies, these questions need to be rated based on difficulty level (DL). Historically, DL has been determined from the performance of a suitable number of students. The research study reported here investigates whether instructors can save time by predicting the DL of newly written conceptual questions without the need for student data. In this paper, we report on the development of one component in an adaptive learning module for materials science, specifically on the topic of crystallography. The summative assessment element consists of five DL scales and 15 conceptual questions. This adaptive assessment directs students based on their previous performance and the DL of the questions. Our five expert participants are faculty members who have taught the introductory Materials Science course multiple times. They provided predictions for how many students would answer each question correctly during a two-step process. First, predictions were made individually without an answer key. Second, experts had the opportunity to revise their predictions after being provided an answer key in a group discussion. We compared expert predictions with actual student performance using results from over 400 students spanning multiple courses and terms. We found no clear correlation between expert predictions of the DL and the DL measured from student performance. Some evidence shows that discussion during the second step brought expert predictions closer to student performance. We suggest that, in determining the DL for conceptual questions, relying on predictions by experts who have taught the course is not a valid route. The findings in this paper can be applied to assessments in in-person, hybrid, and online settings and are applicable to subject matter beyond materials science.
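As a hedged illustration of the comparison described above, the sketch below correlates expert-predicted difficulty with difficulty measured from student correctness rates. All numbers are placeholders, and the Pearson coefficient is just one reasonable choice of statistic; it is not necessarily the analysis used in the study.

```python
# Illustrative sketch only: comparing expert-predicted difficulty with measured
# difficulty for a set of conceptual questions. All values are fabricated
# placeholders, not data from the study described above.
import numpy as np

# Fraction of students predicted (by experts) to answer each question correctly.
expert_predicted_correct = np.array([0.80, 0.65, 0.50, 0.70, 0.40])

# Fraction of students who actually answered each question correctly.
measured_correct = np.array([0.55, 0.72, 0.48, 0.60, 0.66])

# Define difficulty level (DL) as 1 - fraction correct, so harder questions
# have a higher DL.
expert_dl = 1.0 - expert_predicted_correct
measured_dl = 1.0 - measured_correct

# Pearson correlation between predicted and measured difficulty.
r = np.corrcoef(expert_dl, measured_dl)[0, 1]
print(f"Correlation between expert-predicted and measured DL: r = {r:.2f}")
```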
  6. This work-in-progress paper describes a collaborative effort between engineering education and machine learning researchers to automate analysis of written responses to conceptually challenging questions in mechanics. These qualitative questions are often used in large STEM classes to support active learning pedagogies; they require minimal calculation and focus on the application of underlying physical phenomena to various situations. Active learning pedagogies using this type of question have been demonstrated to increase student achievement (Freeman et al., 2014; Hake, 1998) and engagement (Deslauriers et al., 2011) of all students (Haak et al., 2011). To emphasize reasoning and sense-making, we use the Concept Warehouse (Koretsky et al., 2014), an audience response system where students provide written justifications to concept questions. Written justifications better prepare students for discussions with peers and in the whole class and can also improve students' answer choices (Koretsky et al., 2016a, 2016b). In addition to their use as a tool to foster learning, written explanations can also provide valuable information to concurrently assess that learning (Koretsky and Magana, 2019). However, in practice, there has been limited deployment of written justifications with concept questions, in part because they provide a daunting amount of information for instructors to process and for researchers to analyze. In this study, we describe the initial evaluation of large pre-trained generative sequence-to-sequence language models (Raffel et al., 2019; Brown et al., 2020) to automate the laborious coding process of student written responses. Adaptation of machine learning algorithms in this context is challenging since each question targets specific concepts that elicit their own unique reasoning processes. This exploratory project seeks to utilize responses collected through the Concept Warehouse to identify viable strategies for adapting machine learning to support instructors and researchers in identifying salient aspects of student thinking and understanding with these conceptually challenging questions.
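The study above evaluates large pre-trained generative sequence-to-sequence models (e.g., T5) for coding student written responses. Purely as a sketch of the interface to such a model, the snippet below sends a hypothetical justification and prompt to the public t5-small checkpoint through the Hugging Face transformers pipeline; the prompt wording and labels are invented, and an un-fine-tuned checkpoint will not return meaningful codes.

```python
# Illustrative sketch only: calling a pre-trained sequence-to-sequence model
# (T5) on a student's written justification. The prompt wording and candidate
# labels are hypothetical; an un-fine-tuned t5-small checkpoint will not give
# meaningful labels, so this only demonstrates the interface.
from transformers import pipeline

generator = pipeline("text2text-generation", model="t5-small")

justification = (
    "The block does not move because the friction force balances the applied "
    "force as long as it stays below the maximum static friction."
)
prompt = (
    "Classify the reasoning as 'correct concept' or 'misconception': "
    + justification
)

result = generator(prompt, max_length=16)
print(result[0]["generated_text"])
```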
  7. Our previous work demonstrated that the use of inquiry-based laboratory activities (IBLAs) has helped students develop better understanding of core concepts in mechanics. IBLAs are constructed around brief hands-on experiments designed so that students can confront common misconceptions. In a predict-observe-explain sequence, these activities prompt students to make sense of a phenomenon as they work collaboratively through a guided worksheet. However, these physical experiments present logistical challenges for many instructors, such as those who teach large classes or those confined to remote instruction due to the COVID-19 pandemic. In this work-in-progress paper, we describe the development of computer simulations for a set of IBLAs in mechanics. These web-based virtual IBLAs contain simulations built with an open-source JavaScript physics engine that has been customized to achieve the accuracy needed. They afford the same pedagogical structure but allow students to observe the salient phenomena on a computer screen, reducing the constraints and limitations for an instructor to deliver them. Students can rapidly adjust input parameters, render the physics engine at slow-motion speeds, and graph real-time parameters from the simulation. Free access to the IBLAs, including simulations, handouts, and instructions, is available to instructors through the Concept Warehouse. In this paper, we report how we rendered a set of proven IBLAs, including the Spool IBLA, the Rolling Cylinders IBLA, and the Pendulum IBLA, into a virtual laboratory environment. We describe student responses to different renderings, including video only, simulation only, and combined video and simulation.
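The virtual IBLAs described above are built on a customized open-source JavaScript physics engine. As a language-neutral illustration of the kind of dynamics one of these scenarios (the Rolling Cylinders IBLA) reproduces, the sketch below integrates a uniform cylinder rolling without slipping down an incline; the parameters, time step, and simple Euler integrator are placeholder choices, not the project's implementation.

```python
# Illustrative sketch only: the rigid-body dynamics behind a "Rolling Cylinders"
# style simulation. A uniform solid cylinder rolls without slipping down an
# incline; parameters and the explicit Euler time step are placeholders, not
# the implementation used in the Concept Warehouse simulations.
import math

g = 9.81                     # gravitational acceleration, m/s^2
theta = math.radians(20.0)   # incline angle (made up)
radius = 0.05                # cylinder radius, m (only needed for omega)

# For a uniform solid cylinder, I = (1/2) m r^2, so rolling without slipping
# gives a = g*sin(theta) / (1 + I/(m r^2)) = (2/3) g sin(theta).
accel = g * math.sin(theta) / (1.0 + 0.5)

dt, t, x, v = 0.01, 0.0, 0.0, 0.0
while t < 2.0:
    v += accel * dt          # update translational speed
    x += v * dt              # update position along the incline
    t += dt

omega = v / radius           # rolling constraint: v = omega * r
print(f"After {t:.2f} s: x = {x:.3f} m, v = {v:.3f} m/s, omega = {omega:.2f} rad/s")
```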
  8. It has been well established that concept-based active learning strategies increase student retention, improve engagement and student achievement, and reduce the performance gap of underrepresented students. Despite the evidence supporting concept-based instruction, many faculty continue to stress algorithmic problem solving. In fact, the biggest challenge to improving STEM education is not the need to develop more effective instructional practices, but to find ways to get faculty to adopt the evidence-based pedagogies that already exist. Our project aims to propagate the Concept Warehouse (CW), an online instructional tool that was developed in the Chemical Engineering community, into Mechanical Engineering (ME). A portion of our work focuses on content development in mechanics, and includes statics, dynamics, and to a lesser extent strength of materials. Our content development teams have created 170 statics and 253 dynamics questions. Additionally, we have developed four different simulations to be embedded in online Instructional Tools: interactive modules that provide different physical scenarios to help students understand important concepts in mechanics. During initial interviews, we found that potential adopters needed coaching on the benefits of concept-based instruction, training on how to use the CW, and support on how to best implement the different affordances offered by the CW. This caused a slight shift in our initial research plans, and much of our recent work has concentrated on using faculty development activities to help us advertise the CW and encourage evidence-based practices. From these activities, we are recruiting participants for surveys and interviews to help us investigate how different contexts affect the adoption of educational innovations. A set of two summer workshops attracted over 270 applicants, and over 60 participants attended each synchronous offering. Other applicants were provided links to recordings of the workshop. From these participants, we recruited 20 to join our Community of Practice (CoP). These members are sharing how they use the CW in their classes, especially in the virtual environment. Community members discuss using evidence-based practices and the different things that the CW can do, and suggest potential improvements to the tool. They will also be interviewed to help us determine barriers to adoption, how their institutional contexts and individual epistemologies affect adoption, and how they have used the CW in their classes. Our research will help us formulate strategies that others can use when attempting to propagate pedagogical innovations.
  9. The shift to remote teaching with the COVID-19 pandemic has made delivery of concept-based active learning more challenging, especially in large-enrollment engineering classes. I report here a modification to the Concept Warehouse to support delivery of concept questions. The new feature allows instructors to make students' reasoning visible to other students by showing selected written explanations to conceptually challenging multiple-choice questions. Data were collected from two large-enrollment engineering classes, and examples are shown to illustrate how displaying written explanations can provide a resource for students to develop multivariate reasoning skills.