Title: Towards an Adaptive Learning Module for Materials Science: Comparing Expert Predictions to Student Performance
The emphasis on conceptual learning and the development of adaptive instructional design are both emerging areas in science and engineering education. Instructors are writing their own conceptual questions to promote active learning during class and utilizing pools of these questions in assessments. For adaptive assessment strategies, these questions need to be rated by difficulty level (DL). Historically, DL has been determined from the performance of a suitably large number of students. The research study reported here investigates whether instructors can save time by predicting the DL of newly written conceptual questions without the need for student data. In this paper, we report on the development of one component of an adaptive learning module for materials science, specifically on the topic of crystallography. The summative assessment element consists of five DL scales and 15 conceptual questions. This adaptive assessment directs students based on their previous performance and the DL of the questions. Our five expert participants are faculty members who have taught the introductory Materials Science course multiple times. In a two-step process, they predicted how many students would answer each question correctly. First, predictions were made individually, without an answer key. Second, experts could revise their predictions after being given an answer key during a group discussion. We compared expert predictions with actual student performance using results from over 400 students spanning multiple courses and terms. We found no clear correlation between expert predictions of the DL and the DL measured from student performance. Some evidence suggests that the second-step discussion brought expert predictions closer to student performance. We suggest that predictions by experts who have taught the course are not a valid route to determining the DL of conceptual questions. The findings in this paper apply to assessments in in-person, hybrid, and online settings and to subject matter beyond materials science.
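To make the comparison concrete, the sketch below shows one way to compute a measured DL from student response data and correlate it with expert predictions. This is a hedged illustration, not the study's analysis code: the per-question values are invented placeholders, and the choice of Spearman rank correlation (via scipy) is an assumption.

```python
# A minimal sketch of comparing expert-predicted difficulty with measured
# difficulty. All numbers below are hypothetical placeholders, not study data.
from scipy.stats import spearmanr

# Predicted fraction of students answering each question correctly,
# averaged over the five experts (hypothetical values).
expert_predicted = [0.80, 0.55, 0.40, 0.70, 0.35]

# Measured fraction correct, computed from student responses
# (hypothetical values for five of the 15 questions).
student_measured = [0.62, 0.58, 0.71, 0.45, 0.66]

# DL is commonly taken as 1 - fraction correct; using fractions directly
# leaves the rank correlation unchanged except for its sign.
rho, p = spearmanr(expert_predicted, student_measured)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```

A rho near zero, as the study reports, would indicate no clear agreement between the predicted and measured ordering of question difficulty.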
Award ID(s):
2135190
NSF-PAR ID:
10352599
Author(s) / Creator(s):
; ; ; ;
Date Published:
Journal Name:
ASEE Annual Conference proceedings
ISSN:
1524-4644
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1.
    We reflect on our ongoing journey in the educational Cybersecurity Assessment Tools (CATS) Project to create two concept inventories for cybersecurity. We identify key steps in this journey and important questions we faced. We explain the decisions we made and discuss the consequences of those decisions, highlighting what worked well and what might have gone better. The CATS Project is creating and validating two concept inventories—conceptual tests of understanding—that can be used to measure the effectiveness of various approaches to teaching and learning cybersecurity. The Cybersecurity Concept Inventory (CCI) is for students who have recently completed any first course in cybersecurity; the Cybersecurity Curriculum Assessment (CCA) is for students who have recently completed an undergraduate major or track in cybersecurity. Each assessment tool comprises 25 multiple-choice questions (MCQs) of various difficulties that target the same five core concepts, but the CCA assumes greater technical background. Key steps include defining project scope, identifying the core concepts, uncovering student misconceptions, creating scenarios, drafting question stems, developing distractor answer choices, generating educational materials, performing expert reviews, recruiting student subjects, organizing workshops, building community acceptance, forming a team and nurturing collaboration, adopting tools, and obtaining and using funding. Creating effective MCQs is difficult and time-consuming, and cybersecurity presents special challenges. Because cybersecurity issues are often subtle, where the adversarial model and details matter greatly, it is challenging to construct MCQs for which there is exactly one best but non-obvious answer. We hope that our experiences and lessons learned may help others create more effective concept inventories and assessments in STEM. 
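As a concrete illustration of the item structure described above (a stem with exactly one best but non-obvious answer plus misconception-keyed distractors), here is a hedged sketch in Python; the class fields and the example item are hypothetical and are not drawn from the CCI or CCA.

```python
# A sketch of one way to represent a concept-inventory MCQ.
# Field names and the sample item are hypothetical, not CATS content.
from dataclasses import dataclass

@dataclass
class MCQ:
    stem: str                 # scenario plus question prompt
    best_answer: str          # exactly one best, non-obvious answer
    distractors: list[str]    # choices keyed to known misconceptions
    core_concept: str         # one of the five targeted core concepts
    difficulty: float | None = None   # estimated after student testing

item = MCQ(
    stem=("A login form hashes passwords in the browser before sending "
          "them to the server. What does an eavesdropper gain?"),
    best_answer="The transmitted hash itself now works as the password.",
    distractors=[
        "Nothing; the plaintext password is never exposed.",
        "Only the salt, which is useless on its own.",
        "A hash that must still be brute-forced offline.",
    ],
    core_concept="adversarial thinking",  # hypothetical concept label
)
```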
  2. Evidence has shown that facilitating student-centered learning (SCL) in STEM classrooms enhances student learning and satisfaction [1]–[3]. However, despite increased support from educational and government bodies to incorporate SCL practices [1], minimal changes have been made in the undergraduate STEM curriculum [4]. Faculty often teach as they were taught, relying heavily on traditional lecture-based teaching to disseminate knowledge [4]. Though some faculty express the desire to improve their teaching strategies, they feel limited by a lack of time, training, and incentives [4], [5]. To maximize student learning while minimizing instructor effort to change content, courses can be designed to incorporate simpler, less time-consuming SCL strategies that still have a positive impact on student experience. In this paper, we present one example of utilizing a variety of simple SCL strategies throughout the design and implementation of a 4-week-long module. This module focused on introductory tissue engineering concepts and was designed to help students learn foundational knowledge within the field as well as develop critical technical skills. Further, the module sought to develop important professional skills such as problem-solving, teamwork, and communication. During module design and implementation, evidence-based SCL teaching strategies were applied to ensure students developed important knowledge and skills within the short timeframe. Lectures featured discussion-based active learning exercises to encourage student engagement and peer collaboration [6]–[8]. The module was designed using a situated perspective, acknowledging that knowing is inseparable from doing [9], and therefore each week, the material taught in the two lecture sessions was directly applied to that week's lab to reinforce students' conceptual knowledge through hands-on activities and experimental outcomes. Additionally, the majority of assignments served as formative assessments to motivate student performance while providing instructors with feedback to identify misconceptions and make real-time module improvements [10]–[12]. Students anonymously responded to pre- and post-module surveys, which focused on topics such as student motivation for enrolling in the module, module expectations, and prior experience. Students were also surveyed for student satisfaction, learning gains, and graduate student teaching team (GSTT) performance. Data suggest a high level of student satisfaction, as most students' expectations were met, and often exceeded. Students reported developing a deeper understanding of the field of tissue engineering and learning many of the targeted basic lab skills. In addition to hands-on skills, students gained confidence to participate in research and an appreciation for interacting with and learning from peers. Finally, responses with respect to GSTT performance indicated a perceived emphasis on learner-centered and knowledge/community-centered approaches over assessment-centeredness [13]. Overall, student feedback indicated that SCL teaching strategies can enhance student learning outcomes and experience, even over the short timeframe of this module. Student recommendations for module improvement focused primarily on modifying the lecture content and laboratory component of the module, and not on changing the teaching strategies employed. The success of this module exemplifies how instructors can implement similar strategies to increase student engagement and encourage in-depth discussions without drastically increasing instructor effort to re-format course content.
  3. Since the 2014 high-profile meta-analysis of undergraduate STEM courses, active learning has become a standard in higher education pedagogy. One way to provide active learning is through the flipped classroom. However, finding suitable pre-class learning activities to improve student preparation and the subsequent classroom environment, including student engagement, can present a challenge in the flipped modality. To address this challenge, adaptive learning lessons were developed for pre-class learning for a course in Numerical Methods. The lessons would then be used as part of a study to determine their cognitive and affective impacts. Before the study could begin, well-thought-out adaptive lessons had to be constructed. This paper discusses developing, refining, and revising the adaptive learning platform (ALP) lessons for pre-class learning in a flipped Numerical Methods course. In a prior pilot study at a large public southeastern university, the first author had developed ALP lessons for pre-class learning for four (Nonlinear Equations, Matrix Algebra, Regression, Integration) of the eight topics covered in a Numerical Methods course. In the current follow-on study, the first author and two other instructors who teach Numerical Methods, one from a large southwestern urban university and another from an HBCU, collaborated on developing the adaptive lessons for the whole course. The work began in Fall 2020 by enumerating the various chapters and breaking each one into individual lessons. Each lesson would include five sections (introduction, learning objectives, video lectures, textbook content, assessment). The three instructors met semi-monthly to discuss the content that would form each lesson. The main discussion of the meetings centered on what a student would be expected to learn before coming to class, choosing appropriate content, agreeing on prerequisites, and choosing and making new assessment questions. Lessons were then created by the first author and his student team using a commercially available platform called RealizeIT. The content was tested by learning assistants and instructors. It is important to note that significant portions, if not all, of the content, such as videos and textbook material, were available from prior work. The new adaptive lessons and the revised existing ones were completed in December 2020. The adaptive lessons were deployed in Spring 2021 at the first author's university and accounted for 15% of the students' grade. Questions asked by students during office hours, on the LMS discussion board, and via email while doing the lessons were used to update content, clarify questions, and revise hints offered by the platform. For example, all videos in the ALP lessons were updated to HD quality based on student feedback. In addition, comments from the end-of-semester surveys conducted by an independent assessment analyst were collated to further revise the adaptive lessons. Examples include changing the textbook content format from an embedded PDF file to HTML to improve quality and meet web accessibility standards. The paper walks the reader through the content of a typical lesson, and a sketch of the lesson structure follows below. It also shows the type of data collected by the adaptive learning platform via three examples of student interactions with a single lesson.
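Since the abstract spells out a fixed five-section lesson structure and prerequisite-driven sequencing, a small data model makes the design concrete. The sketch below is hypothetical: the class and field names are illustrative and do not reflect RealizeIT's actual schema.

```python
# A sketch of the five-section adaptive lesson structure described above.
# Names are illustrative; this is not RealizeIT's real data model.
from dataclasses import dataclass, field

@dataclass
class AdaptiveLesson:
    title: str
    introduction: str
    learning_objectives: list[str]
    video_lecture_urls: list[str]     # HD videos, per student feedback
    textbook_content_html: str        # HTML rather than embedded PDF,
                                      # for web accessibility
    assessment_question_ids: list[str]
    prerequisites: list[str] = field(default_factory=list)

def can_start(lesson: AdaptiveLesson, passed: set[str]) -> bool:
    """A student may open a lesson once all its prerequisites are passed."""
    return all(p in passed for p in lesson.prerequisites)
```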