
Title: Towards an Adaptive Learning Module for Materials Science: Comparing Expert Predictions to Student Performance
The emphasis on conceptual learning and the development of adaptive instructional design are both emerging areas in science and engineering education. Instructors are writing their own conceptual questions to promote active learning during class and utilizing pools of these questions in assessments. For adaptive assessment strategies, these questions need to be rated by difficulty level (DL). Historically, DL has been determined from the performance of a suitable number of students. The research study reported here investigates whether instructors can save time by predicting the DL of newly written conceptual questions without the need for student data. In this paper, we report on the development of one component in an adaptive learning module for materials science, specifically on the topic of crystallography. The summative assessment element consists of five DL scales and 15 conceptual questions. This adaptive assessment directs students based on their previous performance and the DL of the questions. Our five expert participants are faculty members who have taught the introductory Materials Science course multiple times. They provided predictions of how many students would answer each question correctly through a two-step process. First, predictions were made individually without an answer key. Second, experts had the opportunity to revise their predictions after being provided with an answer key during a group discussion. We compared expert predictions with actual student performance using results from over 400 students spanning multiple courses and terms. We found no clear correlation between expert predictions of the DL and the DL measured from student performance. Some evidence shows that the discussion during the second step brought expert predictions closer to student performance. We suggest that, in determining the DL of conceptual questions, relying on predictions by experts who have taught the course is not a valid route. The findings in this paper can be applied to assessments in in-person, hybrid, and online settings and to subject matter beyond materials science.
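As an illustration of the kind of comparison described above, the short sketch below computes a measured difficulty level for each question as the fraction of students who answered it incorrectly, maps that value onto a five-level DL scale, checks the rank correlation between an expert's predictions and the measured DL, and includes one simple rule for stepping a student between DL levels. The data layout, the fraction-incorrect definition of DL, the binning and routing rules, and the choice of Spearman correlation are illustrative assumptions and are not taken from the paper.

    # Minimal sketch of the DL comparison. Assumptions: DL is the fraction of
    # incorrect responses, expert predictions are expected percent-correct values,
    # and Spearman rank correlation measures agreement. None of these details
    # come from the paper itself.
    import numpy as np
    from scipy.stats import spearmanr

    def measured_difficulty(responses: np.ndarray) -> np.ndarray:
        """responses: (n_students, n_questions) array of 0/1 correctness flags.
        Returns the fraction of students who missed each question."""
        return 1.0 - responses.mean(axis=0)

    def predicted_difficulty(percent_correct: np.ndarray) -> np.ndarray:
        """Convert an expert's 'percent of students expected to answer correctly'
        into the same higher-is-harder difficulty scale."""
        return 1.0 - percent_correct / 100.0

    def to_dl_bins(difficulty: np.ndarray, n_levels: int = 5) -> np.ndarray:
        """Map continuous difficulty in [0, 1] onto DL levels 1..n_levels."""
        edges = np.linspace(0.0, 1.0, n_levels + 1)[1:-1]
        return np.digitize(difficulty, edges) + 1

    def next_dl_level(current_level: int, answered_correctly: bool, n_levels: int = 5) -> int:
        """One simple (assumed) adaptive rule: step the DL up after a correct
        answer and down after an incorrect one, staying within 1..n_levels."""
        step = 1 if answered_correctly else -1
        return min(max(current_level + step, 1), n_levels)

    # Hypothetical data: 400 students x 15 questions plus one expert's predictions.
    rng = np.random.default_rng(seed=0)
    responses = rng.integers(0, 2, size=(400, 15))
    expert_percent_correct = rng.uniform(30.0, 95.0, size=15)

    measured = measured_difficulty(responses)
    predicted = predicted_difficulty(expert_percent_correct)
    rho, p_value = spearmanr(predicted, measured)

    print("Measured DL levels:", to_dl_bins(measured))
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")

In an analysis of this form, a weak or statistically insignificant correlation coefficient would mirror the paper's finding that expert predictions of DL did not track the DL measured from student responses.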
Award ID(s):
2135190
NSF-PAR ID:
10352599
Journal Name:
ASEE Annual Conference proceedings
ISSN:
1524-4644
Sponsoring Org:
National Science Foundation
More Like this
  1. We reflect on our ongoing journey in the educational Cybersecurity Assessment Tools (CATS) Project to create two concept inventories for cybersecurity. We identify key steps in this journey and important questions we faced. We explain the decisions we made and discuss the consequences of those decisions, highlighting what worked well and what might have gone better. The CATS Project is creating and validating two concept inventories (conceptual tests of understanding) that can be used to measure the effectiveness of various approaches to teaching and learning cybersecurity. The Cybersecurity Concept Inventory (CCI) is for students who have recently completed any first course in cybersecurity; the Cybersecurity Curriculum Assessment (CCA) is for students who have recently completed an undergraduate major or track in cybersecurity. Each assessment tool comprises 25 multiple-choice questions (MCQs) of various difficulties that target the same five core concepts, but the CCA assumes greater technical background. Key steps include defining project scope, identifying the core concepts, uncovering student misconceptions, creating scenarios, drafting question stems, developing distractor answer choices, generating educational materials, performing expert reviews, recruiting student subjects, organizing workshops, building community acceptance, forming a team and nurturing collaboration, adopting tools, and obtaining and using funding. Creating effective MCQs is difficult and time-consuming, and cybersecurity presents special challenges. Because cybersecurity issues are often subtle, where the adversarial model and details matter greatly, it is challenging to construct MCQs for which there is exactly one best but non-obvious answer. We hope that our experiences and lessons learned may help others create more effective concept inventories and assessments in STEM.
  2. Evidence has shown that facilitating student-centered learning (SCL) in STEM classrooms enhances student learning and satisfaction [1]–[3]. However, despite increased support from educational and government bodies to incorporate SCL practices [1], minimal changes have been made in undergraduate STEM curricula [4]. Faculty often teach as they were taught, relying heavily on traditional lecture-based teaching to disseminate knowledge [4]. Though some faculty express the desire to improve their teaching strategies, they feel limited by a lack of time, training, and incentives [4], [5]. To maximize student learning while minimizing instructor effort to change content, courses can be designed to incorporate simpler, less time-consuming SCL strategies that still have a positive impact on student experience. In this paper, we present one example of utilizing a variety of simple SCL strategies throughout the design and implementation of a 4-week-long module. This module focused on introductory tissue engineering concepts and was designed to help students learn foundational knowledge within the field as well as develop critical technical skills. Further, the module sought to develop important professional skills such as problem-solving, teamwork, and communication. During module design and implementation, evidence-based SCL teaching strategies were applied to ensure students developed important knowledge and skills within the short timeframe. Lectures featured discussion-based active learning exercises to encourage student engagement and peer collaboration [6]–[8]. The module was designed using a situated perspective, acknowledging that knowing is inseparable from doing [9], and therefore each week, the material taught in the two lecture sessions was directly applied to that week's lab to reinforce students' conceptual knowledge through hands-on activities and experimental outcomes. Additionally, the majority of assignments served as formative assessments to motivate student performance while providing instructors with feedback to identify misconceptions and make real-time module improvements [10]–[12]. Students anonymously responded to pre- and post-module surveys, which focused on topics such as student motivation for enrolling in the module, module expectations, and prior experience. Students were also surveyed for student satisfaction, learning gains, and graduate student teaching team (GSTT) performance. Data suggests a high level of student satisfaction, as most students' expectations were met, and often exceeded. Students reported developing a deeper understanding of the field of tissue engineering and learning many of the targeted basic lab skills. In addition to hands-on skills, students gained confidence to participate in research and an appreciation for interacting with and learning from peers. Finally, responses with respect to GSTT performance indicated a perceived emphasis on learner-centered and knowledge/community-centered approaches over assessment-centeredness [13]. Overall, student feedback indicated that SCL teaching strategies can enhance student learning outcomes and experience, even over the short timeframe of this module. Student recommendations for module improvement focused primarily on modifying the lecture content and laboratory component of the module, and not on changing the teaching strategies employed.
The success of this module exemplifies how instructors can implement similar strategies to increase student engagement and encourage in-depth discussions without drastically increasing instructor effort to re-format course content.
  3. Instructor-led, presentation-based teaching focuses mainly on delivering content, whereas student presentation-based teaching approaches require students to take the lead in learning activities. Many teaching and learning strategies have been adopted to foster active student participation during in-class learning activities. We developed the student presentation-based effective teaching (SPET) approach in 2014 to make student presentation activity the central element of learning challenging concepts. We have developed several versions to meet the needs of teaching small classes (P. Tyagi, "Student Presentation Based Effective Teaching (SPET) Approach for Advanced Courses," in ASME IMECE 2016-66029, V005T06A026), large-enrolment classes (P. Tyagi, "Student Presentation Based Teaching (SPET) Approach for Classes With Higher Enrolment," ASME IMECE 2018-88463, V005T07A035), and online teaching during COVID-19 (P. Tyagi, "Second Modified Student Presentation Based Effective Teaching (SPET) Method Tested in COVID-19 Affected Senior Level Mechanical Engineering Course," in ASME IMECE 2020-23615, V009T09A026). The SPET approach has successfully engaged students with varied interests and competence levels in the learning process. It has also made it possible to cover new topics, such as training engineering students in positive intelligence skills to foster a lifelong learning aptitude and carrying out engineering projects in a group setting. However, it was noted that many students who were overwhelmed with parallel academic demands from other courses and activities were underperforming under SPET-based learning strategies. SPET's core functioning depends on the following steps. Step 1: provide a set of conceptual and topical questions for students to answer individually after self-education from the recommended textbook or course material. Step 2: students who have prepared develop group presentations for in-class discussion. Step 3: each group presents in class 1-2 weeks after the assignment is issued, to receive instructor feedback and engage in peer discussion. The instructor noted that students unfamiliar with the new concepts and terminologies in the SPET assignment struggled to respond to the questions individually and to contribute to the group discussion based on their presentation. Several motivated students who invested time in familiarizing themselves with the new concepts and terminologies met or exceeded expectations; however, a significant student population struggled. To alleviate this issue, the author implemented a further improvement to the SPET approach. This paper reports on teaching experiments conducted in the MECH 487 Photovoltaic Cells and Solar Thermal Energy System course and the MECH 462 Design of Energy Systems course. The improvement augments SPET with an instructor-led concept-familiarization discussion on, or close to, the day the assignment is issued; for this step, the instructor utilized exemplary student work from prior SPET-based teaching of the same course. In the survey, many students expressed their views about the improvement and reported that the introductory discussions were helpful and addressed several reservations and impediments they encountered. This paper discusses the structure of the new improvement strategy and its outcomes, including student feedback and comments.
  4. Community colleges provide an important pathway for many prospective engineering graduates, especially those from traditionally underrepresented groups. However, due to a lack of facilities, resources, student demand, and/or local faculty expertise, the breadth and frequency of engineering course offerings are severely restricted at many community colleges. This in turn presents challenges for students trying to maximize their transfer eligibility and preparedness. Through a grant from the National Science Foundation Improving Undergraduate STEM Education program (NSF IUSE), three community colleges from Northern California collaborated to increase the availability and accessibility of a comprehensive lower-division engineering curriculum, even at small-to-medium-sized community colleges. This was accomplished by developing resources and teaching strategies that could be employed in a variety of delivery formats (e.g., fully online, online/hybrid, flipped face-to-face, etc.), providing flexibility for local community colleges to leverage according to their individual needs. This paper focuses on the iterative development, testing, and refinement of the resources for an introductory Materials Science course with 3-unit lecture and 1-unit laboratory components. This course is required as part of recently adopted statewide model associate degree curricula for transfer into Civil, Mechanical, Aerospace, and Manufacturing engineering bachelor's degree programs at California State Universities. However, offering such a course is particularly challenging for many community colleges, because of a lack of adequate expertise and/or laboratory facilities and equipment. Consequently, course resources were developed to help mitigate these challenges by streamlining preparation for instructors new to teaching the course, as well as minimizing the face-to-face use of traditional materials testing equipment in the laboratory portion of the course. These same resources can be used to support online/hybrid and other alternative (e.g., emporium) delivery approaches. After initial pilot implementation of the course during the Spring 2015 semester by the curriculum designer in a flipped, student-centered format, these same resources were then implemented by an instructor who had never previously taught the course, at a different community college that did not have its own materials laboratory facilities. A single site visit was arranged with a nearby community college to afford students an opportunity to complete certain lab activities using traditional materials testing equipment. Lessons learned during this attempt were used to inform curriculum revisions, which were evaluated in a repeat offering the following year. In all implementations of the course, student surveys and interviews were used to determine students' perceptions of the effectiveness of the course resources, student use of these resources, and overall satisfaction with the course. Additionally, student performance on objective assessments was compared with that of traditional lecture delivery of the course by the curriculum designer in prior years. During initial implementations of the course, results from these surveys and assessments revealed low levels of student satisfaction with certain aspects of the flipped approach and course resources, as well as reduced learning among students at the alternate institution.
Subsequent modifications to the curriculum and delivery approach were successful in addressing most of these deficiencies.