

Title: Evaluating Oral Exams in Large Undergraduate Engineering Courses
While studies have shown that oral exams are a valuable method of assessment, their use has been limited due to concerns about scalability, examiner bias, and student anxiety. This paper presents preliminary results on incorporating oral exams into two large undergraduate engineering courses, examining the potential viability of these assessment strategies. This work was done when the courses were offered remotely due to COVID-19, but the results offer valuable insights that could carry over to in-person instruction as well.
Award ID(s):
2044472
NSF-PAR ID:
10311805
Author(s) / Creator(s):
; ; ; ; ;
Date Published:
Journal Name:
IEEE-AP-S URSI 2021
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. This project aims to enhance students' learning in foundational engineering courses through oral exams, based on research conducted at the University of California San Diego. The adaptive, dialogic nature of oral exams gives instructors an opportunity to better understand students' thought processes, and thus holds promise for improving both assessment of conceptual mastery and students' learning attitudes and strategies. However, the issues of oral exam reliability, validity, and scalability have not been fully addressed. As with any assessment format, careful design is needed to maximize the benefits of oral exams for student learning and to minimize potential concerns. Compared to traditional written exams, oral exams have a unique design space involving a large range of parameters, including the type of oral assessment questions, the grading criteria, how oral exams are administered, how questions are communicated and presented to students, how feedback is provided, and other logistical considerations such as the weight of the oral exam in the overall course grade and the frequency of oral assessment. To address scalability for high-enrollment classes, a key element of the project is the involvement of the entire instructional team (instructors and teaching assistants); the project will therefore create a new training program to prepare faculty and teaching assistants to administer oral exams, including considerations of issues such as bias and students with disabilities. The purpose of this study is to create a framework for integrating oral exams into core undergraduate engineering courses, complementing existing assessment strategies, by (1) creating a guideline to optimize the oral exam design parameters for the best student learning outcomes and (2) creating a new training program to prepare faculty and teaching assistants to administer oral exams. The project will implement an iterative design strategy using an evidence-based approach to evaluation. The effectiveness of the oral exams will be evaluated by tracking student improvement on conceptual questions across consecutive oral exams in a single course, as well as across other courses. Since its start in January 2021, the project is well underway. In this poster, we present a summary of results from year 1: (1) exploration of the oral exam design parameters and their impact on students' engagement and perception of oral exams toward learning; (2) the effectiveness of the newly developed instructor and teaching assistant training programs; (3) the development of the evaluation instruments used to gauge the project's success; and (4) instructors' and teaching assistants' experiences and perceptions.
  2. This work-in-progress paper presents an innovative practice of using oral exams to maintain academic integrity and promote student engagement in large-enrollment engineering courses during remote instruction. With the abrupt and widespread transition to distance learning and assessment brought on by the COVID-19 pandemic, there has been a reported upsurge in academic integrity violations globally. To address the challenge of compromised integrity, in the winter quarter of 2021 we implemented oral exams across six mostly high-enrollment mechanical and electrical engineering undergraduate courses. We present our oral exam design parameters in each of the courses and discuss how oral exams relate to academic integrity, student engagement, stress, and implicit bias. We also address the challenge of scalability, as most of our oral exams were implemented in large classes, where academic integrity violations and student-instructor disconnection have become disproportionately worse during remote learning. Our survey results indicate that oral exams have positively contributed to academic integrity in our courses. Based on our preliminary study and experiences, we expect that oral exams can be effectively leveraged to hinder cheating and foster academic honesty in students, even when in-person instruction and assessment resume.
  3. This paper presents an innovative approach to improving engineering students' problem-solving skills by implementing think-aloud exercises. Engineering students sometimes claim that they do not know where to start the problem-solving process, or that they are not sure how to proceed to the next step when they get stuck. Systematic training that focuses on the problem-solving process and the justification of each step can help. Think-aloud techniques make otherwise invisible mental processes visible to learners, engaging students and helping them work through a solution step by step while reasoning along the way. In this study, a multifaceted, systematic approach that integrates think-aloud exercises through video assignments and oral exams was developed and implemented in two pilot engineering classes. We present the structure of our think-aloud exercises and oral exams in each of the courses, their impact on students' learning outcomes, and students' perceptions of the pedagogical approach. Both quantitative and qualitative results show that the think-aloud assignments and oral exams enhance students' problem-solving skills and promote learning.
  4. The article documents students' experiences with the shift online at the onset of the COVID-19 pandemic and provides informed recommendations to STEM instructors regarding academic integrity and student stress. Over 500 students were surveyed on these topics, with the survey including an open-ended question. Students experienced more stress and perceived a greater workload in online courses and therefore preferred in-person courses overall. Personal awareness of cheating during online exams is positively correlated with the proportion of cheating a student perceives. Fear of getting caught is the best cheating deterrent, while the prospect of a better grade makes cheating most enticing. Randomization of questions and answer choices is perceived as a highly effective tool to reduce cheating and is reported as the least stress-inducing method. Inability to backtrack and time limits cause students the most stress. Students report that multiple-choice questions are the least effective question type for discouraging cheating and that oral exam questions cause the most stress. Use of a camera and lockdown browser, or being video- and audio-recorded, caused the majority of student stress; yet nearly 60% agree that the combination of camera and lockdown browser is an effective deterrent. Recommendations: (i) Be transparent regarding academic dishonesty detection methods and penalties. (ii) Use online invigilating tools. (iii) Synchronize exams and (iv) randomize exam questions. (v) Allow backtracking. (vi) Avoid converting in-person exams to online exams; instead, explore new ways of designing exams for the online environment.
  5. Background and objectives

    Universities throughout the USA increasingly offer undergraduate courses in evolutionary medicine (EvMed), which creates a need for pedagogical resources. Several resources offer course content (e.g. textbooks) and a previous study identified EvMed core principles to help instructors set learning goals. However, assessment tools are not yet available. In this study, we address this need by developing an assessment that measures students’ ability to apply EvMed core principles to various health-related scenarios.

    Methodology

    The EvMed Assessment (EMA) consists of questions containing a short description of a health-related scenario followed by several likely/unlikely items. We evaluated the assessment’s validity and reliability using a variety of qualitative (expert reviews and student interviews) and quantitative (Cronbach’s α and classical test theory) methods. We iteratively revised the assessment through several rounds of validation. We then administered the assessment to undergraduates in EvMed and Evolution courses at multiple institutions.
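
    (For readers unfamiliar with the reliability statistic mentioned above, the sketch below shows one standard way to compute Cronbach's α from a respondents-by-items score matrix. The NumPy-based function and the toy data are illustrative assumptions for exposition, not the instrument-specific analysis code used in the study.)

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of item scores."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy example: five respondents answering four likely/unlikely items scored 0/1.
scores = np.array([
    [1, 1, 1, 1],
    [1, 0, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 0, 1, 0],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```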

    Results

    We used results from the pilot to create the EMA final draft. After conducting quantitative validation, we deleted items that failed to meet performance criteria and revised items that exhibited borderline performance. The final version of the EMA consists of six core questions containing 25 items, and five supplemental questions containing 20 items.

    Conclusions and implications

    The EMA is a pedagogical tool supported by a wide range of validation evidence. Instructors can use it as a pre/post measure of student learning in an EvMed course to inform curriculum revision, or as a test bank to draw upon when developing in-class assessments, quizzes or exams.

     