Title: Analyzing Student Experiences and Career Pathways for Healthcare Student Volunteers Participating in a Disaster Response Drill: A Mixed-Methods Study
CTE Journal. Fall 2023, Vol. 11 Issue 2, p26-41.
Award ID(s):
2133391
PAR ID:
10487178
Author(s) / Creator(s):
Publisher / Repository:
CTE Journal
Date Published:
Journal Name:
CTE Journal
Volume:
11
Issue:
2
ISSN:
2327-0160
Page Range / eLocation ID:
26-41
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. We propose a novel knowledge distillation (KD) method that selectively instills teacher knowledge into a student model, motivated by situations where the student's capacity is significantly smaller than the teacher's. In vanilla KD, the teacher primarily sets a predictive target for the student to follow, and we posit that this target is overly optimistic given the student's limited capacity. We develop a novel scaffolding scheme in which the teacher, in addition to setting a predictive target, also scaffolds the student's prediction by censoring hard-to-learn examples. The student model uses the same information as the teacher's soft-max predictions as inputs, and in this sense our proposal can be viewed as a natural variant of vanilla KD. We show on synthetic examples that censoring hard examples smooths the student's loss landscape so that the student encounters fewer local minima and, as a result, generalizes well. Against vanilla KD, we achieve improved performance and are comparable to more intrusive techniques that leverage feature matching on benchmark datasets.
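The censoring idea described in this abstract can be sketched in a few lines. The following is a minimal illustration, not the paper's method: it assumes teacher softmax entropy as the difficulty criterion and a fixed censoring fraction, both of which are hypothetical choices made here for concreteness. The function name `censored_kd_loss` and all parameters are likewise illustrative.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def censored_kd_loss(student_logits, teacher_logits, T=2.0, censor_frac=0.25):
    """Vanilla soft-target KD loss, with the hardest examples censored.

    Difficulty is proxied by the entropy of the teacher's prediction
    (an assumption for this sketch); the top `censor_frac` hardest
    examples are dropped from the loss instead of being matched.
    """
    p_t = softmax(teacher_logits, T)  # teacher soft targets
    p_s = softmax(student_logits, T)  # student predictions
    # Per-example KL(teacher || student): the vanilla KD objective.
    kl = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1)
    # Teacher entropy as a stand-in difficulty score.
    ent = -(p_t * np.log(p_t + 1e-12)).sum(axis=-1)
    # Censor the hardest fraction: remove their contribution entirely.
    k = int(np.ceil(censor_frac * len(ent)))
    cutoff = np.sort(ent)[-k] if k > 0 else np.inf
    keep = ent < cutoff
    return kl[keep].mean() if keep.any() else 0.0
```

Dropping the censored examples (rather than down-weighting them) is what removes their contribution to the loss surface, which is the mechanism the abstract credits for the smoother landscape.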