
Award ID contains: 1841783


  1. Abstract In this article, we reflect on a decade of using the Kirkpatrick four-level model to evaluate a multifaceted evaluation capacity building (ECB) initiative. Traditionally used to assess business training efforts, the Kirkpatrick model encourages evidence to be gathered at four levels: reaction, learning, behavior, and results. We adapted these levels to fit the context and information needs of the EvaluATE project, an ECB initiative funded by the National Science Foundation. As members of the external evaluation and project teams, we describe throughout the article how each level was modified and translated into evaluation questions. Our adapted levels are implementation and reach, satisfaction, learning, application, and impact. Grounding our evaluation in these adapted levels challenged us to integrate multiple data sources to tell a comprehensive story that served the information needs of the project team and the funder. Overall, we found the Kirkpatrick model to be practical, accessible, and flexible, allowing us to capture the multidimensional aspects of the ECB initiative. However, there are opportunities to enhance the utility of the Kirkpatrick framework by integrating other evaluation approaches, such as culturally responsive and equitable evaluation and principles-focused evaluation.
  2. Workplace-based learning (WBL) provides participants with a valuable experiential learning opportunity to apply knowledge from the classroom in a real-world business or industry setting. Yet despite calls to invest in or expand WBL opportunities at two-year institutions, no standard language or definitions appear to exist. Literature and research on WBL in two-year institutions are scant, and what is available suggests a lack of a common lexicon but does not address why it persists. This mixed-methods study, using the Advanced Technological Education (ATE) program as its sample, addresses this gap and provides further insight into WBL language. The results confirm that the language used to define and describe different types of WBL lacks standardization; ATE projects use various terms for WBL opportunities, with no clear pattern of characteristics distinguishing among types. The choice of terms for particular types of WBL opportunities is driven not by the opportunities' goals and characteristics but by external factors. Responses to whether language in WBL matters also varied across the study population. The article concludes by reviewing the potential implications of these findings for research and practice and suggesting what can be done now to capture the impacts of workplace-based learning.
  3. Abstract Background Billions of dollars are spent annually on grant-funded STEM (science, technology, engineering, and mathematics) education programs. These programs help students stay on track toward STEM careers when standard educational practices do not adequately prepare them. It is important to know that reliable and accurate student participation and completion data are being collected about these programs. This multiple case study investigates how student data are collected and reported for a national STEM education program in the United States, the National Science Foundation (NSF) Advanced Technological Education (ATE) program. Our overall aim is to provide insights to funding agencies, STEM education faculty, and others interested in addressing issues related to the collection and reporting of student participation and completion data within their own contexts. Emphasis is placed on the barriers encountered in collecting participation and completion data, particularly unduplicated participation counts and marketable credential data. The ATE program was selected for this study because a mechanism (the ATE Survey) is already in place for annually collecting systematic data across all projects within the program. Results A multiple case study, including interviews with principal investigators, allowed for in-depth analysis of the ATE Survey's point-in-time data on project-level participation in various activities and identified the following barriers to tracking student-level data: lack of time and help to gather these data, lack of a consistent system for tracking students across different institutions, and a perceived lack of guidance from the funding agency about what data to track. We also found that different data are needed from different projects to determine a project's true impact; defining "success" the same way across all projects is inadequate.
Conclusions Although these findings cannot be generalized to the larger ATE population due to the limited sample size, they provide specific insights into the various barriers projects encounter in collecting participation and completion data.