-
This theory paper focuses on a research methodology, using an autoethnographic approach to reflect on the use of cognitive interviewing (CI) as a method for increasing the quality and validity of questionnaires in pre-validation design and development stages. We first provide a brief review of cognitive interviewing, sometimes called “cognitive think-aloud interviewing” or “think-aloud interviewing,” before presenting a summary of two studies conducted by the authors that used CI. Differences between these two studies are discussed as comparative cases, and advice is offered to scholars considering the use of CI in their own research. While this paper is not an explicit guide to conducting CI, we intend to provide advice, grounded in our experience with the method, for researchers who are unfamiliar with it. This paper is written with a particular focus on the use of CI in engineering education research (EER) but may be more broadly applicable to other social science domains.
-
Drawing from the results of this study and a review of the literature on graduate student stressors, in Year 2 we developed the Stressors for Doctoral Students Questionnaire for Engineering (SDSQ-E) and administered it twice: in fall 2022 and in spring 2023. The SDSQ-E measures the severity and frequency of stressors, including advisor-related stressors, class-taking stressors, research or laboratory stressors, campus life and financial stressors, and identity-related or microaggression-related stressors. We present a description of our project and updates on its progress in its second year, including survey results from our 2022-2023 data collection.
-
Peer review of grant proposals is critical to the National Science Foundation (NSF) funding process for STEM disciplinary and education research. Despite this, scholars receive little training in effective and constructive review of proposals beyond definitions of review criteria and an overview of strategies to avoid bias and communicate clearly. Senior researchers often find that their reviewing skills improve and develop over time, but variations in reviewer starting points can have a negative impact on the value of reviews for their intended audiences of program officers, who make funding recommendations, and principal investigators, who drive the research or want to improve their proposals. Building on the journal review component of the Engineering Education Research Peer Review Training (EER PERT) project, which is designed to develop EER scholars’ peer review skills through mentored reviewing experiences, this paper describes a program designed to provide professional development for proposal reviewing and provides initial evaluation results.
-
This research paper describes a study situated within a peer review mentoring program in which novice reviewers were paired with mentors who are former National Science Foundation (NSF) program directors with experience running discipline-based education research (DBER) panels. Whether for a manuscript or a grant proposal, the outcome of peer review can greatly influence academic careers and the impact of research on a field. Yet the criteria on which reviewers base their recommendations, and the processes they follow as they review, are poorly understood. Mentees reviewed three proposals previously submitted to the NSF and drafted pre-panel reviews addressing each proposal’s intellectual merit and broader impacts, as well as its strengths and weaknesses relative to solicitation-specific criteria. After participating in one mock review panel, mentees could revise their pre-panel evaluations based on the panel discussion. Using the lens of transformative learning theory, this study sought to answer the following research questions: 1) What are the tacit criteria used to inform recommendations for grant proposal reviews among scholars new to the review process? 2) To what extent do these tacit criteria and subsequent recommendations change after participation in a mock panel review? Using a single case study approach to explore one mock review panel, we conducted document analyses of six mentees’ reviews completed before and after their participation in the panel. Findings from this study suggest that reviewers primarily focus on the positive broader impacts proposed by a study and the level of detail in a submitted proposal. Although mentees made few changes to their reviews after the mock panel discussion, the changes that were made show that reviewers considered the broader impacts of the proposed studies more deeply.
These results can inform review panel practices as well as training approaches that support new reviewers in DBER fields.