


Title: Overlooked, Underlying: Understanding tacit criteria of proposal reviewing during a mock panel review
This study was situated within a peer review mentoring program in which novice reviewers were paired with mentors who are former National Science Foundation (NSF) program directors with experience running discipline-based education research (DBER) panels. Whether for a manuscript or a grant proposal, the outcome of peer review can greatly influence academic careers and the impact of research on a field. Yet the criteria upon which reviewers base their recommendations, and the processes they follow as they review, are poorly understood. Mentees reviewed three proposals previously submitted to the NSF and drafted pre-panel reviews addressing each proposal’s intellectual merit and broader impacts, as well as its strengths and weaknesses relative to solicitation-specific criteria. After participating in one mock review panel, mentees could revise their pre-panel evaluations based on the panel discussion. Using a lens of transformative learning theory, this study sought to answer the following research questions: 1) What are the tacit criteria used to inform recommendations for grant proposal reviews among scholars new to the review process? 2) To what extent are there changes in these tacit criteria and subsequent recommendations for grant proposal reviews after participation in a mock panel review? Using a single case study approach to explore one mock review panel, we conducted document analyses of six mentees’ reviews completed before and after their participation in the panel. Findings suggest that reviewers focus primarily on the positive broader impacts proposed by a study and the level of detail within a submitted proposal. Although mentees made few changes to their reviews after the mock panel discussion, the changes they did make illustrate that reviewers considered the broader impacts of the proposed studies more deeply. These results can inform review panel practices as well as approaches to training that support new reviewers in DBER fields.
Award ID(s):
2037807
NSF-PAR ID:
10472889
Publisher / Repository:
ASEE
Date Published:
Journal Name:
Proceedings ASEE annual conference
ISSN:
0190-1052
Subject(s) / Keyword(s):
["Peer Review","Transformative Learning Theory","Grant Proposal Review","Case Study"]
Format(s):
Medium: X
Location:
Baltimore, MD
Sponsoring Org:
National Science Foundation
More Like This
  1. You develop the prototype for a new learning strategy and want to test it in class or across institutions. You identify an NSF program that supports proposals for the idea, and then what? What goes through the minds of reviewers once a proposal is submitted? What prompts one proposal to be recommended for funding while another is declined? Close examination of the panel review process can inform proposal writing and ensure that reviewers will understand a PI’s idea, identify its merit, and value a PI’s vision of how the work will broaden participation in STEM education. This workshop steps through the NSF proposal review process from submission of a proposal to award or decline, touching on elements of a good review, NSF intellectual merit and broader impact criteria, elements of a good proposal, and volunteering to review proposals. Participants gain insight into writing a good review and improving one’s own proposal writing. The interactive workshop leads participants through each topic by introducing related issues, engaging participants in group exercises designed to explore and share their understanding of the issues, and providing “expert” opinion on these issues. Examples include funded and non-funded projects and a Top Ten List of Do’s and Don’ts. One night of lodging and workshop registration fees will be covered by an NSF grant for the first 25 participants who submit their own one-page proposal summary to the organizers one month prior to the workshop and participate fully in the workshop. For further information see - https://people.cs.clemson.edu/~etkraem/UPCSEd/
  2. Lam, Hon-Ming (Ed.)
    Peer review, commonly used in grant funding decisions, relies on scientists’ ability to evaluate the quality of research proposals. Such judgments are sometimes beyond reviewers’ discriminatory power and can lead to a reliance on subjective biases, including preferences for lower-risk, incremental projects. However, peer reviewers’ risk tolerance has not been well studied. We conducted a cross-sectional experiment of peer reviewers’ evaluations of mock primary reviewers’ comments in which the level and sources of risks and weaknesses were manipulated. Here we show that, in these mock proposal evaluations, proposal risks predicted reviewers’ scores more strongly than proposal strengths did. Risk tolerance was not predictive of scores, but reviewer scoring leniency was predictive of both overall and criteria scores. The evaluation of risks dominates reviewers’ assessment of research proposals and is a source of inter-reviewer variability. These results suggest that reviewer scoring variability may be attributable to the interpretation of proposal risks, and that interventions targeting this interpretation could improve the reliability of reviews. Additionally, because the valuation of risk drives proposal evaluations, it may reduce the chances that risky but highly impactful science is supported.
  3. You develop a plan for testing the prototype for a new learning strategy in your class or across institutions. How can you ensure that your plan is clearly understood by reviewers and the managing NSF program officer? What goes through a reviewer's mind once a proposal is submitted? What prompts one proposal to be recommended for funding while another is declined? Close examination of the panel review process can inform proposal writing and ensure that reviewers will understand an idea, identify its merit, and value a PI's vision of how the work will broaden participation in STEM education. This workshop steps through the NSF proposal review process from submission of a proposal to award or decline, touching on NSF intellectual merit and broader impact criteria and on mapping the project pipeline to appropriate evaluation. Participants gain insight into writing a good review and improving one's own proposal writing. For further information and travel support see: https://people.cs.clemson.edu/~etkraem/UPCSEd/. Laptops recommended.
  4. This is the first of a series of studies that explore the relationship between disciplinary background and the weighting of various elements of a manuscript in peer reviewers’ determination of publication recommendations. Research questions include: (1) To what extent are tacit criteria for determining quality or value of EER manuscripts influenced by reviewers’ varied disciplinary backgrounds and levels of expertise? and (2) To what extent does mentored peer review professional development influence reviewers’ EER manuscript evaluations? Data were collected from 27 mentors and mentees in a peer review professional development program. Participants reviewed the same two manuscripts, using a form to identify strengths, weaknesses, and recommendations. Responses were coded by two researchers (70% IRR). Our findings suggest that disciplinary background influences reviewers’ evaluation of EER manuscripts. We also found evidence that professional development can improve reviewers’ understanding of EER disciplinary conventions. Deeper understanding of the epistemological basis for manuscript reviews may reveal ways to strengthen professional preparation in engineering education as well as other disciplines. 