Measuring the level of institutional capacity for grantsmanship within higher education informs administrators about the needs of their organization and where resources and institutional supports can be directed to assist faculty and staff. Receiving grant funding can lead to cutting-edge programming and research support, which can improve the quality of education provided and, ultimately, student retention. While conducting an institutional capacity needs assessment is crucial for making data-informed decisions, there is a significant gap in institutional capacity research: no valid and reliable assessment tool exists to measure institutional capacity for grantsmanship. The present study aims to develop an assessment tool that higher education institutions can use to evaluate support systems and identify the needs of their faculty and administrators for grant writing efforts. We used a mixed-methods approach over three phases to identify the indicators behind measuring institutional capacity for grantsmanship and developed seven reliable scales: promoting grant proposal writing, proposal writing (for faculty), proposal writing (for administrators), proposal writing (all respondents), submitting grant proposals, implementing grant activities, and managing awards. This study contributes to our understanding of institutional capacity and produced a reliable assessment tool to support grantsmanship.
                            Biosciences Proposal Bootcamp: Structured peer and faculty feedback improves trainees’ proposals and grantsmanship self-efficacy
                        
                    
    
Grant writing is an essential skill to develop for academic and other career success, but providing individual feedback to large numbers of trainees is challenging. In 2014, we launched the Stanford Biosciences Grant Writing Academy to support graduate students and postdocs in writing research proposals. Its core program is a multi-week Proposal Bootcamp designed to increase the feedback writers receive as they develop and refine their proposals. The Proposal Bootcamp consisted of two-hour weekly meetings that included mini lectures and peer review. Bootcamp participants also attended faculty review workshops to obtain faculty feedback. Postdoctoral trainees were trained and hired as course teaching assistants and facilitated the weekly meetings and review workshops. Over the last six years, the annual Bootcamp has provided 525 doctoral students and postdocs with multi-level feedback (peer and faculty). Proposals from Bootcamp participants were almost twice as likely to be funded as proposals from non-Bootcamp trainees. Overall, this structured program provided opportunities for feedback from multiple peer and faculty reviewers and increased participants' confidence in developing and submitting research proposals, while accommodating a large number of participants.
- Award ID(s): 1714723
- PAR ID: 10281078
- Editor(s): Cameron, Carrie
- Date Published:
- Journal Name: PLOS ONE
- Volume: 15
- Issue: 12
- ISSN: 1932-6203
- Page Range / eLocation ID: e0243973
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- Writing winning proposals for research funding is an essential skill for doctoral students in the social sciences, yet most anthropology programs lack formal instruction in proposal writing and rely instead on informal mentorship. To address this gap, we evaluated the Value Proposition framework for teaching anthropology Ph.D. students to write proposals. Feedback from students and faculty in the NSF-funded Cultural Anthropology Methods Program (CAMP) offers insights into using this framework to bridge the proposal-writing gap in the training of cultural anthropologists.
- Peer review of grant proposals is critical to the National Science Foundation (NSF) funding process for STEM disciplinary and education research. Despite this, scholars receive little training in effective and constructive proposal review beyond definitions of review criteria and an overview of strategies for avoiding bias and communicating clearly. Senior researchers often find that their reviewing skills develop over time, but variation in reviewers' starting points can reduce the value of reviews for their intended audiences: program officers, who make funding recommendations, and principal investigators, who drive the research or want to improve their proposals. Building on the journal review component of the Engineering Education Research Peer Review Training (EER PERT) project, which develops EER scholars' peer review skills through mentored reviewing experiences, this paper describes a program that provides professional development for proposal reviewing and presents initial evaluation results.
- You develop the prototype for a new learning strategy and want to test it in class or across institutions. You identify an NSF program that supports proposals for the idea, and then what? What goes through the minds of reviewers once a proposal is submitted? What prompts one proposal to be recommended for funding while another is declined? Close examination of the panel review process can inform proposal writing and ensure that reviewers will understand a PI's idea, identify its merit, and value a PI's vision of how the work will broaden participation in STEM education. This workshop steps through the NSF proposal review process from submission to award or decline, touching on elements of a good review, NSF intellectual merit and broader impacts criteria, elements of a good proposal, and volunteering to review proposals. Participants gain insight into writing a good review and into improving their own proposal writing. The interactive workshop leads participants through each topic by introducing related issues, engaging participants in group exercises designed to explore and share their understanding of those issues, and providing "expert" opinion on them. Examples include funded and non-funded projects and a Top Ten List of Do's and Don'ts. One night of lodging and the workshop registration fee will be covered by an NSF grant for the first 25 participants who submit their own one-page proposal summary to the organizers one month prior to the workshop and participate fully in the workshop. For further information, see https://people.cs.clemson.edu/~etkraem/UPCSEd/