Narratives and Evaluation: How to Write Competitive NSF CS Education Proposals

You develop the prototype for a new learning strategy and want to test it in your class or across institutions. You identify an NSF program that supports proposals for the idea, and then what? How can you ensure that your plan is clearly understood by the reviewers and the managing NSF program officer? What goes through reviewers' minds once a proposal is submitted? What prompts one proposal to be recommended for funding while another is declined? Close examination of the panel review process can inform proposal writing and ensure that reviewers will understand a PI's idea, identify its merit, and value the PI's vision of how the work will broaden participation in STEM education. This workshop steps through the NSF proposal review process from submission to award or decline, touching on the elements of a good review, the NSF intellectual merit and broader impacts criteria, the elements of a good proposal, mapping the project pipeline to appropriate evaluation, and volunteering to review proposals. Participants gain insight into writing a good review and into improving their own proposal writing. The interactive workshop leads participants through each topic by introducing related issues, engaging participants in group exercises designed to explore and share their understanding of those issues, and providing "expert" opinion on them. Examples include funded and non-funded projects and a Top Ten List of Do's and Don'ts. One night of lodging and workshop registration fees will be covered by an NSF grant for the first 25 participants who submit a one-page summary of their own proposal to the organizers one month before the workshop and participate fully in it. Laptops are recommended. For further information and travel support, see https://people.cs.clemson.edu/~etkraem/UPCSEd/
- Award ID(s): 1646691
- PAR ID: 10125031
- Date Published: 2019
- Journal Name: ACM SIGCSE '19 Proceedings of the 50th ACM Technical Symposium on Computer Science Education
- Page Range / eLocation ID: 1234 to 1235
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
- This research paper describes a study situated within a peer review mentoring program in which novice reviewers were paired with mentors who are former National Science Foundation (NSF) program directors experienced in running discipline-based education research (DBER) panels. Whether for a manuscript or a grant proposal, the outcome of peer review can greatly influence academic careers and the impact of research on a field, yet the criteria on which reviewers base their recommendations, and the processes they follow as they review, are poorly understood. Mentees reviewed three proposals previously submitted to the NSF and drafted pre-panel reviews addressing the proposals' intellectual merit and broader impacts, and their strengths and weaknesses relative to solicitation-specific criteria. After participating in one mock review panel, mentees could revise their pre-panel evaluations based on the panel discussion. Using a lens of transformative learning theory, this study sought to answer the following research questions: 1) What are the tacit criteria used to inform recommendations for grant proposal reviews among scholars new to the review process? 2) To what extent do these tacit criteria and subsequent recommendations change after participation in a mock panel review? Using a single case study approach to explore one mock review panel, we conducted document analyses of six mentees' reviews completed before and after their participation in the panel. Findings suggest that reviewers primarily focus on the positive broader impacts proposed by a study and the level of detail within a submitted proposal. Although mentees made few changes to their reviews after the mock panel discussion, the changes that were made show that reviewers considered the broader impacts of the proposed studies more deeply. These results can inform review panel practices as well as training approaches that support new reviewers in DBER fields.
- As proposal, subaward, award, and agreement volumes continue to grow at your institution, how can you strategize to support the size of your research infrastructure? How can you justify expanding your teams and talent? How do you adjust roles and workloads to align with the growing portfolio? What technology considerations should you make to track all of this? This session covers how to leverage data to make effective business decisions about resource needs and allocation methodology to meet growing demands, starting with your internal data and then looking externally. We cover strategies for using data analytics to efficiently manage the size of your research enterprise and portfolios, as well as to measure the performance of pre-award and post-award functions across the grant lifecycle. Participants will learn how to analyze portfolios beyond sheer volume, examining the various criteria that can be objectively evaluated to determine how to balance a sponsored programs portfolio, along the lines of the sketch below. We review common reporting and analytics tools and provide examples of critical data points to consider at both the central office and the department level. Presented at the 2024 Research Analytics Summit in Albuquerque, NM.
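As a concrete illustration of the kind of portfolio metrics the session describes, here is a minimal sketch in Python/pandas; the session itself is tool-agnostic, and every table, value, and column name below is a hypothetical placeholder, not data from any real grants system.

```python
# Minimal sketch of portfolio analytics beyond sheer volume.
# All data and column names are hypothetical placeholders.
import pandas as pd

# Hypothetical export of proposal records from a grants system.
proposals = pd.DataFrame({
    "dept":      ["Biology", "Biology", "CS", "CS", "Physics"],
    "status":    ["Awarded", "Declined", "Awarded", "Pending", "Awarded"],
    "amount":    [450_000, 120_000, 900_000, 300_000, 200_000],
    "submitted": pd.to_datetime(["2024-01-10", "2024-02-01", "2024-01-20",
                                 "2024-03-05", "2024-02-15"]),
    "decided":   pd.to_datetime(["2024-05-01", "2024-04-15", "2024-06-10",
                                 pd.NaT, "2024-05-20"]),
})

# Per-department counts, requested dollars, success rate, and median
# time from submission to decision (pending proposals are skipped).
summary = proposals.groupby("dept").agg(
    n_proposals=("status", "size"),
    total_requested=("amount", "sum"),
    success_rate=("status", lambda s: (s == "Awarded").mean()),
    median_days_to_decision=(
        "submitted",
        lambda s: (proposals.loc[s.index, "decided"] - s).dt.days.median(),
    ),
)
print(summary)
```

Metrics like these, rather than raw proposal counts alone, are what allow workload and staffing decisions to be argued from evidence.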
- Discover how to transform standard research award/proposal reports into dynamic visualizations of your research collaborations using Microsoft Power BI and custom visuals. This session guides you through creating interactive networks that visualize your research community. While we touch on network analysis, detailed statistical discussion is beyond the scope of the session. Participants are encouraged to bring a laptop and their own data, although dummy data will be available. We cover loading and transforming data in Microsoft Power BI (the core reshaping step is sketched below), employing custom visuals for network generation, and exploring practical applications of research networks. Please download Microsoft Power BI Desktop beforehand if you plan to follow along; data can be provided, or use your own. Presented at the 2024 Research Analytics Summit in Albuquerque, NM.
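To make the data-transformation step concrete, here is a minimal sketch of turning a flat award report into the weighted edge list a network visual consumes. The session does this in Power BI's Power Query; the same reshaping is shown in Python/pandas here so the example is self-contained, and every column and investigator name is hypothetical.

```python
# Minimal sketch: reshape a flat award report into the node pairs
# (edges) that a network visual expects. All names are hypothetical.
from itertools import combinations

import pandas as pd

# Hypothetical award report: one row per award, co-investigators
# packed into a single delimited cell.
awards = pd.DataFrame({
    "award_id":      ["A1", "A2", "A3"],
    "investigators": ["Lee; Patel", "Patel; Gomez; Chen", "Lee; Chen"],
})

# Split the multi-valued cell, then pair up co-investigators per award.
pairs = (
    awards.assign(pi=awards["investigators"].str.split("; "))
          .explode("pi")
          .groupby("award_id")["pi"]
          .apply(lambda g: list(combinations(sorted(g), 2)))
          .explode()
          .dropna()          # awards with a single PI yield no edges
)
edge_list = pd.DataFrame(pairs.tolist(), columns=["source", "target"])

# Collapse repeat collaborations into edge weights for the visual.
weighted = edge_list.value_counts().reset_index(name="weight")
print(weighted)
```

The resulting source/target/weight table is exactly the shape that network custom visuals (in Power BI or elsewhere) typically accept.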