Title: Narratives and Evaluation: How to Write Competitive NSF CS Education Proposals
You develop a plan for testing a prototype of a new learning strategy in your class or across institutions. How can you ensure that your plan is clearly understood by reviewers and the managing NSF program officer? What goes through a reviewer's mind once a proposal is submitted? What prompts one proposal to be recommended for funding while another is declined? Close examination of the panel review process can inform proposal writing and ensure that reviewers will understand an idea, identify its merit, and value a PI's vision of how the work will broaden participation in STEM education. This workshop steps through the NSF proposal review process from submission of a proposal to award or decline, touching on the NSF intellectual merit and broader impacts criteria and mapping the project pipeline to appropriate evaluation. Participants gain insight into writing a good review and improving their own proposal writing. For further information and travel support, see: https://people.cs.clemson.edu/~etkraem/UPCSEd/. Laptops recommended.
Award ID(s):
1646691
NSF-PAR ID:
10125031
Date Published:
2019
Journal Name:
ACM SIGCSE '19 Proceedings of the 50th ACM Technical Symposium on Computer Science Education
Page Range / eLocation ID:
1234 to 1235
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. You develop the prototype for a new learning strategy and want to test it in your class or across institutions. You identify an NSF program that supports proposals for the idea, and then what? What goes through the minds of reviewers once a proposal is submitted? What prompts one proposal to be recommended for funding while another is declined? Close examination of the panel review process can inform proposal writing and ensure that reviewers will understand a PI's idea, identify its merit, and value a PI's vision of how the work will broaden participation in STEM education. This workshop steps through the NSF proposal review process from submission of a proposal to award or decline, touching on elements of a good review, the NSF intellectual merit and broader impacts criteria, elements of a good proposal, and volunteering to review proposals. Participants gain insight into writing a good review and improving their own proposal writing. The interactive workshop leads participants through each topic by introducing related issues, engaging participants in group exercises designed to explore and share their understanding of those issues, and providing "expert" opinion on them. Examples include funded and non-funded projects and a Top Ten List of Do's and Don'ts. One night of lodging and workshop registration fees will be covered by an NSF grant for the first 25 participants who submit their own one-page proposal summary to the organizers one month prior to the workshop and participate fully in the workshop. For further information see: https://people.cs.clemson.edu/~etkraem/UPCSEd/
  2. This research paper describes a study situated within a peer review mentoring program in which novice reviewers were paired with mentors who are former National Science Foundation (NSF) program directors with experience running discipline-based education research (DBER) panels. Whether for a manuscript or a grant proposal, the outcome of peer review can greatly influence academic careers and the impact of research on a field. Yet the criteria upon which reviewers base their recommendations, and the processes they follow as they review, are poorly understood. Mentees reviewed three proposals previously submitted to the NSF and drafted pre-panel reviews addressing the proposals' intellectual merit and broader impacts, as well as their strengths and weaknesses relative to solicitation-specific criteria. After participating in one mock review panel, mentees could revise their pre-panel reviews based on the panel discussion. Using a lens of transformative learning theory, this study sought to answer the following research questions: 1) What are the tacit criteria used to inform recommendations for grant proposal reviews among scholars new to the review process? 2) To what extent do these tacit criteria and subsequent recommendations change after participation in a mock panel review? Using a single case study approach to explore one mock review panel, we conducted document analyses of six mentees' reviews completed before and after their participation in the panel. Findings from this study suggest that reviewers primarily focus on the positive broader impacts proposed by a study and the level of detail within a submitted proposal. Although mentees made few changes to their reviews after the mock panel discussion, the changes that were made show that reviewers considered the broader impacts of the proposed studies more deeply. These results can inform review panel practices as well as training approaches to support new reviewers in DBER fields.
  3. Anti-virus software is designed to keep your computer and other devices safe from viruses and other malware. Popular products that many people use include Norton, Kaspersky, and Avira. If your device does get a virus, the software is supposed to isolate the infected file to prevent it from infecting the rest of your device. To be protected to the fullest extent, you must keep the anti-virus software up to date. But while the anti-virus software is being updated, is it still keeping you safe? Many new viruses are created every day, even every hour, so even freshly updated anti-virus software can leave you at risk, because the newest viruses may not be covered by the update. Anti-virus vendors all claim to be the best frontline protectors; this paper examines which anti-virus software gives you the best protection from viruses.
  4. This dataset includes anonymized interview data collected in the Yukon-Kuskokwim Delta in August 2022. The interviews were designed to capture end-users' perceptions of and experiences with their water infrastructure systems. Interview questions included, for example: Can you tell me how you use water in your household? What do you like about your water or water system? What are some of the concerns or challenges you deal with in your household water system? Do you worry about whether your water is safe to drink? Ten semi-structured interviews with 12 end-users are included, all conducted in person from August 2nd to August 8th, 2022. Interviews were recorded (with permission), transcribed, checked for quality, and anonymized.