Title: Narratives and Evaluation: How to Write Competitive NSF CS Education Proposals
You develop a plan for testing the prototype for a new learning strategy in your class or across institutions. How can you ensure that your plan is clearly understood by reviewers and the managing NSF program officer? What goes through the reviewer's mind once a proposal is submitted? What prompts one proposal to be recommended for funding but another to be declined? Close examination of the panel review process can inform proposal writing and ensure that reviewers will understand an idea, identify its merit, and value a PI's vision of how the work will broaden participation in STEM education. This workshop steps through the NSF proposal review process from submission of a proposal to award or decline, touching on NSF intellectual merit and broader impact criteria and mapping the project pipeline to appropriate evaluation. Participants gain insight into writing a good review and improving one's own proposal writing. For further information and travel support, see: https://people.cs.clemson.edu/~etkraem/UPCSEd/. Laptops recommended.
Award ID(s):
1646691
NSF-PAR ID:
10125031
Author(s) / Creator(s):
Date Published:
Journal Name:
ACM SIGCSE '19 Proceedings of the 50th ACM Technical Symposium on Computer Science Education
Page Range / eLocation ID:
1234 to 1235
Format(s):
Medium: X
Sponsoring Org:
National Science Foundation
More Like this
  1. You develop the prototype for a new learning strategy and want to test it in class or across institutions. You identify an NSF program that supports proposals for the idea, and then what? What goes through the minds of reviewers once a proposal is submitted? What prompts one proposal to be recommended for funding while another is declined? Close examination of the panel review process can inform proposal writing and ensure that reviewers will understand a PI’s idea, identify its merit, and value a PI’s vision of how the work will broaden participation in STEM education. This workshop steps through the NSF proposal review process from submission of a proposal to award or decline, touching on elements of a good review, NSF intellectual merit and broader impact criteria, elements of a good proposal, and volunteering to review proposals. Participants gain insight into writing a good review and improving one’s own proposal writing. The interactive workshop leads participants through each topic by introducing related issues, engaging participants in group exercises designed to explore and share their understanding of the issues, and providing “expert” opinion on these issues. Examples include funded and non-funded projects and a Top Ten List of Do’s and Don’ts. One night of lodging and workshop registration fees will be covered by an NSF grant for the first 25 participants who submit their own one-page proposal summary to the organizers one month prior to the workshop and participate fully in the workshop. For further information, see: https://people.cs.clemson.edu/~etkraem/UPCSEd/
  2. This study was situated within a peer review mentoring program in which novice reviewers were paired with mentors who are former National Science Foundation (NSF) program directors with experience running discipline-based education research (DBER) panels. Whether for a manuscript or a grant proposal, the outcome of peer review can greatly influence academic careers and the impact of research on a field. Yet the criteria upon which reviewers base their recommendations, and the processes they follow as they review, are poorly understood. Mentees reviewed three proposals previously submitted to the NSF and drafted pre-panel reviews addressing the proposals’ intellectual merit and broader impacts, as well as their strengths and weaknesses relative to solicitation-specific criteria. After participation in one mock review panel, mentees could revise their pre-panel evaluations based on the panel discussion. Using a lens of transformative learning theory, this study sought to answer the following research questions: 1) What are the tacit criteria used to inform recommendations for grant proposal reviews among scholars new to the review process? 2) To what extent do these tacit criteria and subsequent recommendations change after participation in a mock panel review? Using a single case study approach to explore one mock review panel, we conducted document analyses of six mentees’ reviews completed before and after their participation in the mock review panel. Findings from this study suggest that reviewers primarily focus on the positive broader impacts proposed by a study and the level of detail within a submitted proposal. Although mentees made few changes to their reviews after the mock panel discussion, the changes that were present illustrate that reviewers more deeply considered the broader impacts of the proposed studies. 
These results can inform review panel practices as well as approaches to training to support new reviewers in DBER fields. 
  3. The Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) program, managed by the U.S. National Science Foundation (NSF), provides grants to institutions of higher education to disburse scholarships to low-income, high-achieving domestic students enrolled in a STEM major. Despite the crucial role that two-year colleges (2YCs) play in providing open-access, affordable education to a diverse student population, the majority of NSF S-STEM scholarships are awarded to four-year institutions, which tend to have specialized personnel working on the preparation and submission of proposals. In this paper, we report a summary of the activities and evaluation of "Capacity Building Workshops for Competitive S-STEM Proposals from Two-Year Colleges in the Western U.S.", funded by the NSF S-STEM program and aiming to facilitate submissions to the NSF S-STEM program from 2YCs. The workshop was offered in 2019 (in person) and in 2020 and 2021 (virtual); it initially supported 2YCs in the Western region of the US and was expanded nationwide in 2020. During the two-day workshop, several aspects of proposal submission were reviewed, in particular the two NSF Merit Review Criteria of Intellectual Merit and Broader Impacts. Pre- and post-workshop support was also available via virtual office hours and webinars that addressed specific elements required in S-STEM proposals. The evaluation of the workshop was performed via a post-workshop survey administered through Qualtrics™. A journal paper reporting on the evaluation of all three offerings of the workshop has been submitted and is currently under review. In this paper, we reflect on the successful features of this workshop series and the lessons learned throughout the three offerings. Over three years (2019, 2020, and 2021), the program supported 103 participants on 51 teams from 2YCs. 
The program assisted at least 31 2YCs in submitting their S-STEM proposals to NSF, and 12 of these 2YCs received S-STEM grants. An additional 2YC proposal was first recommended for an award, but the proposal was subsequently declined for reasons unconnected to the content of the proposal itself. The three-year funding rate is 39%; if the above-mentioned proposal that received an award recommendation but was then declined is taken into account, the award rate is 42%. 
  4. Engineering education is increasingly looking to the liberal arts to broaden and diversify the preparation of students for professional careers. The present study involves an elective graduate environmental engineering course that incorporated the arts and humanities. The goal of the course was to develop engineers and technical professionals who would become both more appreciative of and better equipped to address technical, ethical, social, and cultural challenges in engineering through the development of critical and reflective thinking skills and reflective practice in their professional work. A reflective writing assignment was submitted by students following each of fourteen course topics in response to the following question: Reflect on how you might want to apply what you learned to your development as a professional and/or to your daily life. Student responses were classified by human coders using qualitative text-analytic methods, and a simple machine classifier was then trained to reproduce those classifications. The goal of this analysis was to identify and quantify students’ reflections on prospective behaviors that emerged through participation in the course. The analysis indicated that the primary focus of students’ responses was self-improvement, with additional themes involving reflection, teamwork, and improving the world. The results provide a glimpse into how broadening and diversifying the curriculum might shape students’ thinking in directions that are more considerate of their contributions to their profession and society. In the discussion, we consider the findings from the human and machine assessments and suggest how incorporating AI machine methods into engineering provides new possibilities for engineering pedagogy. 
  5. This is a subset of the data found in Grove and Locke (2018), to be included with: Locke, D.H., Polsky, C., Grove, J.M., Groffman, P.M., Nelson, K.C., Larson, K.L., Cavender-Bares, J., Heffernan, J.B., Roy Chowdhury, R., Hobbie, S.E., Bettez, N., Neill, C., Ogden, L.A., O’Neil-Dunne, J.P.M. [accepted]. Heterogeneity of practice underlies the homogeneity of ecological outcomes of United States yard care in metropolitan regions, neighborhoods and households. PLoS ONE doi:10.1371/journal.pone.0222630. These data contain answers to 2011 survey questions: In the past year, which of the following has been applied to any part of your yard: Water for irrigating grass, plants, or trees? Fertilizers? Pesticides to get rid of weeds or pests? Also included are the total household annual income (8 ordinal categories), age of respondent (5 ordinal categories), and the answer to: About how many neighbors do you know by name? (recorded in 5 ordinal categories). Two additional columns indicate the metropolitan region of the respondent (one of the following six: Phoenix, Los Angeles, Minneapolis - St. Paul, Baltimore, Boston, or Miami) and the degree of urbanicity in that region (Urban, Suburban, or Exurban). See Grove and Locke 2018 for additional details. This research is supported by the Macro-Systems Biology Program (US NSF) under Grants EF-1065548, -1065737, -1065740, -1065741, -1065772, -1065785, -1065831, and -121238320 and the NIFA McIntire-Stennis 1000343 MIN-42-051. The work arose from research funded by grants from the NSF LTER program for Baltimore (DEB-0423476, DEB-1027188); Phoenix (BCS-1026865, DEB-0423704, DEB-9714833, DEB-1637590, DEB-1832016); Plum Island, Boston (OCE-1058747 and 1238212); Cedar Creek, Minneapolis–St. Paul (DEB-0620652); and Florida Coastal Everglades, Miami (DBI-0620409). The Edna Bailey Sussman Foundation, the Libby Fund Enhancement Award and the Marion I. 
Wright ‘46 Travel Grant at Clark University, The Warnock Foundation, the USDA Forest Service Northern Research Station, Baltimore and Philadelphia Field Stations, and the DC-BC ULTRA-Ex NSF-DEB-0948947 also provided support. This work was supported by the National Socio-Environmental Synthesis Center (SESYNC) under funding received from the National Science Foundation DBI-1052875. Anonymous reviewers supplied constructive feedback that helped to improve this paper. The findings and opinions reported here do not necessarily reflect those of the funders of this research. Citations: Grove, J.M. and Locke, D.H. (2018). BES Household Telephone Survey. Environmental Data Initiative. https://doi.org/10.6073/pasta/5a4fc7bfa199f3d63748f0853ae073a0. 