Title: Changing the Paradigm: Developing a Framework for Secondary Analysis of EER Qualitative Datasets
This paper reports on a project funded through the Engineering Education and Centers (EEC) Division of the National Science Foundation. Since 2010, EEC has funded more than 500 proposals totaling over $150 million through engineering education research (EER) programs such as Research in Engineering Education (REE) and Research in the Formation of Engineers (RFE), to enhance understanding and improve practice. The resulting archive of robust qualitative and quantitative data represents a vast untapped potential to dramatically increase the impact of EEC funding and transform engineering education. But tapping this potential has thus far been an intractable problem, despite ongoing calls for data sharing by public funders of research. Changing the paradigm of single-use data collection requires actionable, proven practices for effective, ethical data sharing, coupled with sufficient incentives to both share and use existing data. To that end, this project draws together a team of experts to overcome substantial obstacles in qualitative data sharing by building a framework to guide secondary analysis in EER and to test this framework using pioneering datasets. Herein, we report on accomplishments within the first year of the project, during which we gathered a group of 13 expert qualitative researchers to engage in the first of a series of working meetings intended to meet our project goals. We came into this first workshop with a potentially limiting definition of secondary data analysis and the idea that people would want to share existing datasets if we could find ways around anticipated hurdles. However, the workshop yielded a broader definition of secondary data analysis and revealed a stronger interest in creating new datasets designed for sharing rather than sharing existing datasets. Thus, we have reconceived our second phase as a cohesive effort, based on an inclusive "open cohort model," to pilot projects related to secondary data analysis.
Award ID(s):
2039871
Publication Date:
NSF-PAR ID:
10392042
Journal Name:
2022 ASEE Annual Conference & Exposition
Sponsoring Org:
National Science Foundation
More Like this
  1. Our work with teams funded through the National Science Foundation REvolutionizing Engineering and Computer Science Departments (RED) program began in 2015. Our project, funded first by an NSF EAGER grant and then by an NSF RFE grant, focuses on understanding how the RED teams make change on their campuses and how this information about change can be captured and communicated to other STEM programs that seek to make change happen. Because our RED Participatory Action Research (REDPAR) Project is a collaboration between researchers (Center for Evaluation & Research for STEM Equity at the University of Washington) and practitioners (Making Academic Change Happen Workshop at Rose-Hulman Institute of Technology), we have challenged ourselves to develop means of communication that allow both aspects of the work, research and practice, to be treated equitably. As a result, we have created a new dissemination channel: the RED Participatory Action Project Tipsheet. The tipsheet format accomplishes several important goals. First, the content is drawn from both the research conducted with the RED teams and the practitioners' work with the teams. Each tipsheet takes up a single theme and grounds the theme in the research literature while offering practical tips for applying the information. Second, the format is accessible to a wide spectrum of potential users, remaining free of jargon and applicable to multiple program and departmental contexts. Third, by publishing the tipsheets ourselves, rather than submitting them to an engineering education research journal, we make the information timely and freely available. We can produce a tipsheet as soon as a theme emerges from the intersection of research data and observations of practice. During the poster session at ASEE 2019, we will share the three REDPAR Tipsheets produced thus far: Creating Strategic Partnerships, Communicating Change, and Shared Vision. We will also work with attendees to demonstrate how the tipsheet content is adaptable to the attendees' specific academic context. Our goal for the poster session is to provide attendees with tipsheet resources that are useful to their specific change project.
  2. This WIP presentation is intended to share and gather feedback on the development of an observation protocol for K-12 integrated STEM instruction, the STEM-OP. Specifically, the STEM-OP is being developed for use in K-12 science and/or engineering settings where integrated STEM instruction takes place. While the importance of integrated STEM education is established through national policy documents, there remains disagreement on models and effective approaches for integrated STEM instruction. Our broad definition of integrated STEM includes the use of two or more STEM disciplines to solve a real-world problem or design challenge that supports student development of 21st-century skills. This issue is compounded by the lack of observation protocols sensitive to integrated STEM teaching and learning that can be used to inform research on the effectiveness of new models and strategies. Existing instruments most commonly used by researchers, such as the Reformed Teaching Observation Protocol (RTOP), were designed prior to the development of the Next Generation Science Standards and the integration of engineering into science standards. These instruments were also designed for use in reform-based science classrooms, not engineering or integrated STEM learning environments. While engineering-focused observation protocols do exist for K-12 classrooms, they do not evaluate beyond an engineering focus, making them limited tools for evaluating integrated STEM instruction. In order to facilitate the implementation of integrated STEM in K-12 classrooms and the development of the nascent integrated STEM education literature, our research team is developing a new integrated STEM observation protocol for use in K-12 science and engineering classrooms. This instrument will be designed to be valid and reliable, usable in a variety of educational contexts, and accessible to different education stakeholders, in order to increase the quality of K-12 STEM education. At the end of this project, the STEM-OP will be made available through an online platform that will include an embedded training program to facilitate its broad use. In the first year of this four-year project, we are working on the initial development of the STEM-OP through video analysis and exploratory factor analysis. We are utilizing existing classroom video from a previous project, with approximately 2,000 unique classroom videos representing a variety of grade levels (4-9), science content (life, earth, and physical science), engineering design challenges, and school demographics (urban, suburban). The development of the STEM-OP is guided by published frameworks that focus on providing quality K-12 integrated STEM and engineering education, such as the Framework for Quality K-12 Engineering Education. Our anticipated results at the time of the ASEE meeting will include a review of our item development process and the finalized items included on the draft STEM-OP. Additionally, we anticipate being able to share findings from the exploratory factor analysis (EFA) on our video-coded data, which will identify distinct instructional dimensions responsible for integrated STEM instruction (a brief, hypothetical sketch of this kind of analysis appears after this list). We value the opportunity to gather feedback from the engineering education community, as the integration of engineering design and practices is integral to quality integrated STEM instruction.
  3. Our NSF-funded project, Creating National Leadership Cohorts to Make Academic Change Happen (NSF 1649318), represents a strategic partnership between researchers and practitioners in the domain of academic change. The principal investigators from the Making Academic Change Happen team from Rose-Hulman Institute of Technology provide familiarity with the literature of practical organizational change and package this into action-oriented workshops and ongoing support for teams funded through the REvolutionizing engineering and computer science Departments (RED) program. The PIs from the Center for Evaluation & Research for STEM Equity at the University of Washington provide expertise in social science research in order to investigate how the RED teams' change projects unfold and how the teams develop as members of national leadership cohorts for change in engineering and computer science education. Our poster for ASEE 2018 will focus on what we have learned thus far regarding the dynamics of the researcher/practitioner partnership through the RED Participatory Action Research (REDPAR) Project. According to Worrall (2007), good partnerships are "founded on trust, respect, mutual benefit, good communities, and governance structures that allow democratic decision-making, process improvement, and resource sharing." We have seen these elements emerge through the work of the partnership to create mutual benefits. For example, the researchers have been given an "insider's" perspective on the practitioners' approach: their goals, motivations for certain activities, and background information and research. The practitioners' perspective is useful for the researchers to learn, since the practitioners' familiarity with the organizational change literature has influenced the researchers' questions and theoretical models. The practitioners' work with the RED teams has provided insights on the teams, how they are operating, the challenges they face, and aspects of the teams' work that may not be readily available to the researchers. As a result, the researchers have had increased access to the teams to collect data. The researchers, in turn, have been able to consider how to make their analyses useful and actionable for change-makers, the population that the practitioners are more familiar with. Insights from the researchers provide both immediate and long-term benefits to programming and increased professional impact. The researchers are trained observers, each of whom brings a unique disciplinary perspective to their observations. The richness, depth, and clarity of their observations add immeasurably to the quality of the practitioners' interactions with the RED teams. The practitioners, for example, have revised workshop content in response to the researchers' observations, thus ensuring that the workshop content serves the needs of the RED teams. The practitioners also benefit from the joint effort on dissemination, since they can contribute to a variety of dissemination efforts (journal papers, conference presentations, workshops). We plan to share specific examples of the strategic partnership during the poster session. In doing so, we hope to encourage researchers to seek out partnerships with practitioners in order to bridge the gap between theory and practice in engineering and computer science education.
  4. National Science Foundation (NSF) funded Engineering Research Centers (ERCs) must complement their technical research with various education and outreach opportunities to: 1) improve and promote engineering education, both within the center and in the local community; 2) encourage underrepresented populations to participate in engineering activities; and 3) advocate communication and collaboration between industry and academia. ERCs ought to perform adequate evaluation of their educational and outreach programs to ensure that beneficial goals are met. Each ERC has complete autonomy in conducting and reporting such evaluation. Evaluation tools used by individual ERCs are quite similar, but each ERC has designed its evaluation processes in isolation, including evaluation tools such as survey instruments, interview protocols, focus group protocols, and/or observation protocols. These isolated efforts have resulted in redundant spending and a lack of outcome comparability across ERCs. Leaders from three different ERCs initiated and led a collaborative effort to address this issue by building a suite of common evaluation instruments that all current and future ERCs can use. This leading group consists of education directors and external evaluators from all three partner ERCs, together with engineering education researchers, who have worked together for two years. The project intends to address the four ERC program clusters: Broadening Participation in Engineering, Centers and Networks, Engineering Education, and Engineering Workforce Development. The instruments developed will pay attention to culture of inclusion, outreach activities, mentoring experience, and sustained interest in engineering. The project will deliver best practices in education program evaluation, which will not only support existing ERCs but will also serve as immediate tools for brand-new ERCs and similar large-scale research centers. Expanding the research beyond TEEC and sharing the developed instruments with NSF as well as other ERCs will also promote and encourage continual cross-ERC collaboration and research. Further, the joint evaluation will increase evaluation consistency across all ERC education programs. Embedded instrumental feedback loops will lead to continual improvement in ERC education performance and support the growth of an inclusive and innovative engineering workforce. Four major deliverables are planned. First, develop a common quantitative assessment instrument, named the Multi-ERC Instrument Inventory (MERCII). Second, develop a set of qualitative instruments to complement MERCII. Third, create a web-based evaluation platform for MERCII. Fourth, update the NSF ERC education program evaluation best-practices manual. Together, these deliverables will become part of, and be supplemented by, an ERC evaluator toolbox. This project strives to significantly impact how ERCs evaluate their educational and outreach programs. Studies based on a single ERC lack the sample size to truly test the validity of any evaluation instruments or measures. A common suite of instruments across ERCs would provide an opportunity for a large-scale assessment study. The online platform will further provide an easy-to-use tool for all ERCs to facilitate evaluation, data sharing, and impact reporting.
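To make the exploratory factor analysis step mentioned in item 2 above more concrete, here is a minimal, purely hypothetical Python sketch of an EFA workflow on coded observation scores. The item names, the two-factor structure, and the data are invented for illustration and are not drawn from the STEM-OP project; the sketch assumes NumPy, pandas, and scikit-learn are available.

```python
# A minimal, hypothetical sketch of an exploratory factor analysis (EFA)
# on coded classroom-observation scores. All item names, the two-factor
# structure, and the data are invented; nothing here comes from the
# STEM-OP dataset described above.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(42)
n_segments = 200  # pretend: 200 coded lesson segments

# Two invented latent dimensions driving the observed item scores.
design = rng.normal(size=n_segments)       # "engineering design" dimension
integration = rng.normal(size=n_segments)  # "discipline integration" dimension

items = pd.DataFrame({
    "defines_problem":    design + rng.normal(scale=0.5, size=n_segments),
    "iterates_solution":  design + rng.normal(scale=0.5, size=n_segments),
    "tests_prototype":    design + rng.normal(scale=0.5, size=n_segments),
    "links_science_math": integration + rng.normal(scale=0.5, size=n_segments),
    "real_world_context": integration + rng.normal(scale=0.5, size=n_segments),
    "cross_discipline_talk": (0.5 * design + 0.5 * integration
                              + rng.normal(scale=0.5, size=n_segments)),
})

# Fit a two-factor model with varimax rotation and inspect the loadings;
# items that load cleanly on one factor suggest a coherent instructional dimension.
efa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
efa.fit(items)

loadings = pd.DataFrame(efa.components_.T,
                        index=items.columns,
                        columns=["factor_1", "factor_2"])
print(loadings.round(2))
```

In an actual instrument-development study, the factor count and loadings would be weighed alongside reliability evidence and expert review rather than accepted at face value.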