There have been numerous efforts documenting the effects of open science in existing papers; however, these efforts typically only consider the author's analyses and supplemental materials from the papers. While understanding the current rate of open science adoption is important, it is also vital that we explore the factors that may encourage such adoption. One such factor may be publishing organizations setting open science requirements for submitted articles, encouraging researchers to adopt more rigorous reporting and research practices. For example, within the education technology discipline, the ACM Conference on Learning @ Scale (L@S) has been promoting open science practices since 2018 through a Call For Papers statement. The purpose of this study was to replicate previous papers within the proceedings of L@S and compare the degree of open science adoption and robust reproducibility practices to other conferences in education technology without a statement on open science. Specifically, we examined 93 papers and documented the open science practices used. We then attempted to reproduce the results with intervention from authors to bolster the chance of success. Finally, we compared the overall adoption rates to those from other conferences in education technology. Our cursory review suggests that researchers at L@S were more knowledgeable about open science practices, such as preregistration and preprints, than researchers who published in the International Conference on Artificial Intelligence in Education and the International Conference on Educational Data Mining, as they were less likely to say they were unfamiliar with the practices. However, the overall adoption of open science practices was significantly lower, with only 1% of papers providing open data, 5% providing open materials, and no papers providing a preregistration.
We speculate that the low adoption rates may be due to 20% of the papers not using a dataset; to at-scale datasets and materials that could not be released without risking security issues or sensitive data leaks; or to data being used in ongoing research that the authors did not consider complete enough for release. All openly accessible work can be found in an Open Science Framework project.
Champions of Transparency in Education: What Journal Reviewers Can Do to Encourage Open Science Practices
As the field of education, and especially gifted education, gradually moves toward open science, our research community increasingly values the transparency and openness brought by open science practices. Yet, individual researchers may be reluctant to adopt open science practices due to low incentives, the burden of extra workload, or a lack of support for applying these practices in certain areas, such as qualitative research. We encourage and give guidelines to reviewers to champion open science practices by warmly influencing authors to consider applying open science practices to quantitative, qualitative, and mixed-methods research and by providing ample support to produce higher-quality publications. Instead of imposing open science practices on authors, we advocate that reviewers suggest small, non-threatening, specific steps to support authors without making them feel overwhelmed, judged, or punished. We believe that these small steps taken by reviewers will make a difference in creating a more supportive environment for researchers to adopt more open science practices.
- Award ID(s): 1937698
- PAR ID: 10550304
- Publisher / Repository: EdArXiv
- Date Published:
- Journal Name: Gifted Child Quarterly
- Volume: 67
- Issue: 4
- ISSN: 0016-9862
- Page Range / eLocation ID: 337 to 351
- Format(s): Medium: X
- Sponsoring Org: National Science Foundation
More Like this
-
There have been numerous efforts documenting the effects of open science in existing papers; however, these efforts typically only consider the author's analyses and supplemental materials from the papers. While understanding the current rate of open science adoption is important, it is also vital that we explore the factors that may encourage such adoption. One such factor may be publishing organizations setting open science requirements for submitted articles, encouraging researchers to adopt more rigorous reporting and research practices. For example, within the education technology discipline, the ACM Conference on Learning @ Scale (L@S) has been promoting open science practices since 2018 through a Call For Papers statement. The purpose of this study was to replicate previous papers within the proceedings of L@S and compare the degree of open science adoption and robust reproducibility practices to other conferences in education technology without a statement on open science. Specifically, we examined 93 papers and documented the open science practices used. We then attempted to reproduce the results with intervention from authors to bolster the chance of success. Finally, we compared the overall adoption rates to those from other conferences in education technology. Although the overall responses to the survey were low, our cursory review suggests that researchers at L@S might be more familiar with open science practices compared to the researchers who published in the International Conference on Artificial Intelligence in Education (AIED) and the International Conference on Educational Data Mining (EDM): 13 of 28 AIED and EDM respondents were unfamiliar with preregistrations and 7 were unfamiliar with preprints, while only 2 of 7 L@S respondents were unfamiliar with preregistrations and 0 with preprints.
The overall adoption of open science practices at L@S was much lower, with only 1% of papers providing open data, 5% providing open materials, and no papers providing a preregistration. All openly accessible work can be found in an Open Science Framework project.
-
Casadevall, Arturo (Ed.) ABSTRACT: In this editorial, written by early-career scientists, we advocate for the invaluable role of society journals in our scientific community. By choosing to support these journals as authors, peer reviewers, and editors, we can reinforce our academic growth and benefit from their re-investment back into the scientific ecosystem. Considering the numerous clear merits of this system for future generations of microbiologists and, more broadly, society, we argue that early-career researchers should publish our high-quality research in society journals to shape the future of science and the scientific publishing landscape.
-
Abstract Machine learning (ML) has become commonplace in educational research and science education research, especially to support assessment efforts. Such applications of machine learning have shown their promise in replicating and scaling human‐driven codes of students' work. Despite this promise, we and other scholars argue that machine learning has not yet achieved its transformational potential. We argue that this is because our field is currently lacking frameworks for supporting creative, principled, and critical endeavors to use machine learning in science education research. To offer considerations for science education researchers' use of ML, we present a framework, Distributing Epistemic Functions and Tasks (DEFT), that highlights the functions and tasks that pertain to generating knowledge that can be carried out by either trained researchers or machine learning algorithms. Such considerations are critical decisions that should occur alongside those about, for instance, the type of data or algorithm used. We apply this framework to two cases, one that exemplifies the cutting‐edge use of machine learning in science education research and another that offers a wholly different means of using machine learning and human‐driven inquiry together. We conclude with strategies for researchers to adopt machine learning and call for the field to rethink how we prepare science education researchers in an era of great advances in computational power and access to machine learning methods.
-
Within the field of education technology, learning analytics has increased in popularity over the past decade. Researchers conduct experiments and develop software, building on each other's work to create more intricate systems. In parallel, open science — which describes a set of practices to make research more open, transparent, and reproducible — has exploded in recent years, resulting in more open data, code, and materials for researchers to use. However, without prior knowledge of open science, many researchers do not make their datasets, code, and materials openly available, and those that are available are often difficult, if not impossible, to reproduce. The purpose of the current study was to take a close look at our field by examining previous papers within the proceedings of the International Conference on Learning Analytics and Knowledge and documenting the rate of open science adoption (e.g., preregistration, open data), as well as how well available data and code could be reproduced. Specifically, we examined 133 research papers, allowing ourselves 15 minutes for each paper to identify open science practices and attempt to reproduce the results according to their provided specifications. Our results showed that less than half of the research adopted standard open science principles, with approximately 5% fully meeting some of the defined principles. Further, we were unable to reproduce any of the papers successfully in the given time period. We conclude by providing recommendations on how to improve the reproducibility of our research as a field moving forward. All openly accessible work can be found in an Open Science Framework project.

