Many publications on COVID-19 were released on preprint servers such as medRxiv and bioRxiv. It is unknown how reliable these preprints are and which ones will eventually be published in scientific journals. In this study, we use crowdsourced human forecasts to predict publication outcomes and future citation counts for a sample of 400 preprints with high Altmetric scores. Most of these preprints were published within 1 year of upload to a preprint server (70%), with a considerable fraction (45%) appearing in a high-impact journal with a journal impact factor of at least 10. On average, the preprints received 162 citations within the first year. We found that forecasters can predict whether preprints will be published after 1 year and whether the publishing journal has high impact. Forecasts are also informative with respect to Google Scholar citations within 1 year of upload to a preprint server. For both types of assessment, we found statistically significant positive correlations between forecasts and observed outcomes. While the forecasts can help provide a preliminary assessment of preprints at a faster pace than traditional peer review, it remains to be investigated whether such an assessment is suited to identifying methodological problems in preprints.
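The assessment described above boils down to correlating elicited forecasts with outcomes observed one year later. The sketch below is purely illustrative (synthetic data; the variables, distributions, and correlation measures are assumptions, not the study's actual data or code), but it shows the shape of such an analysis: one correlation for the binary publication outcome and a rank correlation for the skewed citation counts.

```python
# Illustrative sketch only: synthetic forecasts and outcomes, not the study's data.
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(42)
n_preprints = 400  # the study sampled 400 high-Altmetric preprints

# Hypothetical crowd forecasts, elicited before outcomes are known:
# probability of journal publication within 1 year, and predicted citation counts.
forecast_pub_prob = rng.uniform(0.2, 0.95, n_preprints)
forecast_citations = rng.lognormal(mean=4.0, sigma=1.0, size=n_preprints)

# Hypothetical observed outcomes one year after upload.
observed_published = rng.binomial(1, forecast_pub_prob)                      # 0/1 publication status
observed_citations = (forecast_citations * rng.lognormal(0, 0.8, n_preprints)).round()

# Point-biserial (Pearson) correlation: forecasts vs. binary publication outcome.
r_pub, p_pub = pearsonr(forecast_pub_prob, observed_published)

# Rank (Spearman) correlation: predicted vs. observed citations, robust to skew.
rho_cit, p_cit = spearmanr(forecast_citations, observed_citations)

print(f"publication forecasts vs. outcomes: r = {r_pub:.2f} (p = {p_pub:.3g})")
print(f"citation forecasts vs. outcomes:    rho = {rho_cit:.2f} (p = {p_cit:.3g})")
```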
                            Scientific journals still matter in the era of academic search engines and preprint archives
                        
                    
    
Abstract Journals play a critical role in the scientific process because they evaluate the quality of incoming papers and offer an organizing filter for search. However, the role of journals has been called into question because new preprint archives and academic search engines make it easier to find articles independent of the journals that publish them. Research on this issue is complicated by the deeply confounded relationship between article quality and journal reputation. We present an innovative proxy for individual article quality that is divorced from the journal's reputation or impact factor: the number of citations to preprints posted on arXiv.org. Using this measure to study three subfields of physics that were early adopters of arXiv, we show that prior estimates of the effect of journal reputation on an individual article's impact (measured by citations) are likely inflated. While we find that higher-quality preprints in these subfields are now less likely to be published in journals compared to prior years, we find little systematic evidence that the role of journal reputation on article performance has declined.
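The identification strategy in this abstract — using citations to the arXiv preprint as a quality proxy so that journal reputation is not confounded with article quality — amounts to adding that proxy as a control in a citation regression. The sketch below is a hypothetical toy illustration (synthetic data, an assumed linear model, made-up variable names), not the authors' specification or dataset:

```python
# Illustrative sketch only: synthetic data and an assumed model form,
# not the authors' actual specification or dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000

# Latent article quality drives both preprint citations and journal placement.
quality = rng.normal(0, 1, n)
log_preprint_cites = 1.0 * quality + rng.normal(0, 0.5, n)   # quality proxy
log_journal_impact = 0.8 * quality + rng.normal(0, 0.7, n)   # reputation of publishing journal
log_article_cites = 0.2 * log_journal_impact + 1.0 * quality + rng.normal(0, 0.6, n)

# Naive model: journal reputation only; its coefficient absorbs latent quality.
naive = sm.OLS(log_article_cites, sm.add_constant(log_journal_impact)).fit()

# Controlled model: add the preprint-citation quality proxy.
X = sm.add_constant(np.column_stack([log_journal_impact, log_preprint_cites]))
controlled = sm.OLS(log_article_cites, X).fit()

print("journal-reputation coefficient, naive:     ", round(naive.params[1], 2))
print("journal-reputation coefficient, controlled:", round(controlled.params[1], 2))
```

In the simulated data the naive coefficient overstates the true reputation effect (set to 0.2 here), while controlling for the preprint-citation proxy pulls the estimate back toward it — the sense in which the abstract argues prior estimates are likely inflated.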
- Award ID(s): 1735194
- PAR ID: 10372837
- Publisher / Repository: Wiley Blackwell (John Wiley & Sons)
- Date Published:
- Journal Name: Journal of the Association for Information Science and Technology
- Volume: 71
- Issue: 10
- ISSN: 2330-1635
- Format(s): Medium: X; Size: p. 1218-1226
- Size(s): p. 1218-1226
- Sponsoring Org: National Science Foundation
More Like this
- Abstract Journal editors have a large amount of power to advance open science in their respective fields by incentivising and mandating open policies and practices at their journals. The Data PASS Journal Editors Discussion Interface (JEDI, an online community for social science journal editors: www.dpjedi.org) has collated several resources on embedding open science in journal editing (www.dpjedi.org/resources). However, it can be overwhelming for an editor new to open science practices to know where to start. For this reason, we created a guide for journal editors on how to get started with open science. The guide outlines steps that editors can take to implement open policies and practices within their journal, and goes through the what, why, how, and worries of each policy and practice. This manuscript introduces and summarizes the guide (full guide: https://doi.org/10.31219/osf.io/hstcx).
- Abstract In our previous article (http://arxiv.org/abs/1607.06041), we established an equivalence between pointed pivotal module tensor categories and anchored planar algebras. This article introduces the notion of unitarity for both module tensor categories and anchored planar algebras, and establishes the unitary analog of the above equivalence. Our constructions use Baez's 2-Hilbert spaces (i.e., semisimple $$\textrm{C}^*$$-categories equipped with unitary traces), the unitary Yoneda embedding, and the notion of unitary adjunction for dagger functors between 2-Hilbert spaces.
- Identifying the Most Cited Articles and Authors in Educational Psychology Journals from 1988 to 2023. Abstract Over the past 30 years, several reviews have examined scholarly contributions of individual researchers and institutions in the field of educational psychology (Fong et al., Educational Psychology Review 34:2379–2403, 2022; Greenbaum et al., Educational Psychology Review 28:215–223, 2016; Hsieh et al., Contemporary Educational Psychology 29:333–343, 2004; Jones et al., Contemporary Educational Psychology 35:11–16, 2010; Smith et al., Contemporary Educational Psychology 23:173–181, 1998; Smith et al., Contemporary Educational Psychology 28:422–430, 2003). However, no reviews have specifically examined scholarly impact as measured by citations since Walberg (Current Contents 22:5–14, 1990) did so over 34 years ago. The present review focused on the period from 1988 to 2023, identifying the most cited articles and authors since Walberg's study, which covered the period from 1966 to 1988. Whereas most of the previous reviews have been limited to brief time periods (e.g., six years) and a small set of journals (e.g., five), our scope included 12 educational psychology journals across 36 years. The most cited article, by Ryan and Deci (Contemporary Educational Psychology 25:54–67, 2000), has been cited over 9,000 times, more than twice as many citations as the second most cited article, by Pintrich and De Groot (Journal of Educational Psychology 82:33–40, 1990). Most of the top 30 most cited articles, including four of the top five, addressed the topic of motivation. With regard to highly cited authors, the top five were John Sweller, Richard E. Mayer, Fred Paas, Richard M. Ryan, and Reinhard Pekrun. Several of the 30 most cited authors have never appeared in previous lists of most productive authors. Finally, keyword and cluster analyses revealed the most popular topics and collaborative networks among many of the most cited authors, which may partly explain their productivity. Examining article and author impact is an important complement to productivity when considering scholarly contributions to the field of educational psychology.
- This article describes the motivation, design, and progress of the Journal of Open Source Software (JOSS). JOSS is a free and open-access journal that publishes articles describing research software. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. The article is the entry point of a JOSS submission, which encompasses the full set of software artifacts. Submission and review proceed in the open, on GitHub. Editors, reviewers, and authors work collaboratively and openly. Unlike other journals, JOSS does not reject articles requiring major revision; while not yet accepted, articles remain visible and under review until the authors make adequate changes (or withdraw, if unable to meet requirements). Once an article is accepted, JOSS gives it a digital object identifier (DOI), deposits its metadata in Crossref, and the article can begin collecting citations on indexers like Google Scholar and other services. Authors retain copyright of their JOSS article, releasing it under a Creative Commons Attribution 4.0 International License. In its first year, starting in May 2016, JOSS published 111 articles, with more than 40 additional articles under review. JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative (OSI).