EMBO Press, Meyerhofstrasse 1, Heidelberg, Germany
Preprint servers allow the efficient sharing and discussion of scientific findings without restriction or delay. Quality assurance through peer review and editorial processes is more important than ever, given the rapid growth and increasing complexity of scientific information. I will argue that both processes can work optimally in sequence, in an information flow from bench to lab book to preprint to paper.
Preprint-based communication has been firmly established as the norm in many areas of the physical sciences for decades; in none of these disciplines has it replaced the peer-reviewed research paper. Instead, a symbiotic relationship appears to have developed, in which preprints prevent damaging delays to scientific communication while buffering pressures to rush into publication prematurely.
The success of the preprint paradigm appears to rest on two factors: an appreciation by the community that a preprint posting serves as a time stamp marking priority of discovery, and an appreciation that suggestions from colleagues can improve the way a piece of research is reported, or even improve the research itself. Preprints can be seen as a formal extension of presenting unpublished findings at conferences, with the advantages of opening the information up to the whole community, inviting useful comments from interested colleagues in return, and stably archiving the information in a citable manner. Importantly, versioning allows the information to be adapted as research projects mature and in response to comments, reducing the temptation to delay communication of findings until every t is crossed and every i dotted.
Preprint-based communication is not the end point in sharing research findings: journals in the physical sciences continue to thrive. The reason is that formal peer review and sound editorial processes add a level of quality control, so that published research output has a high probability of being significant and reproducible.
The editorial process at multidisciplinary journals such as Nature differs notably between the physical and biological sciences. In the former, comments and pace are more measured, and papers are published after fewer rounds of revision. One is tempted to conclude that this is because improvements were already made upstream, and that referees consequently feel comfortable assuming their task is to validate mature work rather than to shape research projects. Authors, in turn, likely feel less pressure to rush publication, submitting mature datasets and investing time in thorough, comprehensive revision. There is no doubt that part of the reason journals remain stably embedded in the research process in both biology and physics is the major role they play in research assessment, which often relies on journal name and journal impact metrics as proxies for quality. Nevertheless, in my view quality journals apply well-honed processes of quality control that cannot realistically be expected to be replaced by post-publication commenting. To be sure, much can be improved in both the peer review and editorial processes in the biological sciences. Establishing preprint servers as a community standard alongside these processes would do much to allow editors and researchers alike to invest more resources in these important enhancements. Preprint posts and peer-reviewed papers are highly complementary, not competitive.
Below, I highlight a number of issues, in the hope that they will invigorate discussion at ASAPbio:
In most areas of biology, the volume and complexity of published research output already far exceed what a single individual can survey, let alone absorb. It is therefore important that preprint platforms apply minimal quality control so that all posts meet basic community standards. These standards will vary by field, and it would be important for research communities to agree on them now.
The preprint unit
Currently, one key criterion for the bioRxiv preprint server is that posts resemble the classical research paper format. I would propose that preprint servers host not only work that is poised to be submitted for publication as a research paper, but also encourage smaller quanta of information framed around datasets or individual experiments.
The raison d’être of a preprint server is the sharing of research findings without delay or charge, and with minimal licensing restrictions. There is limited value in sharing reviews and commentaries on preprint servers.
Preprints are not intended to provide a cheap alternative to journal publishing, and it is, in my view, understandable that journals may be perfectly happy to publish research papers based on preprint posts, yet refuse to re-publish commentaries and reviews after these have been posted in a stable and citable manner on a preprint server.
To ensure that every reader can clearly distinguish between the non-peer-reviewed and the peer-reviewed, edited literature, preprints should be referred to as ‘posts’; peer-reviewed articles are ‘research papers’.
In keeping with a linear information flow from bench via preprint to paper, I propose ensuring accurate forward linking of preprint posts to subsequent posts (versioning) and to the linked peer-reviewed research paper. Formally, the first preprint post is v0 of a piece of research and sets the discovery date; v0.1, v0.2 etc. are the authors’ adaptations of their preprint; v1 is the published research paper (which in turn will also be subject to subsequent versioning: v1.1, v1.2, etc.).
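The versioning scheme above amounts to a forward-linked chain of records. As a minimal sketch (a hypothetical data model; the record names and dates are purely illustrative, not any server’s actual schema):

```python
# Sketch of the proposed linear versioning chain: each record points
# forward to its successor; the first post (v0) marks the discovery date.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Record:
    version: str                          # "v0", "v0.1", ..., "v1" (paper), "v1.1", ...
    date: str                             # ISO date of the post or publication
    successor: Optional["Record"] = None  # forward link to the next version

def link_chain(records: List[Record]) -> Record:
    """Forward-link records in chronological order and return the head (v0)."""
    for earlier, later in zip(records, records[1:]):
        earlier.successor = later
    return records[0]

chain = link_chain([
    Record("v0", "2016-01-10"),    # first preprint post: sets priority of discovery
    Record("v0.1", "2016-02-02"),  # authors' revised preprint
    Record("v1", "2016-06-15"),    # peer-reviewed research paper
])
print(chain.version, chain.date)   # prints "v0 2016-01-10"
```

The point of the one-directional links is that any later version, including the published paper, remains traceable back through the chain to the v0 post that fixed the discovery date.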
Encouraging preprint posting
Preprints will only be used by the community if researchers are confident that all quality journals have explicit policies stating that preprints are not considered prior publication. Use will be further enhanced if journals adopt clear policies ensuring that research articles are not considered ‘scooped’ by unrelated research published after formal submission to a journal or during revision.
Preprint posting would be strongly encouraged by journals setting up means for direct ‘one-click’ transfers from preprint servers and, importantly, considering submissions irrespective of format (as long as the basic criteria of a research paper are met).
Analogous to scouting talks at conferences, journal editors may further promote use by actively inviting the submission of preprint posts.
Aggregate, don’t disperse
The ‘self-archiving’ or ‘green route’ OA concept is burdened by the inefficiency of duplicated institutional and funder repositories that are rarely connected or standardized. Such a dispersed infrastructure is not only costly to set up and maintain; it is also of limited use. This problem should not be perpetuated by a proliferation of preprint servers launched by publishers with the primary goal of attracting manuscripts to their journals. If preprints themselves become a value proposition in research assessment, scientists will be ready to pay for prestige, and this risks creating a hierarchy of preprints and for-profit business models at the expense of optimal information sharing.
In my view, preprint posting should remain free and the basic hosting and quality control centrally funded.
Note: the views presented here are personal; they are consistent with EMBO Press policies.