By Robert Kiley, Head of Open Research, Wellcome
Introduction
In November 2016 Wellcome became the first research funder to launch a publishing platform for the exclusive use of its grantholders. Wellcome Open Research (WOR), run on behalf of Wellcome by Faculty of 1000 (F1000), uses a model of immediate publication followed by invited, post-publication peer review. All reviews are citable and posted to the platform along with the identities of the reviewers.
This short blog post discusses the motivations behind establishing this publishing platform, and presents some data about the peer review process as practised at WOR.
Motivations for developing Wellcome Open Research
For over a decade Wellcome has been at the forefront of the open access (OA) and data sharing debates. We believe that to maximise the impact of our research spend – and help to deliver our mission to improve human health – all research outputs must be findable, accessible, interoperable and reusable.
In support of this aim we are seeking to improve the way research is communicated. Through WOR we believe we can make the process faster and more transparent, and make it easier for researchers to provide information that supports reproducibility. WOR also provides a platform for publishing any research finding that a Wellcome-funded researcher wishes to share. This includes traditional research articles, but also outputs such as data notes, protocols, method papers, software tools and null and negative results.
Post-publication peer review
One of the most innovative aspects of the platform is its use of invited, post-publication peer review. Under this model every submission is subjected to a range of editorial checks, such as plagiarism detection, checking the data availability statement, and ensuring that the appropriate ethical standards have been adhered to. Once these checks have been completed – and any issues addressed – the article is published.
On publication, the reviewers – as suggested by the corresponding author – are formally invited to review the article. As the research has already been published openly on the platform, reviewers can focus their efforts on providing help and advice on how the article can be improved, as well as highlighting any errors or omissions.
In addition to providing a narrative, reviewers also indicate whether the work as presented should be “approved,” “approved with reservations” or “not approved.” Only submissions that receive two “approved” statuses or one “approved” and two “approved with reservations” are indexed in PubMed and other bibliographic databases. Authors can submit a new version of the article – either to address comments from reviewers or to update the publication in light of any new findings – and the peer review process is repeated for each new version.
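To make that threshold concrete, the rule can be expressed as a simple check over the statuses a version has received. The sketch below is illustrative only, not F1000's actual implementation; the function name and status labels are assumptions chosen to mirror the text.

```python
# A minimal sketch of the indexing threshold described above.
# Illustrative only, not F1000's implementation; the status
# labels are assumptions chosen to mirror the text.

APPROVED = "approved"
RESERVATIONS = "approved with reservations"
NOT_APPROVED = "not approved"

def passes_indexing_threshold(statuses):
    """True if a version has two "approved" reports, or one
    "approved" report plus two "approved with reservations"."""
    approved = statuses.count(APPROVED)
    reservations = statuses.count(RESERVATIONS)
    return approved >= 2 or (approved >= 1 and reservations >= 2)

# One "approved" and two "approved with reservations" clears the bar...
print(passes_indexing_threshold([APPROVED, RESERVATIONS, RESERVATIONS]))  # True
# ...but one "approved" and one "approved with reservations" does not.
print(passes_indexing_threshold([APPROVED, RESERVATIONS]))                # False
```

Note that the check is repeated per version: a new version starts the peer review process afresh.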
Author-initiated peer review
The logic of allowing authors to suggest reviewers is that they are often best placed to do so. It does, however, raise concerns among some, who fear that authors may suggest reviewers whom they know, or whom they otherwise believe are likely to write a less critical review.
In terms of managing potential conflicts of interest, all reviewers are required to declare any associations with the author(s). F1000 staff also seek to identify obvious conflicts, for example by checking PubMed to determine whether any of the authors have recently collaborated with a suggested reviewer. Ultimately, though, we believe that this system will be self-policing, as both the identity of the reviewers and the reviews they provide are fully disclosed. Just as sunshine is the best disinfectant, the transparency of this model should minimise any potential risks.
Whether authors suggest reviewers who will give them an easy ride is harder to ascertain. However, an analysis of the peer review reports published to date shows that around 35% of reviewer reports on Version 1 articles assigned the status “approved with reservations.” And though only around 2% of reviewer reports applied the “not approved” status, it is clear that reviewers are prepared to highlight serious problems and to put their names to the review.
Table 1, below, provides data on the peer review process as seen at WOR in the first 14 months of operation.
| Item | Number |
| --- | --- |
| Total number of papers published at Wellcome Open Research (15 Nov 2016 – 19 Jan 2018) | 160 |
| Total number of peer review reports published | 460 |
| *Focusing on Version 1 submissions:* | |
| Peer review reports for Version 1 articles | 385 |
| Version 1 reports designated “Approved” | 244 |
| Version 1 reports designated “Approved with reservations” | 132 |
| Version 1 reports designated “Not approved” | 9 |
| *Focusing on Version 2 submissions:* | |
| Peer review reports for Version 2 articles | 75 |
| Version 2 reports designated “Approved” | 69 |
| Version 2 reports designated “Approved with reservations” | 5 |
| Version 2 reports designated “Not approved” | 1 |
Table 1: Peer review data at WOR: November 2016 – January 2018
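The percentages quoted earlier can be reproduced directly from the Table 1 counts; the short snippet below is simply that arithmetic, with illustrative variable names.

```python
# Reproducing the Version 1 percentages quoted earlier from the Table 1 counts.
v1_reports = {
    "approved": 244,
    "approved with reservations": 132,
    "not approved": 9,
}
total_v1 = sum(v1_reports.values())  # 385 Version 1 reports in total

pct_reservations = 100 * v1_reports["approved with reservations"] / total_v1
pct_not_approved = 100 * v1_reports["not approved"] / total_v1

print(f"approved with reservations: {pct_reservations:.1f}%")  # 34.3%, i.e. around 35%
print(f"not approved: {pct_not_approved:.1f}%")                # 2.3%, i.e. around 2%
```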
One other aspect of open peer review worth discussing is whether it is more difficult to attract reviewers than under a more traditional model in which reviewer identities are not disclosed.
Although we have not been able to conduct a comparative analysis for this blog post, data from WOR show that, on average, 11 reviewers were approached before an article reached indexed status; the median number of reviewers contacted is nine.
Although this process places an additional burden on the author, who is required to identify suitable reviewers, the transparency of the approach means that the author is fully aware of the status of the peer review process throughout. Overall, it does not appear to add significantly to the elapsed time from initial submission to the point at which an article has passed peer review and is indexed in PubMed: analysis conducted in November 2017 showed that, on average, articles completed the entire publication workflow in 72 days.
Conclusion
Recent postings on the ASAPbio blog have called upon the research community to consider moving to an open, transparent peer review system. From the experience of the first year of WOR, we conclude that this model is working effectively, and we encourage others to embrace it.