Life Science Editors (Helen Pickersgill, Angela Andersen, Marie Bao, Carol Featherstone, Shawna Hiley, Sabbi Lall, Li-Kuo Su)
Website or social media links
Current stage of development
Overview of the challenge to overcome
Peer review is central to the scientific process, but it is managed primarily by journals behind closed doors, limiting its value to the community. Reviews of rejected papers, and of most published papers, are rarely posted. Peer review of preprints sidesteps journal constraints and democratizes the evaluation of science by scientists. Peer review would improve preprint quality and curation, but without incentives and standards, the number and quality of reviews vary widely. Additionally, refereed preprints and their reviews are difficult to find. We propose to integrate peer review directly into preprint servers and to hand control of this process back to authors and their peers.
The ideal outcome or output of the project
A system to generate consistent, valuable, discoverable, and open peer reviews. Most preprints would have reviews.
Description of the intervention
The author submits a preprint and can nominate a “Liaison team”: a lab/team headed by a PI to handle the peer review process. The team approach promotes training and diversity, and reduces biases. The server automatically sends an invitation: if it is accepted the process moves forward; if not, the author can suggest an alternative team or cancel the process.
• The Liaison team receives information on managing constructive peer review and invites multiple individuals/labs to review the preprint. The author can suggest reviewers.
• Reviewers have a unique identifier and can remain anonymous. Reviewers are searchable by reviews, scientific expertise, and peer review experience with grants, journals, and/or preprints.
• Reviewers are invited to watch videos on how to write a review (from the NIH) and on implicit bias. They receive a structured template to facilitate review, reduce bias, and improve consistency and searchability. Reviewers complete their review within two weeks and rate the preprint (1–5 stars).
• The Liaison team summarizes the reviews, gives an overall rating, and links their summary and the reviews to the preprint. Both the reviews and the summary are citable.
• Authors can rate the Liaison team and the reviews, and can respond to them.
• Refereed preprints are highlighted on the server and can be searched and sorted.
• Readers can rate the reviews (1–5 stars) and the preprint (1–5 stars; this reader rating is separate from the reviewer rating).
• Invitations from peers, citable outputs, and ratings create incentives.
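The workflow above implies a simple data model linking a preprint to its reviews and to two separate star ratings (reviewer and reader). A minimal sketch in Python follows; all class names, field names, and the example DOI and team name are illustrative assumptions, not part of the proposal:

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Review:
    reviewer_id: str  # unique identifier; the reviewer can remain anonymous
    text: str
    stars: int        # 1-5 star preprint rating from this reviewer

@dataclass
class RefereedPreprint:
    doi: str
    liaison_team: str
    reviews: list[Review] = field(default_factory=list)
    summary: str = ""                                     # Liaison team's citable summary
    reader_ratings: list[int] = field(default_factory=list)  # 1-5 stars from readers

    def reviewer_rating(self) -> float:
        """Average 1-5 star rating given by the invited reviewers."""
        return mean(r.stars for r in self.reviews)

    def reader_rating(self) -> float:
        """Average reader rating, kept separate from the reviewer rating."""
        return mean(self.reader_ratings)

# Example: one refereed preprint with two reviews and two reader ratings.
p = RefereedPreprint(doi="10.1101/2021.01.01.000001", liaison_team="Smith Lab")
p.reviews.append(Review("rev-001", "Solid methods; clarify Fig 2.", stars=4))
p.reviews.append(Review("rev-002", "Needs a control experiment.", stars=3))
p.reader_ratings.extend([5, 4])
print(p.reviewer_rating())  # 3.5
print(p.reader_rating())    # 4.5
```

Keeping the two rating streams in separate fields makes it straightforward for the server to display, search, and sort them independently, as the proposal requires.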
Plan for monitoring project outcome
At baseline, 3 months, and 12 months, assess the number of: preprints with reviews, reviews with ratings, responses from authors, and downloads and citations of preprints.
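The monitoring plan could be recorded as periodic snapshots of the listed counts. A minimal sketch, with field names and example numbers that are purely illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class OutcomeSnapshot:
    month: int                   # 0 = baseline, then 3 and 12
    preprints_with_reviews: int
    reviews_with_ratings: int
    author_responses: int
    downloads: int
    citations: int

def review_coverage(snapshot: OutcomeSnapshot, total_preprints: int) -> float:
    """Fraction of preprints on the server with at least one review."""
    return snapshot.preprints_with_reviews / total_preprints

# Hypothetical baseline figures for a server hosting 2000 preprints.
baseline = OutcomeSnapshot(month=0, preprints_with_reviews=120,
                           reviews_with_ratings=90, author_responses=60,
                           downloads=15000, citations=300)
print(round(review_coverage(baseline, total_preprints=2000), 3))  # 0.06
```

Comparing the same ratio across the baseline, 3-month, and 12-month snapshots would show whether review coverage is actually growing.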
What’s needed for success
Additional technology development
Technology to monitor and sort ratings, post reviews and responses, and ensure reviewer anonymity.
Feedback, beta testing, collaboration, endorsement
Survey authors, reviewers, readers, journal editors, preprint servers.
Collaborate with preprint servers, NIH.
Funding to develop training materials, algorithms, and functionalities of preprint servers.