Earlier this week we hosted the #PreprintReviewChallenge, a collaborative session where we worked on comments and reviews of preprints. The event was part of Peer Review Week, and our goal was to develop the largest collection of public reviews of preprinted research in a single day and, of course, to connect with others who share an interest in this important activity.
During the event, we ran 12 parallel sessions, each focused on a preprint, where participants could work on the review collaboratively. We also had an additional session for anyone who wanted to work on a preprint of their choice or preferred to write their review independently. Here are some of the highlights from the event:
Collaboration is a strength. We had 66 attendees at the event. The number of participants in each session and the format they followed varied, but participants generally appreciated the opportunity to discuss the research paper with others. Several of the groups continued the discussion after the event, adding further comments to the reviews and providing feedback on each other’s observations. We would like to thank each of the facilitators who helped guide the conversation and the development of the summaries and reviews for each paper.
Great presentations by Maurine Neiman, editor at Proceedings of the Royal Society of London B, and Thomas Lemberger from EMBO Press and Review Commons. Maurine shared the framework her team has developed to involve early-career researchers in screening bioRxiv preprints for suitability to invite a submission to Proceedings of the Royal Society of London B, and described how they are using this activity to expand the scope of what the journal considers. Thomas gave a nice overview of Review Commons, a platform that allows authors to receive journal-independent reviews, and shared valuable tips on how to approach reviewing a paper in a way that keeps the focus on the content and on constructive feedback for the authors. Watch the presentations here:
Doing a review takes time. This will be no surprise to anyone who has done a review, but providing a thorough review of a paper takes time. A number of groups mentioned that a little more time would have helped with developing their review. We asked participants to use shared documents so they could add further comments as needed, and several groups are polishing their comments after the event. This is useful feedback: for any future activities in this format, we will allocate time accordingly.
So how did we do in terms of numbers? We completed comments and reviews for 15 preprints, and some have already been posted publicly (see here, here and here for examples of posted reviews). We are allowing those who wanted to polish their reviews to add the finishing touches, and we will post their contributions publicly over the coming days.
Reviewing a paper can be a lonely and, at times, slightly daunting exercise. We hope that the #PreprintReviewChallenge helped make the experience of reviewing a preprint that bit more enjoyable and collaborative. We thank all participants for their enthusiasm during the preparations and the event itself, and we look forward to continuing to support engagement around the review and evaluation of preprints.