Organizer
Philip N. Cohen and Micah Altman
Website or social media links
Current stage of development
Ready to be implemented
Project duration
2 years
Update
Have you integrated any feedback received?
The concept of a ‘market’ for peer review clearly resonated with a number of commentators, stimulating reflections on the incentive structure for peer review and on the opportunities for exchanging peer review effort. Further, comments on several other projects suggested that market design and evaluation is an implicit (although perhaps unrecognized) theme of many proposed interventions.
How has your project changed?
Reflecting on the feedback, we have reframed IOTA to highlight the market design challenges common to many peer review projects, and produced a white paper that describes the problem of market failure in peer review, provides a general checklist for developing replicable and measurable interventions to improve peer review, and illustrates this approach by applying the checklist to the proposed IOTA intervention.
We believe that a token exchange along the lines illustrated by IOTA is one promising approach to correcting the incentive structures for peer review. Moreover, we conjecture that the proposed checklist will identify the minimum necessary conditions for proposed interventions to make systematic progress in this area.
Project aims
Background information on current practices
To make timely, evidence-based decisions, scientists and nonscientists alike need to be able to understand how an emerging result has been vetted, whether that result is disseminated as a preprint, working paper, news article, or journal article. Changes in the speed of, and trust in, both science and evidence-based policy create an urgent need for more timely, transparent forms of peer review. However, credibility has been a significant barrier: new publication initiatives struggle to attract qualified reviewers (and authors), and even established outlets are challenged to demonstrate that their review process is trustworthy. Information about the vetting process for a venue remains difficult to find, daunting to compare at scale, and impossible for anyone (with the possible exception of a journal’s editorial board) to fully assess.[1]
This initiative proposes a solution to the open peer review start-up and evaluation problem: it will establish a lightweight ‘token’ service to match trusted reviewers with vetted open science projects. This solution works through a combination of information technology and market design. The token service will provide a lightweight, well-known, reliable, efficient, and auditable mechanism for ‘banking’ future peer reviews and for evaluating how effectively they are used. The result will be a social and economic ‘commons’ that channels potential reviewers’ desire to have more control over their review efforts and to have their work contribute most effectively to the advancement of science.
[1] See for example: https://scholarlykitchen.sspnet.org/2020/10/01/guest-post-on-clarifying-the-goals-of-a-peer-review-taxonomy/ ; https://www.nytimes.com/2020/10/30/science/diversity-science-journals.html ; https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0239518
Overview of the challenge to overcome
Peer review is locked up in journals: people who want to develop innovations have no way to get authors and reviewers to participate in a new system, and there is a lack of systematic data, including experimental data, with which to evaluate competing models. We need to establish trust in, and generate data from, innovative peer review projects.
The ideal outcome or output of the project
We see three user scenarios, each requiring its own interface.
Reviewer interface. Reviewers need a simple way both to defer current review requests and to bank future reviews for a good cause. The interface allows reviewers to forward review requests they have declined to the peer review bank, automatically depositing a ‘token’. Tokens need reviewer metadata (e.g., ORCID) as well as specific options for reviewing (e.g., willing to conduct open review, willing to participate in review experiments, willing to have tokens listed publicly). Reviewers need to receive review requests, be connected to the appropriate reviewing platform, and have their token bank credited and debited.
Editor/publisher interface. Editors and publishers need to apply to the administrators for grants of tokens, specifying the quantity, time period, and expertise of the reviews needed (e.g., subject area, review criteria). They need to be able to send review requests and have the token bank updated with the request outcomes.
Public interface. The public needs to see information about the supported review projects, including the nature of their reviewing processes, and information about the banked tokens and reviewers. Potential authors look here to decide when and whether to submit their work for review (e.g., they might arrive from a journal homepage that notes the journal’s participation as an IOTA-supported project). This interface includes statistics on the performance and demography of the review process.
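To make these scenarios concrete, the following minimal sketch illustrates, in Python, the operations the three interfaces imply. All names here (Token, TokenBank, deposit, request_reviews, stats) are illustrative assumptions about the intended flows, not a specification.

```python
from dataclasses import dataclass


@dataclass
class Token:
    """A banked review pledge (fields are illustrative)."""
    orcid: str                       # reviewer identity
    open_review_ok: bool = False     # willing to conduct open review
    experiments_ok: bool = False     # willing to join review experiments
    public_listing_ok: bool = False  # willing to have the token listed publicly


class TokenBank:
    """In-memory stand-in for the token service."""

    def __init__(self) -> None:
        self.tokens: list[Token] = []

    # Reviewer interface: a declined review request forwarded to the bank
    # automatically deposits a token.
    def deposit(self, token: Token) -> None:
        self.tokens.append(token)

    # Editor/publisher interface: a vetted project requests reviewers;
    # matching tokens are debited, and the caller sends the review requests.
    def request_reviews(self, quantity: int,
                        open_review: bool = False) -> list[Token]:
        matches = [t for t in self.tokens
                   if t.open_review_ok or not open_review][:quantity]
        for t in matches:
            self.tokens.remove(t)  # debit the bank
        return matches

    # Public interface: aggregate statistics on banked tokens.
    def stats(self) -> dict:
        return {
            "tokens_banked": len(self.tokens),
            "open_review_willing": sum(t.open_review_ok for t in self.tokens),
        }
```

A production service would of course make the bank persistent and auditable; the sketch only fixes the shape of the three interactions.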
Description of the intervention
The solution comprises three things:
- A minimal (MVP) template (protocol) for describing promises of future peer review, e.g., an ORCID, a time window, and a set of disciplines (possibly derived automatically from the ORCID database); see the sketch after this list.
- A service for administering these promises: a bank where reviews can be deposited, distributed, and redeemed, along with a mechanism for tracking use of the tokens (to advance understanding of open science and of the service itself).
- A transparent contract that communicates the responsibilities of all parties: the reviewer pledges to be responsive to a future request to review OA/open science work and is absolved from doing a closed review now; the IOTA service pledges to vet publishers with a principled and transparent process to ensure OA and scientific transparency; the publisher receives tokens and pledges to share information from its review and publication processes to advance the science of open science.
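A minimal sketch of the promise template follows. The field names, the ISO-date time window, and the example values (including ORCID’s documented sample iD) are illustrative assumptions rather than a fixed schema.

```python
import json
from dataclasses import asdict, dataclass
from datetime import date


@dataclass
class ReviewPromise:
    """MVP template for a promise of future peer review (illustrative)."""
    orcid: str              # reviewer identifier
    valid_from: str         # ISO dates bounding the time window
    valid_until: str
    disciplines: list[str]  # possibly derived from the ORCID record


promise = ReviewPromise(
    orcid="0000-0002-1825-0097",  # ORCID's published example iD
    valid_from=date(2022, 1, 1).isoformat(),
    valid_until=date(2022, 12, 31).isoformat(),
    disciplines=["sociology", "demography"],
)

# A plain JSON serialization keeps the template exchangeable across systems.
print(json.dumps(asdict(promise), indent=2))
```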
Reviewers gain a legitimate, prosocial way to decline reviews for journals and publishers whose practices they don’t support, and to channel the demand for their reviewing time into open science projects. In other words, when the review market asks them to do a review, instead of completing that (e.g.) Elsevier review they contribute a token to IOTA. At the same time, they can notify the (e.g.) Elsevier journal (and, if they wish, their own communities) of their decision to decline the review and instead contribute a review pledge to open science.
Plan for monitoring project outcome
Evaluation is a core goal of the project. First, the token service will be instrumented to provide metrics on the peer review process for those projects that use it, and to facilitate standardized surveys of reviewers and authors. Second, projects participating in the service will commit to making additional operational statistics from their peer review processes openly available in a standardized form. Third, the project will commit to publishing open metrics on its own performance and on those of participating projects.
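As an illustration only, a standardized operational-statistics record might look like the sketch below; the field names and the numbers in the usage example are hypothetical, not a commitment to a particular schema.

```python
from dataclasses import dataclass


@dataclass
class ReviewProcessMetrics:
    """One reporting period of open metrics for a participating project
    (fields are assumptions for illustration)."""
    project: str
    period: str                    # e.g., a quarter such as "2022-Q1"
    requests_sent: int             # review requests issued via the bank
    requests_accepted: int
    reviews_completed: int
    median_turnaround_days: float  # request to completed review

    def acceptance_rate(self) -> float:
        # Guard against division by zero for empty periods.
        return self.requests_accepted / max(self.requests_sent, 1)


m = ReviewProcessMetrics("example-project", "2022-Q1", 120, 80, 74, 21.5)
print(f"{m.acceptance_rate():.0%} of review requests accepted")
```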
What’s needed for success
Additional technology development
Development of a core service for banking reviews (depositing and distributing tokens), supported by an analysis and reporting pipeline, and integration mechanisms (APIs, hooks, plugins) for researcher profile systems (primarily ORCID), manuscript management systems, and preprint hosting platforms.
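As one possible shape for such an integration mechanism, the sketch below uses Flask to accept a hypothetical ‘declined review’ webhook from a manuscript management system and deposit a token in response. The endpoint path, payload fields, and in-memory store are all assumptions for illustration.

```python
from flask import Flask, jsonify, request  # assumes Flask is installed

app = Flask(__name__)
BANK = []  # stand-in for the persistent token store


# Hypothetical webhook: a manuscript management system forwards a declined
# review request, and the service banks a token for the reviewer.
@app.route("/hooks/declined-review", methods=["POST"])
def declined_review():
    payload = request.get_json()
    BANK.append({
        "orcid": payload["orcid"],
        "disciplines": payload.get("disciplines", []),
    })
    return jsonify({"tokens_banked": len(BANK)}), 201
```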
Feedback, beta testing, collaboration, endorsement
- Pilot with small publication projects and generate feedback before scaling up.
- Attract endorsements and support from open science funders and societies, and others interested in developing trust and collaboration in new modes of peer review.
- Early adoption by (a) new peer review startups, (b) researchers conducting peer review experiments, (c) established publishers fielding new peer review platforms or experiments.
Funding
The proposed project is envisioned as a lightweight, tightly focused service in which most human effort is devoted to platform development and to managing relationships with open-science initiatives and projects. We estimate that roughly 1 FTE-year of development effort, plus approximately 0.5 FTE per year for relationship management and platform maintenance, would be sufficient for a production service.