Intelligent crowd reviewing of scientific papers tested

Credit: Charles Rondeau/public domain

Online chemistry journal Synlett, which is published by Thieme, has tested the idea of intelligent crowd reviewing of scientific papers. The project was the brainchild of Benjamin List, a journal editor (and researcher with the Max Planck Institute for Coal Research), and his graduate assistant, Denis Höfler. They came up with the idea as an alternative to the traditional peer review process used by most journals prior to publication.

In order to have their work published in an esteemed journal, a research team (or individual) submits a paper to the journal along with associated references. Upon submission, an editor reads the work, and if they believe it is worthy of publication, they send it off to two or three designated peers for review. If, after reviewing the work, the peers also deem it worthy of publication, the paper is accepted and the editorial team goes to work to get it ready for publication. But as many have noted, the peer review process is deeply flawed. Most glaring is the limited number of peers used. In the new approach being tested at Synlett, that number is increased dramatically.

List has spoken to the press about the endeavor, explaining how it works. First, it is not open season—a select number of reviewers are invited to participate in a closed forum environment. In the test case with Synlett, the number was approximately 100. Second, reviewers remain anonymous, allowing them the freedom to write anything they wish. Third, the reviewers are also allowed to add notes to the paper itself and are free to respond to comments and ideas made by other reviewers. The approach, List says, avoids many of the pitfalls of traditional peer review, such as wasted time (reviewers in the Synlett experiment had just a few days to respond), limited reviewer pools, the need for editors to nag reviewers to get the job done, and issues with reviewer and researcher egos. He notes that other attempts at expanding peer review to a crowd have not fared well because they allowed anonymous, often unqualified, trolls to overwhelm comment sections.
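The rules described above (a closed invite list, pseudonymous reviewers, a short response window, and threaded replies) can be modeled with a small data structure. The sketch below is purely illustrative and assumes nothing about Synlett's actual platform; all class and field names are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Comment:
    reviewer_id: str  # pseudonym only; real identities stay hidden
    text: str
    replies: list = field(default_factory=list)  # reviewers may respond to one another

@dataclass
class CrowdReview:
    manuscript: str
    invited_reviewers: set       # closed forum: roughly 100 invitees in the trial
    deadline: date               # reviewers had only a few days to respond
    comments: list = field(default_factory=list)

    def add_comment(self, reviewer_id: str, text: str, today: date) -> Comment:
        # Only invited reviewers may post, and only before the deadline.
        if reviewer_id not in self.invited_reviewers:
            raise PermissionError("closed forum: reviewer not on the invite list")
        if today > self.deadline:
            raise ValueError("review window has closed")
        comment = Comment(reviewer_id, text)
        self.comments.append(comment)
        return comment

# Example: a ~100-person panel with a short review window (dates are invented).
panel = {f"reviewer-{i:03d}" for i in range(100)}
review = CrowdReview("Manuscript A", panel, deadline=date(2017, 1, 10))
c = review.add_comment("reviewer-007", "Scheme 2 yields look inconsistent.",
                       today=date(2017, 1, 8))
c.replies.append(Comment("reviewer-042", "Agreed; see the SI, page 4."))
print(len(review.comments))
```

The deliberate design choice here mirrors the article's point: the gate is the invite list, not identity, so comments stay anonymous while trolls are kept out.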

The experiment at Synlett ran for most of last year, and List claims it was a big success. Reviewers behaved themselves, acting professionally and responsibly, and the authors of the papers reported being quite pleased with the results. Nine out of 10 papers were approved for publication. The editors at Synlett were apparently pleased as well, as they plan to expand testing of the idea.

More information: … ood-and-fast-1.22072

© 2017

Citation: Intelligent crowd reviewing of scientific papers tested (2017, June 9), retrieved 27 February 2024.
