Having grant applicants become referees could help to address the peer-review crisis in research funding. Credit: Wong Yu Liang/Getty

Around 85% of researchers who applied for grants from a German funder have given the thumbs up to ‘distributed peer review’, a new process in which applicants are asked to review other researchers’ proposals as a condition of having their own considered.

The Volkswagen Foundation in Hannover, Germany’s largest research-funding organization, is testing the process as part of its ‘Open Up’ programme, which handles grant proposals of up to €400,000 (US$447,000) for groups of two or three researchers working in the humanities and cultural studies. In June, the foundation published the results of a survey in which researchers involved in the initiative expressed optimism about distributed peer review, despite the extra workload of having to referee each other’s grant proposals for free.

Around 77% of the survey respondents said they expected that the organization would find appropriate peer reviewers for their application — those with the relevant knowledge, background and expertise to provide useful feedback. Roughly 74% said they trusted the process to be fair in giving funding to the best research, and 70% of respondents said they thought it would help to identify more adventurous grant proposals than those selected by the existing peer-review process, which is conducted by panellists appointed by the foundation.

The initiative comes at a time when finding volunteers to perform peer reviews is becoming increasingly difficult. The problem is compounded by the fact that a small number of researchers often shoulder a disproportionate share of review requests, because many qualified candidates are never contacted. Proponents of distributed peer review argue that the process could make it easier to find suitable reviewers, not least because applicants have an incentive to serve as referees: their own proposals will not be considered otherwise.

Testing the process

The European Southern Observatory (ESO) in Garching, Germany, has conducted a similar trial, using distributed peer review to assess applications for time slots on its telescopes. Described in 2020, the trial (ref. 1) found no difference in how often reviewers agreed with one another on their assessments of applications under distributed peer review compared with the conventional set-up, which uses time-allocation committees.

As a result, the ESO formally implemented distributed peer review and is satisfied with the results, says Tereza Jerabkova, an astronomer at ESO’s Observing Programmes Office in Garching. “The main driver for doing distributed peer review is the number of applications we get per semester,” she says. “Distributed peer review is not a luxury, it’s a necessity.”

For the Volkswagen Foundation, the time-saving aspect of the process was important, says Hanna Denecke, who leads the organization’s exploration funding team, which aims to fund out-of-the-box, creative and daring research ideas. Denecke says the foundation receives up to 300 grant proposals for its Open Up programme each year, around 100 of which are typically shortlisted. Funding officers distribute these applications among a panel of eight specialists, appointed by the foundation, for review. Around 10–12 proposals end up getting funded each year, says Denecke. “It’s a lot of work for us here at the foundation, and also the reviewers,” she says.

For this year’s Open Up grants — applications for which closed at the end of August — the foundation decided to try out distributed peer review to see whether it would make the process more efficient. Around 80 of 140 applications received this year were shortlisted for external peer review by the panel, Denecke says. The same 140 proposals are also going through distributed peer review, wherein each co-author of each application is refereeing four or five other grant submissions.

Denecke admits that applicants could try to ‘game’ the system by deliberately giving negative feedback, because they are all competing for the same limited number of grants. But because each proposal is reviewed by ten researchers under distributed peer review, she says, such behaviour is easier to spot.

Once the review process is over, the foundation will ask the researchers how time-consuming they found refereeing the proposals assigned to them. The early signs of acceptance and optimism among referees are promising, says Denecke.

Denecke thinks that researchers are pleased with the idea of distributed peer review, partly because they will receive written feedback on their proposals — which they do not get from the panel-based process. It’s also possible that researchers are expressing optimism because the foundation plans to award double the number of grants this year as part of the trial, she says.

Risk of competition

Andrew Preston, who is based in London and co-founded Publons — a website that allows researchers to claim credit for conducting peer reviews — agrees that distributed peer review might help to address the shortage of referees.

But he notes that the process would expose researchers to each other’s ideas, which could prompt some to launch projects that they might not have thought of otherwise, without giving credit to the researcher with the original idea. It would also increase the number of hours that researchers collectively spend conducting peer review, he says. “If a system like this kind of worked, but let’s say doubled the number of hours that humans are spending reviewing these grants, is that a good trade-off?” he asks.

Denecke says her team is planning to conduct interviews with the grant applicants to understand how they feel about these issues. With regard to the workload, “I would say that the burden of review (as long as we decide to have external reviews of scientific work/grant proposals) has to be borne by someone,” she says. “In most cases, it is borne by reviewers who work on panels and review a large number of applications. In our case, for example, the reviewers appointed to the panel reviewed about 20 proposals. And it is becoming increasingly difficult to find reviewers for these panels.”

Denecke adds that the proposals that are assessed as part of the distributed peer review process are short, around four or five pages. “Distributed peer review might not be a good choice for larger funding programmes with higher funding volumes and longer proposals,” she says.

When assessing the success of the trial, and deciding whether to roll out distributed peer review to the foundation’s other funding programmes, Denecke says her team will look at feedback from grant applicants about the feasibility, practicality and transparency of the process. She’s also keen to see whether more diverse ideas and projects are approved. “It will be interesting to see whether the process is able to identify these kinds of projects,” she says.


