
Rebecca Lawrence

Could this be the start of a new era in scholarly communication?

We need to learn from the practices adopted during the coronavirus pandemic to shape the "new normal" of scholarly communication instead of falling back into old patterns, says our Advisory Board member Rebecca Lawrence.
23 June 2020


For all the devastating impact that the coronavirus pandemic has had on us all, it has also shown how it is possible to swiftly change previously entrenched cultures and mindsets. Take, for example, the ability of a large proportion of the working population to now work just as effectively from home, without the usual significant travel and environmental impact caused by commuting and long-haul flights to meetings and conferences around the world.

Meanwhile in scholarly research, there has been increasing pressure over the past couple of decades to rethink how we review and disseminate new research: to move away from the traditional journal model that has been the central pillar of much scholarly communication over the past 300+ years, and to maximise the opportunities and potential that new technologies and approaches can bring. Some progress has been made, but the pace of change has been much slower than many had hoped. Could this pandemic be the trigger that finally enables a wholesale change in the way we conduct, communicate and discuss new research?

Accelerating research and its communication

The research world has shown that it is possible to make progress on drug and vaccine discovery [1] during this pandemic at speeds many times faster than previously achieved: through the use of infrastructures that support sharing, reuse and collaboration around data; through rapid publication and discussion/review tools (discussed below); and through much greater transparency and accessibility of new research.

We have seen a big upsurge in the use of preprint servers [2] such as bioRxiv [3], medRxiv [4] and others, to rapidly share new insights into coronavirus. Such preprint servers (e.g. arXiv [5]) were originally developed to enable researchers to receive early feedback on articles prior to journal submission and to claim priority on the findings. During the coronavirus outbreak, preprints have been increasingly used as a way to quickly share new research prior to going through peer review so that other researchers in the field can quickly assess the outputs and, where appropriate, start to build on them without the normal delay (often months) awaiting formal journal publication.

Another new approach has been the development of Outbreak Science [6] (funded by Wellcome), a platform that enables researchers (publicly or anonymously) to provide a structured review of a preprint, with the aim of providing initial community triage. A number of publishers and related groups have also come together to look at how they can maximise efficiency in peer review [7] for coronavirus-related research, to minimise direct peer review requests to already overworked coronavirus experts through effective triage prior to peer review, and to reduce re-review between publication venues, bringing cost and efficiency gains to the system.

Speed must not come at the expense of trust

Some experiences of rapid publication during the pandemic have, however, highlighted a number of inherent flaws in these processes that simply cannot be ignored. Although most preprint servers include clear warnings that the preprints they host have not been peer reviewed, there have been a few unfortunate cases where lower-quality preprints have been used to fuel fake news and to derail public debate [8]. Given the potential impact of this research on health, many preprint servers have now introduced additional checks [9] to help increase trust in the content and to minimise the potential for misuse and misinterpretation of the findings being reported.

Furthermore, most research on preprint servers does not include the underlying data, code and materials, limiting the scope for full scrutiny (including peer review) of the findings and for their reuse. Indeed, the problems with this have been brought into sharp focus by the recent retractions of two COVID-19 papers in the Lancet [10] and NEJM [11], as well as of a further preprint, all of which had started to influence treatment regimes. In each case, the lack of access to the underlying data [12] to enable independent verification of the results ultimately led to their significant claims being brought into question. This highlights not only how crucial it is that such underlying data are made available for review, but also that peer review of research needs to be open and transparent, so that it is clear to all what level of expert review has taken place, by whom, and what their comments were.

There are in fact a number of research publishing models in widespread use that are designed precisely to enable rapid publication of new findings (as a preprint does) while assuring expert and transparent peer review to support trust in, and decision-making around, an article’s potential use. F1000Research [13] developed such a publishing model for the life sciences in 2013, with a mandatory requirement that the underlying data and code are made FAIR (Findable, Accessible, Interoperable and Reusable) to support reproducibility of the findings and their use and reuse. In addition, publications can be updated as new data come in or new understanding develops, enabling the publication to track the ongoing research workflow – like a ‘living article’.

This model is now being extended to all research disciplines, and major funders around the world now have their own publishing platforms for their grantees using this same rapid and transparent publishing model, including Wellcome [14], the Bill & Melinda Gates Foundation [15], the Irish Health Research Board [16], and, later this year, the European Commission [17]. Indeed, these platforms have seen a big upsurge in submissions on COVID-19 during this time, due to the obvious benefits of this approach during such an emergency [for examples see 18, 19 and 20]. Furthermore, this model can bring considerable cost and efficiency gains: the average article processing charge on Wellcome Open Research is 67% lower than the average fee Wellcome pays to other venues for Open Access [21], and the model enables the publication of a much broader range of outputs, helping to reduce research waste.

The tip of the iceberg

As the pace of research accelerates through the use of more automated workflows and AI-based approaches, speed and efficiency in scholarly communication and review are going to become ever more crucial. We need to think carefully about the role and value of peer review, and what type of review process is most relevant and beneficial in different circumstances and contexts. For example, some outputs may require simple checklists to ensure adequate reporting against community standards; some high-volume and highly structured outputs may be better reviewed through AI-oriented approaches followed by community review. We are going to need to be smarter and more efficient about the use of the finite and ever-busier resource of practising researchers in peer review.

There are many other ‘emergencies’ beyond the current pandemic that similarly warrant an escalation in speed and transparency through rapid communication of solid knowledge to help us tackle some of the world’s greatest challenges, such as climate change, or other diseases and disorders that impact people’s lives, from more prominent fields such as cancer and mental health disorders to rare diseases. But why stop there? Many fields beyond the health sciences also need such urgency: new innovations across the physical sciences and engineering, and our understanding of sociology, geography, culture and more, all have their own significant impacts on human life and wellbeing, and deserve to benefit equally from rapid and transparent publishing approaches.

The start of a new era?

To transition more broadly to rapid and transparent publishing approaches, we will need a further cultural shift in how research and researchers are assessed and incentivised, so that they are able to use these new approaches to work more efficiently and effectively. This has already been the focus of many major initiatives, including DORA (the San Francisco Declaration on Research Assessment) [22] and the recent final report from the EC’s Open Science Policy Platform [23].

With research and higher-education institutions around the world facing a stark new reality following the coronavirus pandemic, now is the time for them to rethink how they engage in the scholarly communication of their researchers’ outputs, together with funders, policymaking organisations and research communities, to maximise the potentially significant cost and efficiency gains from the use of new tools and approaches.

The coronavirus pandemic has demonstrated the need for, and importance of, effective and collaborative ways of working, and has shown the value of processes and tools that support more rapid sharing of, and engagement with, research. We need to learn from what worked best when we were forced to focus on an urgent challenge, and not simply slip back into the old ways of doing things, but rather make sure we carry the best elements of these models forward as the ‘new normal’.


Author info

Rebecca Lawrence is Managing Director of F1000. She was responsible for the launch of F1000Research in 2013, and has subsequently led the initiative behind the recent launches of Wellcome Open Research, Gates Open Research, and many other funder- and institution-based publishing platforms that aim to start a new trajectory in the way scientific findings and data are communicated and ultimately research and researchers are evaluated.
She is a member of the European Commission’s Open Science Policy Platform, chairing its work on next-generation indicators and its integrated advice, OSPP-REC, and is a member of the US National Academies (NASEM) Committee on Advanced and Automated Workflows. She has been a co-chair of a number of working groups focusing on data and peer review, for organisations including the Research Data Alliance (RDA) and ORCID. She is also an Advisory Board member for DORA (the San Francisco Declaration on Research Assessment) and for the data policy and standards initiative FAIRsharing. She has worked in STM publishing for almost 20 years for several publishers, including Elsevier, where she built and ran the Drug Discovery Group. She originally trained and qualified as a pharmacist, and holds a PhD in Cardiovascular Pharmacology.

Digital Object Identifier (DOI)

https://doi.org/10.5281/zenodo.3903502

Cite as

Lawrence, R. (2020). Could this be the start of a new era in scholarly communication? Elephant in the Lab. DOI: 10.5281/zenodo.3903502

1 Comment

  1. I agree that crises regularly increase the rate of change, especially if such (slow) change has already been under way. However, I doubt that efficiency and speed are the most important criteria in the publishing process. Quality checking needs time and the expertise of reviewers. Faults arising from incorrect conclusions may cost society more than was saved through ‘efficient scholarly communication’. Still, I think the common peer-review process urgently needs to be reshaped.
