How reviewers can use AI right now to make peer review easier

The academic peer review process has come under a great deal of scrutiny recently. The various merits and drawbacks of anonymous and double-blind review vs. open and public review have been discussed and debated on academic forums, in conferences and on Twitter. Leaving aside claims that the ‘Blockchain’ provides a panacea for resolving issues such as trust, bias, and academic misconduct in the peer review process, how can technology assist with the mechanics of actually reviewing papers for publication?

Academics often struggle to keep up with the latest research papers and to find time to read them all, and peer review only adds to that burden. While much has been written recently about how machine learning and AI might help automate parts of the peer review process from the point of view of publishers and conference organisers, less has been said about how the process could be made easier for reviewers themselves.

Proposals to apply AI to improving the review process include:

  • Automatically finding and selecting experts to assemble balanced program committees, assigning papers to the right reviewers, and calibrating reviewer scores to reduce bias [1]

  • Optimising editorial workflows [2]

  • Screening submitted manuscripts for plagiarism, incompletely described methods, invalid conclusions, and missing or synthesised data [3]

Some existing manuscript submission systems, such as Elsevier’s EVISE system, already perform some of these tasks.

However, beyond this, commentators have leaped towards a futuristic scenario where AI might automate the whole process through an omnipotent ‘RoboReviewer’ – despite the fact that the level of natural language understanding required for such a feat is many years away [4]. In the near term, it might at least be possible to verify citations automatically and find research closely related to the paper under submission [5]. But discussions remain largely hypothetical. Right now, what tools are available to reviewers that might at least relieve some of the burden, and reduce the amount of time taken to carry out a thorough peer review?

Recently, I was asked to review a submission for an eHealth journal. The ‘old school’ way of reviewing a paper, which I’ve used in the past, involves printing off the manuscript, separately printing out the figures and tables, and then flipping back and forth between sections, while taking marginal notes as I work through the paper. But this wasn’t feasible on this occasion. It was during a busy time for me, and I received the manuscript PDF while I was travelling and working in different locations, so I needed to be able to do the review electronically using the devices I had with me.

Having recently developed Scholarcy, which aims to help improve the document reading and understanding process, it seemed reasonable to ‘eat my own dogfood’ and use it to help me review the paper and write up my report. In this post, I describe how the process went. In the accompanying screenshots, I’ve redacted details to preserve anonymity, as the paper is still under review.

Scholarcy digests the author manuscript and creates an interactive flashcard with an overall summary, expandable/collapsible sections, figures, tables, and direct links to the cited sources. Normally, Scholarcy provides only snippets of each section of the original paper, but for peer review it has a ‘special setting’ that extracts the full text of each section into the flashcard. Reviewers can uncheck ‘extract snippets’ to enable this feature:

Scholarcy’s ‘extract snippets’ setting - unchecking it allows processing of the full text of each section

Converting the manuscript into this flashcard format makes switching back and forth between sections much easier, particularly while reading and annotating the paper on your smartphone. This is also useful for reviewing figures, as you can click on the callout in the text to go straight to the figure, and then zoom in on the full-resolution image to check the detail.

Scholarcy generates a review flashcard from the author’s manuscript (redacted to preserve anonymity)

I found Scholarcy’s AI-generated summary particularly useful as a point of comparison with the author’s abstract: the abstract read more like a promotional piece, while the summary more accurately reflected the actual content of the paper. It gave me a good overview of the author’s research, methods and findings, setting the scene – and my expectations – before I took a deep dive into the full text.

The paper contained a lot of non-standard abbreviations specific to the study. Normally I would need to write down the definitions or keep flipping back to the first occurrence as a reminder, but Scholarcy expands all abbreviations in the text, which made reading and comprehension much smoother. The background reading list it generated also provided context for some terminology I was unfamiliar with.
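Scholarcy’s own expansion logic isn’t public, but the general technique can be sketched in a few lines: spot definitions of the form ‘full term (ABBR)’, then expand later bare occurrences of the abbreviation. A toy Python illustration (the regex and matching heuristics here are my own assumptions, not Scholarcy’s implementation):

```python
import re

def expand_abbreviations(text):
    """Find definitions of the form 'full term (ABBR)' and expand later
    bare occurrences of ABBR back to the full term."""
    definitions = {}
    # Capture up to seven words followed by a parenthesised acronym,
    # e.g. 'electronic health record (EHR)'.
    for match in re.finditer(r'\b((?:\w+[ -]){1,6}?\w+)\s+\(([A-Z]{2,})\)', text):
        phrase, abbr = match.group(1), match.group(2)
        # Keep only the trailing words whose initials spell the acronym.
        words = re.split(r'[ -]', phrase)
        candidate = words[-len(abbr):]
        if ''.join(w[0] for w in candidate).lower() == abbr.lower():
            definitions[abbr] = ' '.join(candidate)
    # Expand standalone later uses; skip the defining '(ABBR)' occurrence,
    # which is immediately followed by a closing parenthesis.
    for abbr, full in definitions.items():
        text = re.sub(r'\b%s\b(?!\))' % re.escape(abbr),
                      '%s (%s)' % (full, abbr), text)
    return text
```

A real system would need to handle reversed definitions (‘ABBR (full term)’), plurals, and abbreviations whose letters don’t line up neatly with word initials; this sketch only shows the core idea.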

Scholarcy’s rich text notes feature supports copying and pasting phrases from the paper – including the generated hyperlinks to cited sources – which was helpful when clarifying my thoughts and creating the structure for my review.

Probably the most useful feature for peer review is the way Scholarcy generates direct links to cited sources from their callouts in the text. If the paper is written in LaTeX, links to the relevant bibliography items may already be in place, but not links to the full text of the cited sources themselves. Being able to check the author’s citations as I read – without having to stop, paste the reference into Google Scholar, and hope for the best – was an absolute boon.
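Scholarcy’s linking method isn’t documented here, but as a rough illustration of how a free-text reference can be resolved to a link, here is a sketch using Crossref’s public REST API (the `/works` endpoint and its `query.bibliographic` parameter are real Crossref features; using them this way is my assumption about one possible approach, not Scholarcy’s actual implementation):

```python
import urllib.parse

CROSSREF_API = "https://api.crossref.org/works"

def citation_query_url(reference, rows=1):
    """Build a Crossref bibliographic-search URL for a free-text reference."""
    params = urllib.parse.urlencode(
        {"query.bibliographic": reference, "rows": rows})
    return f"{CROSSREF_API}?{params}"

def best_doi(crossref_response):
    """Pull the top-ranked DOI out of a parsed Crossref /works JSON response."""
    items = crossref_response.get("message", {}).get("items", [])
    return items[0].get("DOI") if items else None
```

Fetching `citation_query_url("Price S, Flach P (2017) Computational Support for Academic Peer Review")` and passing the parsed JSON to `best_doi` should yield a DOI such as 10.1145/2979672 (reference [1] above), from which a `https://doi.org/...` link can be built.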

Scholarcy creates direct links to cited sources (text redacted to preserve anonymity)

Finally, Scholarcy exports the flashcard to Word, so you can combine it with your notes and edit it into the basis of your review report.

Overall, Scholarcy provided an overview of the manuscript which I could easily read and explore on my phone while travelling, while allowing me to switch to a more in-depth view on my laptop for checking the details, citations, and writing up notes. As a result, I probably carried out a more thorough review in less time, as all the information I needed was at my fingertips.

Which tools do you use to help with reviewing papers? Have you considered Scholarcy for peer review? If you’d like to try it out for this task, and are willing to feed back on how useful you found it, please get in touch!

References

[1] Price S, Flach P (2017) Computational Support For Academic Peer Review: A Perspective from Artificial Intelligence. Communications of the ACM, March 2017, Vol. 60 No. 3, Pages 70-79. DOI: 10.1145/2979672

[2] Mrowinski MJ, Fronczak P, Fronczak A, Ausloos M, Nedic O (2017) Artificial intelligence in peer review: How can evolutionary computation support journal editors? PLOS ONE 12(9): e0184711. https://doi.org/10.1371/journal.pone.0184711

[3] Sheridan (2017). Is There a Place for Artificial Intelligence in Your Peer Review Process? Available from: http://www.sheridan.com/journals-on-topic/artificial-intelligence-peer-review

[4] Stockton N (2017) If AI can fix peer review in science, AI can do anything. Available from: https://www.wired.com/2017/02/ai-can-solve-peer-review-ai-can-solve-anything/

[5] Leetaru K (2018) Could AI Help Reform Academic Publishing? Available from: https://www.forbes.com/sites/kalevleetaru/2018/06/14/could-ai-help-reform-academic-publishing/