In this guest post, Michael Casp and Anna Jester look back at the ALPSP 2022 Annual Conference and Awards.
Peer review in the digital age relies heavily on email and web-based peer review platforms to contact reviewers and organize recommendations. In many ways, peer review today simply recreates the pre-internet template of mail, fax, and file cabinet, transplanted online. With current advances in preprints, social media, and communication platforms, it is possible – even likely – that the model of peer review, and the technology that supports it, have more evolutions to come.
As communication begins to move beyond traditional text formats, so does content. Getting “beyond the PDF” has been a staple topic at scholarly publishing conferences for years, and as such content becomes more commonplace, we are navigating the demands it places on the peer review process.
But it’s not just about technology. We must also focus on developing and empowering the next generation of reviewers in order to maintain a robust and sustainable reviewer pool. That means teaching and mentoring early-career academics, which can yield a technology-versed, diverse pool of reviewers ready to keep using the technologies we develop.
At the ALPSP 2022 Conference we were treated to many innovative ideas that will have a direct impact on peer review, potentially changing it – and hopefully improving it – for researchers and publishers, and ultimately providing better value for society.
Beyond Email
Peer review’s reliance on email is a given. That is, unless you’re in China, where the app WeChat has in many ways supplanted email as the default communication system. Western publishers seeking to engage with Chinese researchers may struggle if they rely on email alone. However, Charlesworth presented the ALPSP audience with a possible solution: its new product, Gateway, which uses an API to deliver journal-related communications to authors, reviewers, and editors via WeChat. This allows journals to meet Chinese researchers where they are, rather than trying to pull them into a (let’s face it, antiquated) email system.
Preprint servers, meanwhile, have started to become social hubs where researchers connect in more of a real-time environment than traditional publishing tools allow. Building on this shift, eLife presented the ALPSP audience with Sciety, a new preprint evaluation platform that allows academics to review, curate, and share preprints and reviews with their peers. This system holds the promise of opening up peer review and publishing to a wider user base, allowing more people to curate and review research than ever before. The challenge presented by the sheer scale of preprints is immense, and Sciety has the potential to reorganize how we deal with all this research in a social-focused way.
One more data point to mention: with the pervasive use of social media throughout the world, it is no surprise that academics would have their own version. Despite controversies, ResearchGate has maintained its position as the largest academic social network, connecting about 17 million users. With this many scholars connected, it’s possible we could see something like peer review networks emerge, though that doesn’t seem to be ResearchGate’s focus at the moment.
Beyond the PDF
A decade or so ago, getting “beyond the PDF” was still a new idea to speculate about at conferences. It is now a reality, with authors providing datasets, code, and detailed methods notebooks alongside their traditional manuscripts. As a partner to those authors, we’ve come up with ways to publish this content, but it can present special problems during peer review.
For starters, journals that employ an anonymized review model can find it quite difficult to remove identifying information from complex content like code or reproducible methods. Sometimes an author’s institution is inextricably linked to this content, making anonymization impossible – or at least impractical.
Other forms of content, like ultra-high-resolution images, can present logistical problems. Industry-standard peer review management systems have hard limits on the size and format of the files they can manage. Fields like pathology, for example, rely on extremely detailed images of microscope slides, and these multi-gigabyte files are hard to move from author to editor to reviewer. Paleontology research can also require larger-than-usual images, as sharing highly detailed photos of specimens is crucial to the field. Without a more flexible solution, dealing with these kinds of challenges at the peer review stage demands a lot of creativity and patience from everyone involved.
Massive datasets can also present review challenges. Beyond the logistics of moving large files, there are often more basic concerns: Is this data organized and labeled in a useful (and reusable) way? Is it actually possible to do a real review of a large dataset in the time reviewers can give to a paper? FAIR data principles are aimed at answering some of these questions, and services like Dryad and Figshare help by curating and quality-controlling datasets, ensuring they meet basic standards for organization, labeling, and metadata. But these services come with an additional cost that not everyone can bear, and a data review still depends on a reviewer willing to go the extra mile to actually review it.
Moving peer review beyond the PDF is still a work in progress, but many of these problems will become solvable as our technology and internet infrastructure improve. Our J&J Editorial staff regularly handle content like videos, podcasts, and datasets, and at eJournalPress, our platform is integrated with third parties including Dryad, Code Ocean, and Vimeo. These integrations are an added convenience, though most journals and societies need direct agreements with the third parties for the integrations to be fully utilized. But we often have to work around the peer review system, rather than with it, relying on basic cloud file servers (e.g., Dropbox or OneDrive) instead of more purpose-built technology.
Open/Pre-submission Review
Another decade-old conference trope was the
constant talk about new open peer review models. You might recall that people
were split on the wisdom of this approach, but the rise of preprints has done a
lot to push open peer review and pre-submission review into the limelight.
Organizations like Review Commons are working with eJournalPress to make pre-submission peer review a viable choice for authors by building review-transfer relationships with more traditional journals. The Review Commons model is to take preprint submissions and have them peer reviewed; those reviews can then be shared with participating journals if the authors choose to submit. Journal editors can use the existing Review Commons reviews to decide whether or not to publish the work. In data presented at ALPSP 2022, manuscripts that came to journals with reviews attached rarely needed to be sent out for further review. This has many benefits, saving editors the time spent soliciting reviews and giving a journal’s (probably over-taxed) reviewer pool a little break.
Review Commons is currently free for authors, supported by a philanthropic grant. It will be fascinating to see whether it can pivot to a sustainable financial model in the future.
We won’t exhaust you with the long list of
other open peer review initiatives, but suffice it to say, a lot of smart people
are working hard on making this a standard part of research communication.
Developing the Next Generation of Reviewers
None of what we’ve written so far will matter one iota if there aren’t enough people in place to do the actual content reviews. One of the interesting revelations we had while managing journal peer review was the incredible range in review quality. Reviews run from in-depth, multi-page discussions of every point and nuance of an author’s manuscript to the dreaded “Looks good!” one-liner, and anyone in peer review can tell you that we can (and must!) do a better job of training our reviewers. We can offer guideline documents and example reviews, but some people need a more engaging approach to understand and deliver what editors expect.
It would be lovely if reviewing were a required part of the science curriculum. It currently seems to happen in a piecemeal, ad hoc fashion, often driven by people willing to just figure it out for themselves. A more standardized approach is called for, especially as reviewable content becomes more complex and varied.
One of the best examples we’ve seen of reviewer training was actually a writers’ workshop for researchers wishing to submit to a medical journal. The journal’s editor-in-chief (EIC) led the workshop, asking authors to submit their manuscripts ahead of time to serve as examples. During the workshop, the EIC talked through several of these manuscripts, giving the authors invaluable feedback and what amounted to a free round of review prior to official journal submission.
Though the workshop was ostensibly for authors, it was equally valuable for reviewers. Participants got to watch the EIC go through a paper in real time, ask questions, pose solutions, and talk through the subject matter with someone who had written hundreds of manuscripts and reviewed thousands. This program has always stood out to us as a great way to train both authors and reviewers, while also building the relationship between the journal and its community. Win-win-win!
Formal peer review training benefits all parties; the Genetics Society of America’s program, for example, also includes direct feedback from journal editors. If you’re thinking of implementing something like this, your organization may wish to run a pilot before a full rollout. Another great model is to pair mentors with mentees, which simultaneously provides training and broadens the pool of well-trained peer reviewers in the field. If your team is willing to study the results of your reviewer training efforts, be sure to submit them to the next International Congress on Peer Review and Scientific Publication so that we can all benefit from your findings.
Demographics and Diversity
Many of the journals and publishers we work with are prioritizing diversity within their communities by making efforts to reach people who have historically been left out of the conversation. These organizations are also looking inward to assess their current level of diversity in order to improve it.
Many organizations have begun collecting
demographic information regarding their authors, reviewers, and editors. We
recommend a thoughtful approach when embarking on this project, as it can be
fraught with pitfalls and unexpected consequences if you don’t get it right.
Before your organization embarks on this endeavor, consider best practices for data collection and clearly define the initiative’s goals. Wondering where to start? Do a landscape scan of what other organizations aim to do with this data, and use standardized questions for self-reported diversity data collection.
Fortunately, many people are working on demographic data initiatives, and there are plenty of ideas and support available from our community.
Summary
To put it mildly, there is a lot going on
right now. The technology we use has the potential to upend the traditional
research communication process, and in some cases (like preprints) it already
has. With a host of new data, content, and equity concerns, people involved in
the peer review process have more to deal with than ever before. And it’s unlikely
that we’re doing enough to equip them with the knowledge and training they need
to succeed. But we can do better, and we’re heartened to see the many people in and around our industry who are trying to improve the situation. From our end, eJournalPress is supporting societies and journals as they work to collect and evaluate demographic information and metadata, and J&J Editorial is always investigating ways to support journal innovation through a combination of technology and experienced staff.
We often think about peer review in the
context of that old Churchill quote about democracy: “It has been said that
democracy is the worst form of government, except all of those other forms that
have been tried from time to time.” Peer review might not be the best method of
scientific evaluation, but it’s the best we’ve got, and who knows, maybe we’ll
make something even better. But until then, we’ve got work to do.
Anna Jester, VP of Marketing and Sales, eJournalPress, Wiley Partner Solutions
Michael Casp, Director of Business Development, J&J Editorial, Wiley Partner Solutions
Wiley Partner Solutions was a gold sponsor of the ALPSP Conference and Awards, held in Manchester, UK, in September 2022. The 2023 ALPSP Conference will be held in Manchester from 13 to 15 September 2023.
Wiley is one of the world’s largest publishers and a global leader in scientific research and career-connected education. Founded in 1807, Wiley enables discovery, powers education, and shapes workforces. Through its industry-leading content, digital platforms, and knowledge networks, the company delivers on its timeless mission to unlock human potential. Visit us at Wiley.com. Follow us on Facebook, Twitter, LinkedIn and Instagram.