Thursday, 29 August 2019

1st Basel Sustainable Publishing Forum—Dialog with Learned Societies: Sustainable Solutions for Successful Transition to Open Access

Academic publishing has undergone many changes in the last two decades. Chief among these is the emergence of open access, which has greatly affected the delivery of research to readers and the status of authors, in addition to publisher revenue streams. The current publishing landscape can appear very complex, with a variety of strategies implemented to achieve open access. These include hybrid journals, the deposit of draft or accepted versions of articles, embargo periods, the levying of article processing charges, and the recruitment of benefactors to fund free-to-author open access options.

Learned societies are often relatively small publishers and can be under pressure during times of change. Many rely heavily on publishing revenue to fund valuable services for researchers in their field, and publishing often fulfils a key part of their mission to promote knowledge. Since the emergence of Plan S in Europe, there have been suggestions that some societies face closure if they are forced to convert to open-access-only publishing modes. Is this really the case or are there opportunities hidden within these daunting challenges that could be explored? What are learned societies most concerned about and where do they have the most to contribute to the new publishing landscape?

The MDPI Sustainability Foundation is hosting a forum to bring together representatives from the industry, including learned societies, librarians, and open access publishers, to survey the current status and explore challenges facing the sector. The goal is to ensure a vibrant and diverse landscape within academic publishing. The title of the day is "1st Basel Sustainable Publishing Forum - Dialog with Learned Societies: Sustainable Solutions for Successful Transition to Open Access", and we will hear from representatives of learned societies in order to understand the challenges they face in transitioning journals to open access. A number of publishing experts will also share their views on open access journals and how the current climate might affect learned society publishers. A key point will be how the requirements of Plan S will affect the stability of the research evaluation system, academic publishing, and researchers themselves.

The day will mix keynote talks with discussion sessions. Confirmed speakers include:

A representative of cOAlition S, who will present the details of Plan S and the guidelines and timeline for its implementation.

Alicia Wise, who will present potential strategies and business models through which learned societies can transition to an open access landscape and adapt to Plan S. These recommendations arise from consultancy work carried out by Information Power and commissioned by Wellcome, in partnership with UK Research and Innovation (UKRI) and the Association of Learned & Professional Society Publishers (ALPSP).

Jan Erik Frantsvåg, who will present a detailed overview of and perspective on the current state of the open access landscape.

Saskia de Vries, who will present the principles of fair open access in the context of Plan S and their relevance to societies.

The day will be opened by Antonio Loprieno, President of the Swiss Academies of Arts and Sciences, President of the European Federation of Academies of Sciences and Humanities, and former Rector of the University of Basel.

A full program and registration can be found online at https://sustainablesolutionstoopenaccess.sciforum.net

We are looking forward to a vibrant day of discussion and debate. The talks will be recorded and made available online after the event.

Spotlight on BMJ Best Practice and 67 Bricks - shortlisted for the 2019 ALPSP Awards for Innovation in Publishing

On 12 September, at the ALPSP Conference, we will be announcing the winners of the 2019 ALPSP Awards for Innovation in Publishing.  In this series of posts leading up to the Awards ceremony, we meet our finalists and get to know a bit more about them.


First of all, we hear about BMJ Best Practice and 67 Bricks.


BMJ Best Practice
BMJ is a healthcare knowledge provider that advances healthcare worldwide by sharing knowledge and expertise to improve experiences, outcomes and value. BMJ Best Practice is a generalist point-of-care tool particularly useful for junior doctors, multidisciplinary teams, specialists working outside their specialty, and GPs. It is uniquely structured around the patient consultation, with advice on symptom evaluation, test ordering and treatment approach for over 1,000 conditions across 30 specialties. BMJ Best Practice provides a rich source of expertise that healthcare professionals rely on every day.

67 Bricks are a software development consultancy who help publishers deliver information products for the data-driven world. We give publishers control, flexibility and agility so that they can deliver the compelling user experiences that their customers increasingly demand. Our custom-built solutions enable publishers to increase the value of their content, support existing and new business models, enable better reuse of content and deliver increased revenues from digital products.


What is the project that you submitted for the Awards?

At BMJ Best Practice we wanted to be more innovative with our product development pipeline and user experience in line with our users’ changing needs and expectations. Clinicians increasingly want concise answers at the point of care rather than long-form reference text, but our content was being created, stored and presented as monolithic articles typically extending to thousands of words. This made it difficult to improve the user interface and limited our ability to slice and dice content to power new products or to deliver our content to third parties for integration into granular software systems, for example Electronic Health Record (EHR) systems.

We therefore took a strategic decision not only to relaunch the product, but to build a completely new editorial production system and, crucially, to reinvent the underlying data structure of the content to make it more granular, flexible and reusable. To help us achieve this we partnered with publishing technology experts 67 Bricks. The success of the project has enabled us to satisfy current market demands, create new market opportunities, upskill our technology team and provide a springboard for further product innovation.



Tell us a little about how it works and the team behind it


The most visible outcome of the project is our relaunched point-of-care tool, BMJ Best Practice. It is significantly more user-focused and we have introduced many new features. For example, users now have the ability to:
  • switch languages
  • receive important updates on the latest evidence changes
  • access 400+ calculators
  • watch practical videos of common procedures.
We have seen 202% growth in traffic, a 39% year-on-year increase in revenue and a 95% retention rate.



In addition, our internal staff now have a much-improved content production system, which is responsive, collaborative, quicker to use and supported by a powerful faceted search function. As a result of this, and our new ability to integrate with third parties, we have been able to revolutionise our content translation process, leading to an 80% increase in capacity and a 40% reduction in costs.

The achievement we are probably most proud of is that our content is now structured and managed in a much more futureproof way. As such, we are in a better position to continue to innovate on our existing products, build partnerships and integrate with third-party systems. A lot of intelligent work went on “behind the scenes” in order to achieve this, for example:
  • We created a new content model, composed of individually referenceable fragments of content (e.g. assessment, diagnosis), which was refined iteratively throughout the project. More standardised than the old model, it gives us the scope to reuse, repackage and serve up our content in different ways.
  • A Knowledge Base API was built to allow the new website to retrieve the content to power innovative front-end features such as enhanced search capability and medical recommendations. This API can also be used by BMJ or new partners to power future products (a rough sketch of how a client might call such an API appears after this list).
  • Content enrichment and entity recognition were used to add significantly more value to our existing content, for example to identify drug names and diagnoses.
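
To make the Knowledge Base API idea more concrete, here is a minimal sketch of how a client might request a single content fragment rather than a whole article. The endpoint, URL structure and response fields are illustrative assumptions for this post, not BMJ’s actual interface.

```python
# Minimal sketch: fetching one individually referenceable content fragment
# from a hypothetical Knowledge Base API. The host, paths and JSON fields
# are assumptions for illustration only.
import requests

BASE_URL = "https://api.example.org/knowledge-base"  # hypothetical host


def get_fragment(condition_id: str, fragment_type: str) -> dict:
    """Fetch a single fragment (e.g. 'diagnosis') for a given condition."""
    response = requests.get(
        f"{BASE_URL}/conditions/{condition_id}/fragments/{fragment_type}",
        headers={"Accept": "application/json"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    # e.g. pull only the 'diagnosis' fragment to embed in an EHR panel
    fragment = get_fragment("example-condition", "diagnosis")
    print(fragment.get("title"), fragment.get("lastUpdated"))
```

Serving fragments in this way is what allows the same content to power the website, third-party EHR integrations and future products from a single source.
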
The joint BMJ and 67 Bricks team behind the project consisted of technology, content, product and market experts. Agile methodology and the principles of user-led ‘Pragmatic Marketing’ were used to ensure that customer problems were addressed first and foremost. BMJ and 67 Bricks’ developers worked together in an Agile way, as one co-development team, developing, prototyping, testing and gathering and responding to user feedback ‘on the go’. This was facilitated through heavy use of online communication channels, daily joint stand-ups and reciprocal code review. The team were able to talk openly about how to implement features and a shared understanding of the business drivers meant that everyone on the team was working towards a common goal. Co-developing the solution in this way has ensured maximum knowledge transfer to BMJ’s developers who are now well placed to continue to maintain and extend Best Practice to meet future needs.

What are your plans for the future?


Longer term we are in a much improved position. Our content is now more flexible, granular and standardised, which means we have the opportunity to innovate and build commercial partnerships in a way we weren’t able to do before. 

 

The impact of the new, revamped product and the competitive advantages that our improved content and data capabilities have brought have already been significant. Our customer retention rate has exceeded 90% and we have three major product development projects in the pipeline. We are actively pursuing opportunities for Best Practice to link with Electronic Health Record (EHR) systems and we have been able to tender for content-related business opportunities that were previously out of reach, with clients such as NHS England, Wales and Scotland.
[Photo: Chris Wroe]

Chris Wroe, MB BChir, is a health informatician at BMJ, ensuring BMJ’s healthcare content can be integrated into the heart of the clinical workflow. He is a qualified medical doctor with 18 years’ experience in bio-health informatics and a special interest in biomedical ontologies.

[Photo: Isaac Menso]


Isaac Menso is the Product Development Manager for the BMJ Knowledge Centre, working to create new products and develop existing products that solve market problems. He is passionate about building innovative digital solutions that help people.



[Photo: Jennifer Schivas]
Jennifer Schivas is Head of Strategy and Industry Engagement at 67 Bricks, a software development consultancy that helps publishers deliver data-driven information products. She has previously held roles at Oxford University Press, Taylor & Francis and Intellect Books.

Twitter: @BMJBestPractice and @67bricks
        
See the ALPSP Awards for Innovation in Publishing 2019 finalists at the ALPSP Conference on 11-13 September where the winners will be announced. 

The ALPSP Awards for Innovation in Publishing are sponsored by MPS Ltd.

Tuesday, 27 August 2019

Plan S and its Impact on the Scholarly Publishing Community


[Logo: MPS]

MPS, sponsor of the ALPSP Awards for Innovation in Publishing 2019, offers this insight into the impact of Plan S on the industry.


“Starting from January 1, 2021, all scholarly publications that result from research funded by public grants or private grants must be published in Open Access Platforms or Journals, or made available in Open Access Repositories without any embargo period.”

Gaining momentum and attracting feedback from researchers, publishers, funding organizations, and others involved in scholarly publishing, Plan S has been a topic of global debate since its launch in September 2018; that feedback led to a revised version released by cOAlition S on 31 May 2019.

If successfully implemented, Plan S will allow unrestricted use and access to publicly funded research and redefine the ways scholarly publications are published, read, and shared. Are you ready for this change?

The absence of a detailed structure and of concrete steps to achieve the desired outcome has created considerable confusion. Various technical aspects of Plan S, including compliant publication venues and transition agreements, are not well defined. Though Plan S implementation is supported by organizations such as the European Research Council (ERC) and other funding bodies, some publishers believe that it is an impractical solution without robust guidelines. This leaves researchers with meager or no additional financial support, and many publishers are concerned about the impact on future revenue.

Taken together, the feedback and comments received on the Plan S implementation guidance reveal that many individuals and organizations support Plan S’s goals. In line with that broader aim, publishers across the globe are preparing to create a sustainable, openly accessible scholarly publication system.

So what does it mean for different stakeholders?

The noble cause of making publicly funded research freely available to the public is generally feasible for researchers who are supported by grants for their research and allied expenses. However, not all research fields are uniformly supported by funding agencies, which places a further burden on the publishers serving less well-funded fields.

Publishers have major concerns about Plan S’s prohibition of publication in hybrid journals and various other subscription models. This is a major downside, especially for small and medium-sized scholarly publishers, who anticipate that the projected revenue loss will hit them hardest. It also increases the pressure, to put it in its best light, to innovate in finding additional sources of revenue for publishers already struggling to make ends meet.

With the objective of supporting publishing and content sharing in the era of open access, platform providers offer solutions that cater to the entire publishing life cycle. These platforms are flexible and agile, and can be customized to suit the requirements of modern publishers. Backed by an innovative and knowledgeable approach, these solutions offer customer-centric interfaces and a superior operational experience to publishers and authors. As an example, with guidelines, processes, and pricing exposed in a transparent Platform as a Service environment, APC pricing can be kept transparent as authors pick and choose the authoring services and tools they can afford.

About MPS Limited

[Logo: MPS]

MPS platforms help solve complex business problems through intuitive and scalable technology. Our Platform as a Service (PaaS) offerings are powered by domain expertise, scalable architecture, and highly secure hosted environments. These offerings include Content Workflow and Production, Content Management, Hosting and Delivery, Usage Analytics, and Custom Development. Strengthening and innovating our platforms in line with OA objectives, we also cater to the entire publishing value chain, from manuscript submission through production to content delivery.


See the ALPSP Awards for Innovation in Publishing Finalists lightning session at our Annual Conference on 11-13 September, where the winners will be announced.

Tuesday, 20 August 2019

Spotlight on Ripeta - shortlisted for the 2019 ALPSP Awards for Innovation in Publishing

On 12 September we will be announcing the winners of the 2019 ALPSP Awards for Innovation in Publishing. In this series of posts leading up to the Awards ceremony we meet our four finalists and get to know a bit more about them.
[Logo: Ripeta]
First question has to be: Tell us about your company

Ripeta was founded by a team of three science-related experts who joined forces whilst working at Washington University in St. Louis. Leslie, Anthony and Cynthia, like many of their colleagues, were increasingly burdened with the task of improving science without having the resources to do so. They found that much of the curated data they produced at a data center was used but not cited.


They quickly became frustrated and saw the need for a tool and practices to remedy the situation, and the idea for Ripeta was born. The Ripeta software was developed to assess, design, and disseminate practices and measures to improve the reproducibility of science with minimal burden on scientists, starting with the biomedical sciences. In 2017, Ripeta launched its first alpha product.

What is the project that you submitted for the Awards?

Main features and functions of Ripeta

Ripeta is designed for publishers, funders, and researchers. We provide a suite of tools and services to rapidly screen and assess manuscripts for the proper reporting of scientific method components. These tools leverage sophisticated machine-learning and natural language processing algorithms to extract key reproducibility elements from research articles. This, in turn, shortens and improves the publication process while making the methods easily discoverable for future reuse.

Ripeta Software: Our software is built on a reproducibility framework that includes over 100 unique variables grouped into five categories, which highlight important scientific elements and feed into a report:
  1. Study Overview
  2. Data Collection
  3. Analytics
  4. Software and Code
  5. Supporting Material (new)
This report is generated in seconds, providing immediate feedback.
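
To give a feel for what an automated screening pass produces, here is a deliberately simplified sketch of a report along the five categories above. Ripeta itself uses machine-learning and natural language processing models; the keyword rules below are illustrative assumptions only, meant to show the shape of the output rather than the real method.

```python
# Toy reproducibility screen: flag whether each reporting category appears
# to be addressed in a manuscript. The categories mirror the Ripeta report;
# the keyword patterns are simplified stand-ins for Ripeta's ML/NLP models.
import re

CATEGORY_CUES = {
    "Study Overview": [r"\bhypothes[ie]s\b", r"\bstudy design\b"],
    "Data Collection": [r"\bdata (?:were|was) collected\b", r"\bsample size\b"],
    "Analytics": [r"\bstatistical analys[ie]s\b", r"\bregression\b"],
    "Software and Code": [r"\b(?:R|Python|SPSS|Stata)\b", r"\bcode is available\b"],
    "Supporting Material": [r"\bsupplementary\b", r"\bdata availability\b"],
}


def screen_manuscript(text: str) -> dict:
    """Return, per category, whether any reporting cue was detected."""
    return {
        category: any(re.search(p, text, re.IGNORECASE) for p in patterns)
        for category, patterns in CATEGORY_CUES.items()
    }


if __name__ == "__main__":
    manuscript = (
        "Data were collected from 120 participants. Statistical analyses "
        "were performed in R. Code is available on request."
    )
    for category, found in screen_manuscript(manuscript).items():
        print(f"{category}: {'detected' if found else 'missing'}")
```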

[Image: Ripeta report]

Portfolio Analysis: While many publishers have adopted checklists to evaluate reproducibility reporting criteria, the standard practice is for editors or reviewers to manually assess a document’s adherence to guidelines. This review process is neither quick nor consistent. Ripeta offers a critical addition to the scientific publishing pipeline while significantly reducing the time to review and assess manuscripts. We look across a group of articles based on your criteria and provide an overview, comparisons, and suggested improvements. Do you want to know how your journal or organization is performing overall? Want to know the reporting practices of your grantees? These reports provide insights into reproducibility practices.
[Graphic: Portfolio analysis]

Tell us about the team 

Leslie McIntosh, PhD, is the founder and CEO. She has led multi-million-dollar projects building software and services for data sharing and reuse. Leslie focuses on assessing and improving the full research cycle and making the research process reproducible.


Anthony Juehne, MPH, is the Chief Science Officer, with specialist skills in epidemiology and biostatistics. His current work focuses on developing best practices for conducting and reporting clinical research to enhance reproducibility, transparency, and accessibility.



Cynthia R. Hudson Vitale, MLIS is the Chief Information Scientist. She has worked with faculty on projects to facilitate data sharing and interoperability while meeting faculty research data needs throughout the research life-cycle. Her current research seeks to improve research reproducibility, addressing both technical and cultural barriers.

In what ways do you think Ripeta demonstrates innovation?

Research reproducibility is increasingly important in the scholarly communication world, yet researchers, publishers, and funders do not have a streamlined method for assessing the quality and completeness of the scientific research.

Ripeta aims to make better science easier by identifying and highlighting the important parts of research that should be transparently presented in a manuscript and other materials. By detecting and predicting reproducibility in scientific research, we provide a “credit report” for scientific publications. Our aim is to improve science and ensure resources are well spent, and we offer both a pre-peer-review assessment of a paper and a post-publication report.

The Ripeta solution is unique because we have identified the components across scientific fields necessary to responsibly report a scientific process. We are using machine learning and natural language processing to programmatically extract these components from scientific manuscripts and present them in a user-friendly report.

Ripeta helps save time and money, and improve reputations.

For publishers, Ripeta offers a quick assessment of a submitted manuscript. Reviewers are hard to find, and those who can be found are experts only in specific areas. Ripeta allows a rapid check of the important elements that should be in the manuscript and presents a report for editors and reviewers. The benefit is measured in the time taken to complete a pre-review.

For researchers, Ripeta offers a means to improve the manuscript. By rapidly reviewing the manuscript, the report will highlight which elements are missing and which elements have been found, in a machine-readable manner. This gives researchers the opportunity to improve their manuscript before submission. The benefit is measured through Ripeta report completeness from first to final submission.

What are your plans for the future?

The Ripeta go-to-market strategy is focused on developing tools under a subscription model compatible with the needs of publishers and funders, where users can assess a single publication. Subscribers will be charged a fee per report, with an optional Ripeta software enterprise edition for robust analytics and bulk manuscript evaluation. We are currently engaging and conducting pilots with multiple publishers, universities, and researchers.

Our long-term goals include developing a suite of tools across the broad spectrum of sciences to understand and measure the key standards and limitations for scientific reproducibility across the research lifecycle, and to enable an automated approach to their assessment and dissemination. We also plan to:
  • Enable researchers to upload a single manuscript at a time at no charge;
  • Work with preprint services (e.g., bioRxiv), which could charge to have a ripetaReport linked to the preprint; and
  • Work with large research and development firms, who would purchase enterprise installations for private hosting.
Website: https://www.ripeta.com/
Twitter: https://twitter.com/ripeta1

See the ALPSP Awards for Innovation in Publishing Finalists lightning sessions at our Annual Conference on 11-13 September, where the winners will be announced.

The ALPSP Awards for Innovation in Publishing 2019 are sponsored by MPS Ltd.







Tuesday, 13 August 2019

Spotlight on preLights - shortlisted for the 2019 ALPSP Awards for Innovation in Publishing

We will be announcing the winner of this year's ALPSP Awards for Innovation in Publishing at the ALPSP 2019 Conference.  In this series, we meet the finalists...



[Logo: preLights]
In this post, we speak to Claire Moulton, Publisher at The Company of Biologists, and Mate Palfy, Community Manager for preLights.

Tell us a bit about your organisation.

The Company of Biologists is a not-for-profit publishing organisation dedicated to supporting and inspiring the biological community. We publish five journals in the life sciences, host workshops and meetings, and provide a wide range of charitable grants. Community initiatives are important to us; for example, we support a long-standing blog (the Node) for the developmental biology community, and our latest community project, launched in February 2018, is preLights. preLights is just 18 months old but has already gained significant name recognition in the biology community.

What is the project/product that you submitted for the Awards?

We submitted preLights, which is a community platform for preprint highlights. An early-career team selects preprints of interest across the biological sciences, provides relevant comment, and engages authors in further discussion. The preprint highlights and comments are freely available at https://prelights.biologists.com/.

Tell us a little about preLights and the team behind it.

At the heart of preLights is a team of early-career researchers (called ‘preLighters’) who come from four continents and work in many different research fields. They select which preprints to feature and then highlight the key findings of the preprint. These highlights are somewhat similar to ‘news and views’ articles in that they give a background to the topic and summarize the results in the context of the literature. But preLights posts also have some unique features. For example, the preLighters give their personal opinion on the preprint and directly question preprint authors about their work. The resulting discussions are published at the end of the article – we know that authors find the discussion process useful, as some have provided feedback on resulting revisions to their preprints/published articles.

A dedicated community manager helps build the community of early-career researchers around preLights, provides them with support, and is involved in evolving and promoting this initiative.

[Graphic: preLights]

In what ways do you think it demonstrates innovation?

So far, there has been very little public commenting on preprints, even though preprints could open up discussion of non-peer-reviewed research. preLights promotes such discussions by getting young scientists to write about the work and by engaging preprint authors. Therefore, as the first platform of its kind, we believe preLights will change the way in which scientists engage with preprints.

preLights is also innovative in that it builds on a community of early-career researchers, who are often not asked directly by journals to take part in peer-review. preLights gives them an opportunity to hone their scientific writing skills, helps build their profile and credibility, and at the same time harnesses ideas from them for extending the product.

What are your plans for the future?

We have just launched a new feature on the preLights website called preLists in order to further help scientists navigate the preprint literature. These curated lists of preprints follow two main themes: preprints on a specific topic (e.g. CRISPR technology) or preprints which have been presented at a given scientific meeting. Following feedback from the community, we are now planning to make the creation of preLists open to any scientist.

We plan to expand preLights posts into new areas; for example now that medRxiv has launched we expect to have more team members covering biomedical fields.

We also plan to utilize preLights as a platform to provide educational insights into the peer-review process. The posts automatically link to the published version of the article, and we are planning to engage preprint authors to comment on the most important parts of their paper that changed during the peer-review process. We believe this can serve as a useful teaching resource for young scientists and help open up the ‘black box’ of peer-review.

Websites:
https://prelights.biologists.com/
https://prelights.biologists.com/prelists/

Twitter handles:
@preLights
@Co_Biologists

The ALPSP Annual Conference and Awards 2019 will be held at the Beaumont Estate, Old Windsor, UK from 11-13 September. #alpsp19
The ALPSP Awards for Innovation in Publishing 2019 are sponsored by MPS Ltd.

Wednesday, 7 August 2019

Spotlight on Scite - shortlisted for the 2019 ALPSP Awards for Innovation in Publishing

On 12 September we will be announcing the winners of this year's ALPSP Awards for Innovation in Publishing.  In this series of posts, we meet the finalists to learn a little more about each of them.

In this post, we hear from Josh Nicholson, co-founder and CEO of scite.ai

Tell us a little about your company

[Logo: scite]
The idea behind scite was first discussed nearly five years ago in response to a paper from Amgen reporting that, of 53 major cancer studies the company tried to validate, it could successfully reproduce only six (11%). This paper sparked widespread media coverage and concern, and since then the problem has come to be known as the “reproducibility crisis.” While this paper received the most attention, perhaps because the numbers are so dire, it was not the first or only paper to reveal the problem. Indeed, Bayer had reported similar findings in other areas of biomedical research, while non-profit reproducibility initiatives revealed the problem in psychology and other fields, suggesting a systemic issue. This is worrisome, to say the least, because scientific research informs nearly all aspects of our lives, from how you raise your children to the drugs being developed for fatal diseases. If most work is not strong enough to be independently reproduced, we are wasting billions of dollars and affecting millions of lives. scite wants to fix this problem by introducing a system that identifies and promotes reliable research.

We do this by ingesting and analyzing millions of scientific articles, extracting the citation context, and then applying our deep learning models to classify citations as supporting, contradicting, or simply mentioning. In short, scite allows anyone to see whether a scientific article has been supported or contradicted.
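
As a rough illustration of the classification step, the sketch below trains a throwaway bag-of-words classifier on a handful of hand-written citation sentences and labels a new one as supporting, contradicting or mentioning. scite’s production system uses deep learning models trained over millions of full-text articles; the tiny training set and scikit-learn pipeline here are assumptions chosen purely to show the task.

```python
# Toy citation-statement classifier: label the sentence in which a paper is
# cited as supporting, contradicting or mentioning. Illustrative only; the
# real scite models are deep networks trained on a far larger corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_sentences = [
    "Our results confirm the findings of [1].",
    "Consistent with [1], we observed the same effect.",
    "In contrast to [1], we found no significant difference.",
    "These data contradict the conclusions reported in [1].",
    "The method was first described in [1].",
    "[1] reviewed the relevant literature.",
]
train_labels = [
    "supporting", "supporting",
    "contradicting", "contradicting",
    "mentioning", "mentioning",
]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_sentences, train_labels)

# Classify a previously unseen citation sentence.
print(model.predict(["Our experiments failed to reproduce the effect reported in [1]."]))
```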

As a funny aside, my co-founder, Yuri Lazebnik, and I first proposed that someone else, like Thomson Reuters, Elsevier, or NCBI, should implement the approach now used by scite. After some waiting, we realized that if we wanted it to exist we would have to build it ourselves, and here we are, five years later, with over 300M classified citations citing over 20M articles!


Tell us a little about how it works and the team behind it

As mentioned, scite is a tool that allows anyone to see if a scientific paper has been supported or contradicted, by using a deep learning model to perform citation analysis at scale. In order to do this, we need to first extract citation statements from full-text scientific articles, which in most cases means extracting citation statements out of PDFs. To accomplish this, scite relies upon 11 different machine learning models with 20 to 30 features each. This is very challenging, as there are thousands of citation styles and PDFs come in a variety of formats and qualities. We’re fortunate to have Patrice Lopez on the team, who has been developing the tooling to accomplish this for over ten years. Once we’ve extracted the citations from the articles, we use a deep learning model to classify them as supporting, contradicting or mentioning.
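
Before any classification can happen, the citation statements themselves have to be pulled out of the article text. The sketch below shows the simplest possible version of that step for bracketed numeric citations; it is an assumption-laden stand-in for the PDF-parsing and machine-learning pipeline described above, which copes with thousands of citation styles.

```python
# Toy extraction of citation statements: find each in-text reference marker
# and return the sentence it appears in. Assumes plain text and a bracketed
# numeric citation style, unlike scite's real multi-model PDF pipeline.
import re


def extract_citation_statements(text: str) -> list[tuple[str, str]]:
    """Return (reference_marker, sentence) pairs for bracketed citations."""
    statements = []
    # Naive sentence split on terminal punctuation; fine for a demonstration.
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        for marker in re.findall(r"\[\d+\]", sentence):
            statements.append((marker, sentence.strip()))
    return statements


if __name__ == "__main__":
    sample = (
        "Aneuploidy increased segregation errors [3]. "
        "This agrees with earlier observations [1], although [2] reported no effect."
    )
    for marker, sentence in extract_citation_statements(sample):
        print(marker, "->", sentence)
```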

To show the utility of the tool, I like to show my PhD research as it is seen with and without the lens of scite. This study looked at the effects of aneuploidy on chromosome mis-segregation, that is, whether adding an extra chromosome to a cell makes it more error-prone during cell division. Our work was published in eLife and was a collaboration between our lab at Virginia Tech, labs in Portugal, and the NIH. It has been cited 40 times to date and viewed roughly 4,000 times. In general, these features are what we as a community look at when assessing a paper: who the authors are, the prestige of the journal it appears in, affiliations, and some metrics like citations and perhaps social media attention (Altmetrics). This information is used to decide if we want to read or cite a paper, if we want to promote this author, join their lab, or give them a grant. These are our proxies of quality. Yet none of them has anything to do with quality. With scite, in just a few clicks, you can see that my work has been independently supported by another lab (i.e. it has a supporting cite). To me, this is something like a superpower for researchers, because without scite one would need to read forty papers to find this information, or consult an expert, and even then they might miss it.
[Screenshot demonstrating scite]



To make scite happen requires a special team and I believe that is what we have created and continue to create at scite. I like to joke that scite is a multinational corporation with offices in Kentucky, Brooklyn, France, Germany, and Connecticut. While true, it is not an entirely accurate representation of the company, just as citation numbers are not an entirely accurate representation of a paper. In fact, scite is a small team of six scientists and developers united not by geography but by a passion to make science more reliable. 

In what ways do you think it demonstrates innovation?

The idea behind scite has been discussed since as early as the 1920s, as there exists a similar system in law called Shepardizing (lawyers need to make sure they don’t cite overturned cases, as they will quickly lose their argument this way). However, despite such discussions happening nearly a hundred years ago and multiple attempts to bring something like scite to fruition, even by juggernauts like Elsevier, it did not happen until scite came to life. scite is innovative in that it unlocks a tremendous wealth of information by successfully pushing the latest developments in technology to their limits. With that said, there is so much that we still need to do, and we’re excited about the future and about working with many stakeholders in the community.

What are your plans for the future?

In the near future, we think that anywhere there is scholarly metadata, there is an opportunity for scite to provide value. We are working with publishers to display scite badges, with citation managers to display citation tallies, and with submission systems to implement citation screens, and we are in discussions with various pharmaceutical companies to help improve the efficiency of drug development. Moreover, we will start to expand our citation analytics from articles to people, journals, and institutions.

Longer term, we envision scite as the place where people and machines go to identify reliable research and researchers. We have plans to explore micro-publications, so as to offer more rapid feedback into our system, and plans to further invest in machine learning to see if we can predict citation patterns as well as promising therapeutics in drug development, and much more that we can’t even predict right now. The scientific corpus is arguably the most important corpus in the world. It’s a shame that it is easier to text-mine Twitter than cancer research. However, it’s also an opportunity, one which we’re seizing now.

[Photo: Josh Nicholson]
Josh Nicholson is co-founder and CEO of scite.ai, a deep learning platform that evaluates the reliability of scientific claims by citation analysis. Previously, he was founder and CEO of the Winnower (acquired 2016) and CEO of Authorea (acquired 2018 by Atypon), two companies aimed at improving how scientists publish and collaborate. He holds a PhD in cell biology from Virginia Tech, where his research focused on the effects of aneuploidy on chromosome segregation in cancer.

Websites
https://scite.ai/
Chrome plugin: https://chrome.google.com/webstore/detail/scite/homifejhmckachdikhkgomachelakohh
Firefox plugin:
https://addons.mozilla.org/en-US/firefox/addon/scite/

Twitter:
@sciteai

See the ALPSP Awards for Innovation in Publishing Finalists lightning sessions at the ALPSP Conference on 11-13 September. The winners will be announced at the Dinner on 12 September.

The ALPSP Awards for Innovation in Publishing 2019 are sponsored by MPS Ltd.