
Thursday, 19 July 2018

Business Models for Open Access: How can I run a successful Open Access journal?


JAMS logo

In this week's guest blog Martyn Rittman, Ph.D., Publishing Services Manager at MDPI, offers some words of wisdom for developing successful open access journals.


The Directory of Open Access Journals contains over 11,500 journals and more than 3.1 million open access articles. Our indexing database Scilit contains around 20 million freely available articles, mostly open access. Estimates put the amount of open access in the region of 15% to 20% of all published articles. Do these numbers represent a threat to traditional revenue channels, or is it possible to run a healthy business using this model?

MDPI started publishing free online articles in the late 1990s. At first, we were supported by other projects, conferences, grants, and a great deal of voluntary time. In the mid-2000s, along with other publishers, we adopted author-side charges for publication, commonly known as article processing charges (APCs). By separating the journal editors making final acceptance decisions from the publisher, we have been able to maintain a rigorous and objective peer review process alongside gold open access. However, we have spoken to other publishers who have found it difficult to adopt the open access model, don’t feel they have the expertise, or find it difficult to cover their costs. Here, we offer some advice for developing successful open access journals.


Sources of revenue

Assure yourself that revenue streams for open access are available, even in fields with high scepticism towards author-side charges.  An increasing number of national funding agencies and governments have open access mandates, and offer support for the payment of APCs. National agreements with publishers are also emerging. Many non-governmental funding agencies and university libraries have also embraced open access and provide assistance to authors. These include the Wellcome Trust, the European Union, the Bill and Melinda Gates Foundation, and many more. A useful resource to see the amounts paid by universities for open access publication is the OpenAPC platform (https://treemaps.intact-project.org/apcdata/openapc/). Other models include Knowledge Unlatched for humanities and SCOAP3 for high energy physics, where publishers receive a per-article payment out of a central fund collected from funders and libraries. Smaller journals may be able to find a single funding agency, university or society to cover all the costs of the journal, especially in niche fields. In fact, there are increasing opportunities that do not involve directly invoicing authors.

Providing a useful service

Do not assume that open access is enough. Look carefully at the scope of your journal to see whether it offers something unique in the field. This is especially critical for new journals. For many authors, the decision on where to publish is not primarily linked to open access: the scope, editorial board, and reputation of the journal are usually more important. Open access journals should be focused on providing a good service to authors and you can distinguish your journal simply by providing a better alternative to existing journals.

Workflows

Consider new workflows for your journal. There may be an initial cost to making changes in how you run the journal but it will pay off in the long-term. Authors publishing in open access are often looking for a quick decision and publication. This might mean revisiting how editorial decisions are made, and changing expectations among editors, editorial board members and reviewers about how quickly they provide feedback. On the marketing side, you will need to consider how to better reach your target authors, redirecting efforts from potential subscription customers. If you opt for an APC model, handling a larger volume of small payments may require a new approach to invoicing.

There is no magic formula for running an open access journal and much of the work is the same as for traditional journals. Open access journals now exist in all fields using all kinds of editorial and business models. At MDPI, we continue to see growth in the open access market across many research fields. We are convinced of the benefits of universal access through a large, broad readership, allowing ideas to shape those outside of the academy as well as authors from institutions with small subscription budgets. Open access supports the dissemination and sustainability of knowledge and we encourage all publishers to take advantage.

Head and shoulders photo of Martyn Rittman
Martyn Rittman, Ph.D. is Publishing Services Manager at MDPI, combining a passion for open access publishing with an interest in new models for publishing and open science. He joined MDPI in 2013 following a research career covering physical chemistry, materials science, instrumentation, and mathematical modelling.


MDPI is headquartered in Basel, Switzerland, with branch offices in China, Spain, and Serbia. It runs over 200 fully open access journals, including some in collaboration with scholarly societies, and in 2017 published over 35,000 peer-reviewed articles. MDPI also provides publisher services through its JAMS software (jams.pub) and offers academic communication tools, including a conference management platform, at sciforum.net.

JAMS website: http://jams.pub
MDPI website: http://www.mdpi.com/publishing_services
Twitter: https://twitter.com/MDPIOpenAccess
Facebook: https://www.facebook.com/MDPIOpenAccessPublishing/


MDPI are proud silver sponsors of the ALPSP Annual Conference and Awards 2018





Wednesday, 30 May 2018

Examining Trust and Truth in Scholarly Publishing

In this latest blog, Helen Duriez, from our Professional Development Committee, reflects on how our current webinar series, Trust, Truth and Scholarly Publishing, came together.


Oh, how the world turns. I used to think that Donald Trump running for US president was a fine joke. I used to think there was no way the UK would choose to go it alone when it could be a part of the collective economic might of the European Union. Turns out, the voting public in the US and UK had very different ideas to those of this naïve millennial back in 2016.  

Two years on, it’s become apparent that a large part of the success of these two major political campaigns was their ability to leverage personal belief systems. People are more likely to believe what they read if it aligns with their pre-existing belief system or if it taps into a feeling of existential threat, causing them to disregard evidence to the contrary. Ironically enough, there’s research that backs up this theory, and the concept even has a name – post-truth. You might have heard of it.

Now, what people choose to believe (or not) is tightly interwoven with what we choose to tell them, and how. In scholarly publishing, most of our jobs involve disseminating complex information in one form or another. With research output higher than ever before, there’s a lot of complicated stuff to explain – not just to academics and practitioners, but to the general public as well. Scientists are used to working with ambiguities, although that doesn’t mean they always navigate the rocky terrain of uncertainty safely. And what about lay audiences, who give as much weight to opinions as to facts?

The team at ALPSP felt this topic warranted further exploration, and so a small group of staff and volunteers (I’m one of the latter) have taken it upon ourselves to put together a series of webinars looking at some of the issues and opportunities in scholarly publishing today. Here’s how the series pans out…

Publishing without perishing

In case you missed the first webinar in the Trust, Truth & Scholarly Publishing series, go and sign up and download it. Seriously, do it. Yes, as one of the organisers I may be a little biased, but even knowing what I was about to listen to didn't stop me from feeling motivated and a little awe-inspired as Richard Horton gave us a passionate, powerful reminder of what early journal publishers set out to achieve, and the obligations we still have to society today. Jason Hoyt follows up with some practical thoughts about how publishers can succeed in a post-truth world.

The reproducibility opportunity

In last week's webinar, also available for download and highly recommended, Catriona Fennell, Rachel Tsui and Chris Chambers explored how the concept of reproducible science represents an opportunity, rather than a threat, when it comes to getting to the truth, the whole truth, and nothing but the truth. The traditional journal publishing model doesn't have much time for replication studies (not original research, don't ya know) or registered reports (findings, please!), but things are starting to change…

Public engagement with scholarly research

The process of communicating a new piece of scientific research to the world can sometimes feel a little like a game of Chinese whispers. When the description of a complex concept or process is shortened and reworded in order to reach a new audience, its meaning can change subtly. I've seen more than one Twitter spat debating the latest “scientists have found…” health fact, and there are those who have built careers around addressing some of these misrepresentations.

So, what tools can those of us in scholarly communication use to instil trust in our content? In our last webinar we are joined by three industry communicators: Tom Griffin, John Eggleton and Eva Emerson. You can register here for this final webinar.

For more practical information on the series, including how to get a members’ discount, see here.

Helen Duriez is a Product Manager at Wiley, specialising in digital strategy and planning. With over 12 years’ experience in the publishing industry, Helen has previously worked at the Royal Society, Macmillan and OUP. She gets out of bed for open science and avocado toast.



Wednesday, 3 May 2017

Data challenges for publishers – teams, tools and changes in the law



We are delighted to be able to share this blog from Warren Clark at Research Information, who attended our popular recent seminar How to Build a Data-Driven Publishing Organization, chaired by Freddie Quek.

Dealing with data is nothing new to scholarly publishers – but it was clear from a recent ALPSP event that it’s an ever-changing battlefield, reports Warren Clark

How to Build a Data-Driven Publishing Organization, held on 20 April at the Institute for Strategic Studies in London and hosted by ALPSP, proved there is still much for many to learn about how to approach the masses of data points generated by companies throughout the publishing cycle.

As John Morton, board chair of Zapaygo, said in his keynote: ‘Most publishers are using less than five per cent of the data they own.’

The event featured many examples of areas in which data could be collected, analysed and presented in a form that would improve profitability for publishers, and provide users with a more personalised experience.

Ove Kähler, director, program management and global distribution at Brill, together with his colleague Lauren Danahy, team leader, applications and data, explored the challenges they faced in developing an in-house data team. Their most significant innovation was to arrange their primary data groups according to where they occurred in the workflow: content validation; product creation; content and data enrichment; content and data distribution; product promotion; and product sales.

The pair explained how they created a team – from existing staff within the company – giving each specific responsibility for one of those data groups, and how that led to improved quality and output of data at each step.

Indeed, the notion that publishers shouldn’t assume that dealing with data means employing new staff was echoed throughout the day, with both David Smith, head of product solutions at IET, and Elisabeth Ling, SVP of analytics at Elsevier, suggesting in the panel discussion that people ‘look at your own team first’, since it was likely that the skills required would already be present.

 

Choosing tools


As well as who and why, many speakers talked about how they capture, store, analyse and visualise the data they collect. The most detailed account came from IET’s David Smith, who overhauled the IT department’s software tools to develop a more accurate suite of visualisations that product teams could use independently, without the need for continuous IT support. Smith explained that those looking for a ‘single solution’ from a software package that solved all data challenges for publishers would be disappointed, before reeling off half a dozen or more software tools that his team had integrated to develop a solution that suited their needs.

In a session that brought a perspective from outside the publishing industry, Matt Hutchison, director of business intelligence and analytics at Collinson Group, a company that runs global loyalty programmes on behalf of major brands, supported this notion by showing how they had outsourced some of their function to Amazon Web Services (AWS). Matt Pitchford, solutions architect at AWS, demonstrated that the cloud computing set-up they developed for Collinson Group involved more than 20 different pieces of software.

 

What data can bring


Another theme was quality of data – as Graeme Doswell, head of global circulation at Sage Publishing, put it: ‘You need your data capture processes to be as granular as you want your output to be.’ He showed examples of how Sage was using its data to show librarians their levels of usage, making it easier for the sales teams when it came to renewals. David Leeming, publishing consultant at 67 Bricks, gave a further example, specifically in the area of content enrichment.

For Iain Craig, director of strategic market analysis at Wiley, data was used to inform business decisions on new journal launches. He explained a major project that involved collecting internal and external data points such as subject matter, number of submissions, journal usage, funding patterns, and many more. The outcomes have helped improve existing journals, and suggest where future resources should be deployed for emerging markets.

Similarly, Blair Granville, insights analyst at Portland Press, demonstrated how his team tracked submissions, subscriptions, open access, citations, usage, commissions and click-through rates in order to feed intelligence back to the editorial teams about where their focus should be.

 

Data and the law


The most enlightening paper of the day came from Sarah Day, data marketing professional and associate consultant at DQM-GRC, who spoke about data regulation and governance. She warned against complacency and ignorance when it comes to data, particularly with regard to the upcoming General Data Protection Regulation (GDPR). Already law, but due to become enforceable in May 2018 (allowing time for institutions to ensure compliance), this is an EU-wide revision of privacy laws designed to give individuals more control over their personal data.

‘In spite of Brexit, the UK – and indeed any country outside the EU that offers goods and services to people in the EU – will have to comply,’ said Day. The impact of the new regulations is far-reaching as far as publishers are concerned, and among the most important things they can do is ‘be transparent about what you are doing with an individual’s data’.

Although Day successfully rose to the challenge of explaining GDPR in one minute, it served to demonstrate that managing data in a safe, secure, and legal manner is a complex issue that every publisher will have to address head on.

With more than 50 attendees at the event, drawn from publishers large and small, it’s clear that understanding data – and all the issues that come with it – is an issue that will only become more important in the years to come, as the amount of data generated grows exponentially.

For more blogs and publishing news from Warren Clark and the excellent team at Research Information please visit: https://www.researchinformation.info/analysis-opinion

Tuesday, 23 February 2016

Why is the business technology side of eJournals so unnecessarily complex? Tracy Gardner reflects...

Photograph of Tracy Gardner
eJournal technology is an essential part of the scholarly publishing industry. It is also the topic of one of our most popular training courses. Here, we spoke to Understanding eJournal Technology co-tutor, Tracy Gardner, about the challenges of keeping up-to-date in this area.

"One of the biggest challenges publishers face is making sure their content can be easily found in the various discovery resources readers use to find journal articles, and then to ensure the steps between the reader finding the content and reading it are seamless and without barrier. There are so many potential pitfalls along the way, and this issue therefore concerns people working in production, IT, editorial, sales, marketing and customer service.

The pace of change is fast: technology is evolving all of the time, and much of the drive has come from libraries. Libraries are keen to ensure their patrons find and access content they have selected and purchased, and they feel that keeping users in a library-intermediated environment improves the overall research experience. Ultimately, the library would like the user to start at the library website, find content they can read, and not be challenged along the way.

Simon Inger and I have been running the Understanding eJournal Technology course two or three times a year for ten years now and we have never run the same course twice - it constantly needs to be updated.

Those working in customer-facing roles such as sales, marketing and customer service may not fully appreciate how much library technology impacts the way researchers find and access their content. Many people are surprised to learn that poor usage within an institution is often because something has gone wrong with how the content is indexed within the library discovery layer, how it is set up in the library link resolver, or how authentication is handled.

For those in operational or technology roles, the business technology side of eJournals can seem unnecessarily complex and, especially for those new to the industry, the way the information community works can seem counter to the way many other business sectors operate. What makes sense in classic B2B or B2C environments will not make sense within the academic research community.

Our course helps people who work in publishing houses understand how eJournal technology works and how they can most effectively work with libraries to maximise discovery and use of their content. Many people who have attended have not been aware of the impact some of their decisions have had, and the course has helped them understand why they need to work in certain ways."

Tracy Gardner will tutor on Understanding eJournal Technology in March and October 2016. Book your place now.

Thursday, 22 January 2015

What is the Scholarly Book of the Future? Julia Mortimer from Policy Press reflects

We caught up with Julia Mortimer, member of ALPSP's Professional Development Committee and co-organiser of The Scholarly Book of the Future seminar next month, to ask her what she thinks the scholarly book of the future will look like.

1. What have you seen happening to the scholarly book in the last few years?


The most significant change for the scholarly book has been the move to digital, with completely different and varying purchase models available leading to wide disruption in the marketplace.

Probably the most significant shift within this is the move from guaranteed sales of single copies via approval plans to patron- or demand-driven acquisition (PDA/DDA), where books are only bought when users trigger a purchase, or are loaned for short periods of time at minimal cost. Usage has moved to centre stage, as it has been for journals for some time.

All this is happening in response to shrinking and stagnating library budgets, and some of the purchase models introduced at the outset into library aggregators' licensing agreements have been used in ways which weren't originally envisaged.

To date the content of digital monographs hasn't changed significantly: although some are getting shorter, they are essentially text-based versions of the print book. However, they are being presented and sold in different ways via platforms which bring together a publisher's scholarly books in full or subject-based collections and can be sold via a range of models (more similar to journal sales). Collaborations between publishers are also emerging in response to library feedback, e.g. OUP's University Press Scholarship Online, which Policy Press is involved in and which provides a one-stop shop for university press scholarly publications.

The rise of blogs and social media has had a considerable effect on academic research, as has the Impact agenda introduced in the Research Excellence Framework in the UK. At Policy Press we have introduced three strands of new fast-track short-form publications: research-based books providing the latest cutting-edge research findings; social commentary pieces and insights on topical issues; and policy and practice guides enabling research to have an influence quickly. Academics have welcomed this flexibility, which meets their changing needs, and other publishers are also starting to offer a range of short formats.

Finally, open access is having a bearing but not to the extent that it has for journals so far. There are experiments and pilots taking place and publishers offer gold open access monograph options but funding is often an issue and other options need to be explored.

2. What has this meant for the financial side of publishing? 


The changes in digital monograph sales models have had a particularly detrimental effect over the past year on many academic publishers. Much lower revenue is generated as a result of the move in the US towards PDA and short term loans (STLs). In the past a certain number of monograph sales were pretty much guaranteed for high quality research outputs which ensured they were viable. Not any more. As Michael Zeoli of YBP Library Services said at a recent IPG seminar, now it is publishers taking all the risks without being able to get a return on their investment. (See The Bookseller article).

This is the case for not-for-profit university presses and commercial publishers alike: as print sales continue to decline, they are not being replaced by digital sales.

Gold open access payments for monographs are still very few and far between so it is early days in terms of a potential transition there.

3. Why is it important now to reflect on this, should we just let nature take its course? 


This disruption is having a fundamental impact on whether the scholarly book in its current form can continue. It is affecting publishers to such an extent that if we don't reflect now and take action, some publishers may go out of business and the preservation of scholarly work will be under threat.

Publishers need to be able to make their case to academics and librarians about the value of what they do and the economics behind it. We also need to keep on top of the technology as it changes so that we can continue to innovate to meet the research community’s needs.

4. So, the book is dead, long live the book, right? Tell me what’s in your crystal ball… 


I think the monograph in long and short form will survive, certainly in the humanities and in the social sciences where a longer treatment of certain research findings is absolutely necessary. There is a strong case for embracing the evolution of scholarly publishing but it is not a case of one size fits all any more.

There will be a lot more convergence of formats, with chapters having their own DOIs and being increasingly included in joint databases and platforms alongside journal content. Similarly, when searching for content, researchers want to find everything relevant in one place, so discovery tools need to better integrate book and journal content until the content itself is fully integrated. Greater use of XML and semantic web technology will allow researchers to use material in different ways, e.g. integrated data and enhanced books with much more embedded multimedia.

Policy Press short format monograph
There will be a greater focus on what researchers need and studies are already under way to establish that. Innovative forms of dissemination will continue to develop: more short-form work, more use of social media and blogs, online communities and vehicles to allow interaction, engagement and collaboration with the work.

Sales-wise, subscription and rental models will grow, but there is likely to be a demarcation between premium new content and backlist archive in terms of what is offered via these models. For other types of content, archives may be the premium product. Repurposing content and custom publishing options will continue to develop as digital formats and systems make this easier.

I expect to see more collaboration between university presses and their libraries as well - especially around OA models in the future - and other sources of revenue being sought to fund OA such as sponsorship or advertising.

So books are here to stay in my humble opinion, but they may look rather different!

Julia Mortimer is Assistant Director at Policy Press at the University of Bristol, a not-for-profit social science publisher committed to making a difference.

The Scholarly Book of the Future seminar, co-organised by Anthony Watkinson and Julia Mortimer, will be held in London on Thursday 12 February. Follow #alpspbook for coverage.

Friday, 12 April 2013

Countdown to the London Book Fair: Academic Publishers: still open for business?

Monday 15 April, 11:30 – 12:30, Cromwell Room, Earls Court 1

Join ALPSP Chief Executive, Audrey McCulloch, as she chairs what should be a lively debate on open access with David Tempest from Elsevier, Mandy Hill from Oxford University Press and Richard Fisher from Cambridge University Press.

The panel will discuss reactions to the Finch Report on Open Access. Each publisher will share their views on the Report and how they plan to address the Finch recommendations in the short to medium term.

This is sure to be a popular event so make sure you arrive early to guarantee a seat.

Further information is available on the London Book Fair website.