Monday 24 February 2014

Transformation: the institutional perspective

Russell Burke, Royal Holloway
The first afternoon session at the Association of Subscription Agents conference provided the institutional perspective on transformation. Russell Burke, Information Consultant at the Bedford Library, Royal Holloway University of London, opened by outlining two complex landscapes: publishing and higher education.

Students need to be able to navigate the information landscape easily, and the online landscape is more complex than print ever was. Even if students have good information literacy skills, they still need to know what is new and what is changing. Libraries rely heavily on agents and publishers for this information and must consider whether to pass it on directly or to repackage it via their own support tools.

One of the strengths of library search is the at-a-glance view of what users can access. The library doesn't push it as a tool for researchers, but it can be used for a quick survey of a research topic. The library also provides information literacy training for users at all levels, and uses social media and other awareness tools to try to ensure the whole range of students knows what is available.

Where does open access fit in with this? With students, the library focuses on basic issues (do I have to use the library?), as bringing in OA too early could be problematic. To be fully information literate, users need to know the source of every reference in their research and study: they need to understand whether it comes from a preprint or a published article. The challenge is to identify what users need to know, and then to ensure they understand it at the right time.

Jill Emery, Portland State University
Jill Emery is Collections Librarian at Portland State University, a relatively young university, founded post-war in the 1940s and diverse in subject areas. It doesn't have the resources of an Ivy League institution. The commodification of HE, internationalisation and split purchasing between subscriptions and one-time purchases have all had an impact. The library needs to prove the value of its investment to those paying the bill (parents, endowments, etc.).

Today's reality is focused on purchasing, and the big deals remain. They are looking at PDA and article purchasing, and at an 80/20 split between subscriptions and one-time purchases. Staff attrition and open access are also affecting them, so they are looking at new services.

Anything that lessens student costs will be considered, including DIY approaches and replacing stacks with collaborative work areas. They are committed to local library publishing, are looking at monograph publishing, and have developed '21st century collections': highly curated and locally focused, with the aim of global impact (e.g. Dark Horse comics, Films on Demand). They are trying to support local authors and to make content from the university available globally. They also try to supply as many resources as possible in a mobile environment.

Rob Johnson, Research Consulting
Rob Johnson, Director at Research Consulting, spoke about UK open access activity. The UK is a big influencer within global research, and the government has understandably tried to take a lead with policy making in this area; yet of the c. 2.2 million research articles published globally per annum, the UK publishes only c. 140k, or 6.4% of the total. So what happens to the rest of global research output?

When reflecting on what intermediaries can do he suggested they can explore:
  • transaction management - including publisher pre-payments
  • improved author experience (but perhaps not yet?)
  • data, data, data (streamlining processes for managing compliance; promoting adoption of standard metadata formats and unique identifiers; see the sketch below).
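
Once metadata is standardised, the compliance checking Johnson describes becomes almost mechanical. A minimal sketch in Python (the policy rules and field names here are invented for illustration, not drawn from any real funder's requirements):

```python
# Illustrative compliance check: test standardised article metadata
# against a funder's open access policy. Rules and field names are
# invented for the sake of example.
FUNDER_POLICY = {
    "required_licence": "CC-BY",
    "max_embargo_months": 6,
}

def is_compliant(article, policy=FUNDER_POLICY):
    if article["licence"] != policy["required_licence"]:
        return False
    return article["embargo_months"] <= policy["max_embargo_months"]

article = {"doi": "10.5555/12345678",  # Crossref's fictitious test DOI
           "licence": "CC-BY", "embargo_months": 0}
print(is_compliant(article))  # True
```
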
Chris Banks, Imperial College
Chris Banks, Director of Library Services at Imperial College London, outlined how they have a particular interest in open access due to the profile of the institution: 14k students, of whom 6k are postgraduates, and c. 3k academics, which means a lot of high-level research. She also noted that c. 92% of their budget is spent on online resources.

The Finch Group findings were believed, at the time, to be the best way to achieve a step change. The resulting push for Gold open access has created some interesting developments. Within the university, they have set up some interesting new working relationships. The research office is interested in library services, as it has a keen interest in compliance with funders, and in some institutions it is managing the funding of APCs.

Some library services are developing submission forms which seek to minimise the complexity for academics. The finance office is interested in the accountability for spend of Wellcome/BIS/RCUK/institutional funds. There is a focus on raising awareness amongst academics and increasing understanding of the new RCUK mandates, information about journal compliance, copyright and licensing, and Green vs Gold open access.

Banks finished by considering both old and new players in the aggregation industry. Is there a point where agents or aggregators could work with research information systems, perhaps via SHERPA data or CRIS data nodes?

The Evolution of the Subscription Industry 1970-2014: Subscription Agents and Consortia – New Roles and Opportunities

Dan Tonkery: a short history of subscription agents
Dan Tonkery, President and CEO of Content Strategies, an international information services consulting company working with STM publishers, kicked off the afternoon session at the ASA conference. In a previous life he was a subscription agent and a librarian, and he has an encyclopaedic memory of the history of subscription agents.

The agent years of milk and honey 1970-1985
The market was filled with multiple agents (such as Faxon, EBSCO, Readmore, Blackwell, Majors, Swets, Harrassowitz, Turner, Boley, McGregor, SMS and others). The average selling price was low. Mainframe computers were introduced to support processing. Agents dominated, in the sense that you couldn't find a library around the world that wasn't using one.

Agents were essential to both libraries and publishers, with 99% of libraries using agents. They became experts at processing individual orders and supporting each other. They built comprehensive title databases (c. 400k titles) and developed new reporting and analysis tools for librarians. They also developed interfaces with ILS vendors, so that invoices and reports could be loaded automatically.

Years of mergers and rapid growth 1986-1996
This was the era when many of the smaller agents were acquired by larger ones. Faxon and EBSCO dominated the US market, and Swets dominated in Europe. Agents were building related services such as SC-10, ROSS, Microlinx, REMO and EBSCOnet. Gross margins continued to drop, from 12.4% to 8.1% by 1999. Agents attempted to build new business systems, but Faxon collapsed after a business-system failure in 1994, and Dawson plc bought Faxon in October of that year. In summary, subscription agents were growing, and they were important to libraries and publishers. Bear in mind that hardly any publishers had a sales team at this time.

The golden age of library consortia 1996-2006
There were over 200 active consortia, usually regionally (sometimes nationally) based. Publishers moved from print to electronic formats. A number of major consortia formed as resource-sharing agents serving their member libraries. Publishers turned to consortia as a new sales channel, and direct deals were negotiated. Publishers began selling Big Deals or custom deals, and began thinking in terms of database deals instead of individual titles. You could argue that agents were caught flat-footed. Consortia examples from the US include:

  • NERL: founded 1996; 28 core members and 80 affiliates; bought $102 million in 2013, most of it handled direct.
  • GWLA: founded 1996; 33 research libraries in the Central and Western US; bought $37 million in 2012.
  • SCELC: 111 members and 120 affiliates; bought $38 million in 2013.

Close to $500 million of business that once flowed through subscription agents now goes direct: a major problem for many companies.

Agents respond with new tools and services
In response, agents developed tools to help libraries manage publisher packages, expanded their services, and built knowledge and licence databases to support A-to-Z services. Even so, consortia captured market share from subscription agents: agents' share shrank from 98% in 1996 to less than 60% now.

Subscription Agent rebirth
Agents continue to evolve new or expanded products and services. There is an opportunity for rebirth and growth through databases and distribution services, among other areas. The most important thing to watch for is someone coming from outside our industry and inventing a new product or service. Tonkery closed with a call to constantly look outside the box.

Publisher perspectives on transformation: panel at #asaconf14

Stephen Rhind-Tutt from Alexander Street Press
Stephen Rhind-Tutt from Alexander Street Press kicked off the first panel discussion by reflecting on how he struggles with a description of what he does.

He noted how it's amazing how many mission statements are similar (Google, British Library, ASA, etc.). Nearly all of us help teachers teach, researchers research, students learn, and librarians serve their communities. Alexander Street Press offers streaming video and other digital products to stop media being a third-class citizen in the library. Their mission means they have to be many things at once: streaming media provider, microfilm digitizer, photo library, web service, etc.

Are publishers, agents, intermediaries all becoming one? It doesn't matter. We can deal with the naming later. What matters is having a clear mission that serves our customers, no matter where it takes us.

Eileen Welch, Director of NEJM Group Licensing at the NEJM Group, outlined how they are using social media - very successfully, by the looks of it - to engage with their audiences. All their social media points to open access articles.

NEJM: geographic split of social media
Facebook is their fourth-largest referrer of traffic to their site. Their top 10 countries on Facebook and Twitter are spread fairly evenly, not just dominated by the US. There were some pretty impressive social media stats for NEJM, and the insights can sometimes be surprising: 75% of NEJM users are 34 or younger. Social media gives them an opportunity to reach out to this younger group with 20 posts per week, videos, interactive short quizzes and original research.

They have also discovered the 'Ick Factor': posts with images - the more medically gruesome, the better - generate more referrals to NEJM.org and generate more comments and engagement.

The NEJM Twitter account has 169,000 followers. In January, it produced 135 tweets and ranked eighth as a traffic source for the website. YouTube is of growing importance: they post animations, interviews and articles, with animations summarising the key findings of research articles.

These shorts provide new ways to engage with their audience and allow the personality of the editor to shine through.

The Now@NEJM blog is produced by the NEJM publishing communications team and consists of two featured strands: insights and physicians in training. It alerts readers to new and innovative content and complements the content published in NEJM.

Robbetze's 'Content as patient' model
Roné Robbetze from Springer Science + Business Media considered the idea of 'content as patient', with the publisher/aggregator as the concerned parent/custodian, the institution as the physician, and usage data as the stethoscope. Usage data can help you test assumptions and insights against concrete evidence of what people are looking and searching for.

Are the right services and tools there to help institutions interrogate and interpret the growing amount of information? What can the role of agents and intermediaries be? There is a proliferation of vendors and providers from which statistics must be taken and processed, and while some services exist in some countries, is there an opportunity for agents and intermediaries? One thing is for sure: it's getting messier and messier.

Greta Boonen from John Wiley & Sons finished the presentations by focusing on how intermediaries fit into the changing landscape.

Intermediaries: all shapes and sizes
We face challenges including the pace of change, changing needs, access and discoverability, and prioritising innovation. These challenges also present opportunities: changing user needs, financial platforms, linked workflows, industry standards and data exchange. This is the space in which intermediaries can craft solutions, matching their strengths to these emerging issues. For services to evolve, all stakeholders must be part of the conversation about the future.

The e-volution of publishing: what's new, what's changing and what's staying the same

Youngsuk 'YS' Chi, Chairman, Elsevier
The keynote opening talk at the Association of Subscription Agents annual conference was from Youngsuk 'YS' Chi, Chairman of Elsevier. He reflected on what has changed, what continues to evolve and what stays the same in the publishing industry.

The role of academics in research and teaching is evolving. There are parallels with publishers who face increasing criticism about the value they provide. Many people perceive publishers as relics from the past.

With the pace, breadth and complexity of change, the community of publishers is becoming more inclusive, more flexible and yet more nebulous. Tech companies are increasingly called publishers.

Readers' habits and expectations are changing: they buy an ebook and expect it to be available in any tablet format, anywhere in the world, at the same price.

So what does this mean for publishers' new roles? They need to engage with:

  • experiential content
  • social media and social networks
  • digital tools and solutions
  • big data
  • text and data mining.

Content-based experiences
The value of raw content will continue to decline as it becomes more abundant. People will pay for 'content-based experiences' (compare a football match, where you pay more for an immersive pitch-side seat than for the 'nosebleed seats'). With content, this is happening through integrated note-taking, multimedia and annotated links, all of which turn content from static and dead into live and interactive.

Importance of social media and social networks
There are a number of social networks for our communities beyond Twitter (e.g. the growing communities of researchers on Academia.edu, ResearchGate and Mendeley), where users can connect and come together to share papers. There is a role for publishers in facilitating this sort of interaction for their audiences.

Digital tools and solutions
Content will always be king. However, publishers will increasingly develop tools, solutions and experiences around content. Many publishers are now technology providers, particularly in STM where they are developing digital tools to help researchers research (e.g. Digital Science exploring workflow efficiencies in science research).

Big data
Publishers must use the power of big data to their advantage: understanding consumer and viewer preferences, then using this insight to improve the use of content in fresh ways and to understand more about the communities they serve.

Text and data mining
More and more publishers are opening up content to text and data mining. Examples include CrossRef's Prospect service, which links full text to licence terms; PLS Clear, a digital clearing house that lets researchers request rights from multiple publishers through one easy-to-use form; and a CCC service linking to a repository of XML content. This will help ensure a role for publishers going forward.
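
Under the hood, services like these rest on licence terms and full-text links carried in article metadata. A minimal sketch of the kind of lookup a text-mining tool might make against the public Crossref REST API (the field names are those exposed by that API; the DOI is Crossref's fictitious test DOI):

```python
# Fetch licence terms and machine-readable full-text links for a DOI
# from the public Crossref REST API. Error handling omitted.
import requests

def tdm_info(doi):
    msg = requests.get(f"https://api.crossref.org/works/{doi}").json()["message"]
    licences = [lic["URL"] for lic in msg.get("license", [])]
    # Links flagged for text mining point at machine-readable full text
    tdm_links = [lnk["URL"] for lnk in msg.get("link", [])
                 if lnk.get("intended-application") == "text-mining"]
    return licences, tdm_links

print(tdm_info("10.5555/12345678"))  # Crossref's test DOI
```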

Publishers are experimenting with agile publishing, subscription models, technology and business models. Experimentation is key: we need to fail, fail often, but fail early. Innovation always moves faster than adoption. Consider where technology will be and pave the way for the future. Chi reflected on the slower-than-expected take-up of ebook readers - according to Digital Book World statistics, 20% of US adults have downloaded an ebook, and 21% in the UK - where there is still a lot of print use.

Chi quoted Michael Mabe on scholarly publisher functions in a digital environment.

The value of publishing: is it mind the gap or fill the gap? There's a balancing act between innovation and improvement.

Friday 21 February 2014

Kurt Paulus on ALPSP International Conference 2013: Part 5 - How to keep up with the parallel sessions

David Smith from The IET
This is the fifth in a series of reflections on the 2013 ALPSP International Conference by Kurt Paulus, former Operations Director at the Institute of Physics and long-time supporter of ALPSP. Our thanks go to Kurt for capturing the sessions. If this whets your appetite, save the date for the 2014 conference on 10-12 September at the Park Inn Hotel, Heathrow.

How to keep up with the parallel sessions

It is only when I come to write up the conference that I wish I was ubiquitous, able to listen to three sessions simultaneously or at least make better choices about which session to attend, as the geography of the venue made it difficult to flit from one to the other. Parallel sessions of course are useful in that they cram a great deal of diverse material into a compact time frame, and fortunately you can glean at least some of what you missed from the ALPSP website later.

The ‘Publishing practicalities’ session, for example, chaired by David Smith of The IET, looked at Creative Commons BY licences, so central to the discussion of developing open access. It explained the reasoning behind PeerJ, an open access journal based on a ‘lifetime membership’ rather than an APC model, which sees itself as a ‘technology first’ publisher, opting for outsourcing to ‘the cloud’ from the outset rather than hosting and maintaining its own technical infrastructure. Finally, the session gave space to the use of Google Analytics, described as the most popular and widely used web traffic analysis tool, for making scientific, data-driven decisions about the development of one’s website.

Alan Hyndman from Digital Science
“Source: Wikipedia, so it must be true!” Alan Hyndman

The familiar ‘Industry updates’ session, chaired by Toni Tracy, gave opportunities to learn about the Copyright Hub, designed to help overcome some of the difficulties experienced in copyright licensing. ‘Force11: The future of research communication and e-scholarship’ describes itself as a community of scholars, librarians, archivists, publishers and research funders that has arisen organically to help facilitate the change towards improved knowledge creation and sharing; perhaps we can look forward to its being lifted out of the relative obscurity of the parallel sessions into one of the plenaries.

In the same update session, Heather Staines of SIPX talked about ‘The MOOC craze: what’s in it for publishers?’ MOOCs are massive open online courses aimed at large-scale participation and open (free) access via the internet. Publishers interested in the freemium approach might be open to these opportunities. Finally, Steve Pettifer of the University of Manchester told how he “stopped worrying and learned to love the PDF”.

"While licensing content for use in [MOOC] courses challenges every existing model, there is a place for your content, whether it is OA, subscription or ownership based” Heather Ruland Staines

Fiona Murphy from Wiley
Apart from the session on accessibility, of which more in the final post to follow, I did seek out the parallel session on data chaired by Fiona Murphy of Wiley. Access to the data underlying reported research assists verifiability and reproducibility, and can help advance scholarly progress through evaluation and data mining. Questions arise, such as which data to share: raw, reduced or structural, as in crystallography (Simon Hodson).

To be fit for re-use or development, data must be discoverable, openly accessible, safe and useful (Kerstin Lehnert). There is a need for data provenance and standards for incorporation into metadata, and stewardship of data repositories. Steps towards consolidating such needs include the DRYAD repository, a nonprofit membership organization that makes the data underlying scientific publications discoverable, freely reusable and citable, and IEDA, or Integrated Earth Data Applications, a community-based data facility funded by the US National Science Foundation to support, sustain and advance the geosciences by providing data services for observational solid earth data from the ocean, earth and polar sciences.

“Hey, don’t worry; don’t be afraid, ever, because it’s just a ride” Bill Hicks

A somewhat contrary view was provided by Anthony Brookes of Leicester University, who suggested focusing not on the sharing of data but on the exploitation of knowledge. In biomedical, clinical, genetic and similar research areas there are privacy and ethical barriers to unfiltered sharing and access. That does not undermine the idea of sharing ‘data’ at various levels, and indeed the more abstracted data that can be shared under such circumstances might be richer and fuller of ‘knowledge’. He foresaw a hierarchy in which ‘safe data’ can be openly shared, ‘mildly risky’ data are accessible in an automated, ID-assisted fashion, and personal data have managed or no access. A prototype for this approach is Café Variome, which seeks to provide the framework for such access/sharing management.

The discussion following this session suggested there is room at future conferences for the wider issues to be debated: the value added by linking across datasets; knowledge engineering from datasets, which demands full metadata and provenance; publishing models that facilitate all this; and the role of scientists, editorial boards and learned societies in defining the issues of data quality, description, metadata and identifiers, all seen as matters of some urgency.

Rapporteur
Kurt Paulus, Bradford-on-Avon

Wednesday 12 February 2014

Kurt Paulus on ALPSP International Conference 2013: Part 4 - Communication - why, what and how?

Audrey McCulloch and the 'Was it something we said?' panel
This is the fourth in a series of reflections on the 2013 ALPSP International Conference by Kurt Paulus, former Operations Director at the Institute of Physics and long-time supporter of ALPSP. Our thanks go to Kurt for capturing the sessions. If this whets your appetite, save the date for the 2014 conference on 10-12 September at the Park Inn Hotel, Heathrow.


Communication: why, what and how?

Eric Merkel-Sobotta’s plea for publishers to explain themselves, to themselves and to others, is not new, but it becomes more urgent if there is a risk that the initiative might slip out of publishers’ hands. It therefore made sense to devote a whole session to the topic, chaired by Linda Dylla of the American Institute of Physics with Grace Baynes of Nature and Helen Bray of Wiley as speakers, who addressed the how more than the why. Clearly there are many things to be communicated: the rationale of scholarly publishing, the brand of a single publisher, the nature and benefits of a particular project.

Helen Bray from Wiley

“If the rate of change on the outside exceeds that on the inside, the end is near” 
Helen Bray

Communication is used to manage change, and the more rapid the change, the more effective the communication must be. The modes of getting the message across – e-mail, press releases, publisher blogs, conference presentations, social media, direct dialogue – all have their place, provided they convey clear, uncomplicated messages that sound convincing wherever they come from within the organization. That means, for example:

  • Making all employees the company’s spokespeople.
  • Thinking communication from the start.
  • Finding respected external advocates.
  • Knowing your audience and learning their language.
  • Being part of the conversation and listening.
  • Keeping the message simple and saying it again and again.

The discussion after the presentations revealed an unease about our ability as communicators: it should not be that complicated to explain yourself but we appear not to have been too successful in doing so, nor in creating a publisher-wide consensus that can form the basis of effective lobbying.

“Homework: try to explain what publishing is, to your mother or a taxi driver” Audrey McCulloch, ALPSP Chief Executive

What is the publisher now? panel
Part of the difficulty of communicating messages about publishing, internally or externally, is that every time you turn your head, publishing has changed. Some change agents, such as funders and governments, have become more proactive, and in response others, such as learned societies, have had to up their game.

The technologies we use to publish have changed almost out of recognition and continue to evolve rapidly. The parallel session ‘What is the publisher now?’ chaired by Jane Tappuni of Publishing Technology addressed some of these issues, with the key focus on the role of publishers and how they can stay relevant: should publishers become IT providers and, if not, how should they partner with technology companies to drive the publishing process most effectively?

Interactive discussions on publishing skills
The role that the publisher decides to adopt must be communicated and absorbed throughout the organization. The choice also has implications for skill development and training, discussed in a further parallel session on ‘Publishing skills: the changing landscape’ chaired by Margie Jarvis of OUP.

This interactive session looked at changes to the way we work, the core skills we need to retain, the new ones we need to foster, and the opportunities all this represents.

Further details on these sessions are available on the ALPSP YouTube channel.

Rapporteur
Kurt Paulus, Bradford-on-Avon 

Monday 10 February 2014

Kurt Paulus on ALPSP International Conference 2013: Part 3 - State of play for journals open access

Fred Dylla from the American Institute of Physics
This is the third in a series of reflections on the 2013 ALPSP International Conference by Kurt Paulus, former Operations Director at the Institute of Physics and long-time supporter of ALPSP. Our thanks go to Kurt for capturing the sessions. If this whets your appetite, make sure you save the date for this year's conference on 10-12 September 2014.

State of play for journals open access

So you thought journals open access was all sorted? Not if you attended the session on negotiating with governments, chaired by Fred Dylla of AIP. Fred has been closely involved in negotiations about open access models in the USA; Steve Hall of IOPP, as a member of the Finch working group, is similarly placed in the UK; and Eric Merkel-Sobotta of Springer filled in the picture for the European Union. The aspiration is familiar: everyone wants research results to reach the widest possible audience, and increasingly even acknowledges that this wish has to be paid for in a viable way. The contention is over the how.

In the UK in mid-2012, in quick succession, the Finch Report recommended Gold open access as the preferred long-term option, agreed by all stakeholders, with Green as the route to this destination. It also made recommendations about funding mechanisms, ways to increase access to the 96% of research published overseas, and experimentation on open access to monographs. The government accepted the report in principle, with Gold as the aim, but provided no extra money. Job done? Not so fast. Research Councils UK initially supported Gold and the payment of APCs (though, it is said, with inadequate consultation), but had to retreat, being out of step with what appeared to be happening in other countries, and was criticised by Parliament’s Business, Innovation and Skills Committee, though the latter did not escape criticism itself.

“Throwing things against the wall and hoping you’ll be able to clean up the mess later on seems a poor substitute for evidence-based reasoning” David Crotty, Scholarly Kitchen

The Higher Education Funding Council for England is consulting and appears to be veering towards Green. University policies are still evolving and there is no consistency within the Russell Group of universities, with Gold being favoured by only a very small minority. Most publishers are offering Gold as an option but a pragmatic approach seems the order of the day.

“Status of implementation in UK: Green is the new Gold” Steve Hall

Steve Hall from Institute of Physics Publishing
Let’s go to Brussels then: the Commission’s Horizon 2020 aims to optimize the impact of publicly funded scientific research on economic growth, better and more efficient science and improved transparency, with open access as the general principle, and a mix of Gold and Green: all principles but no practical implementation so far.

In Germany an initial 18-month consultation came out for Gold, but an alliance of small publishers, the Börsenverein (organizer of the Frankfurt Book Fair) and large funders scuppered the initiative. There has been some progress in other countries, but it has been difficult to approach the momentum achieved in the UK and USA.

There is some urgency at the European Union level, as there will be a new Commission within about a year, and the work may have to start all over again if no solid consensus emerges before then. Eric Merkel-Sobotta urged all his listeners and their associations – ALPSP, STM and others – to build up much more of a presence in Brussels and articulate a coherent argument for the place of publishers in the value-added chain, to defeat the still-current cliché of the greedy, rip-off publisher.

“Continue to engage constructively in the debate and increase the volume” 
Eric Merkel-Sobotta

By now the atmosphere in the great marquee was perhaps a little subdued: here we all are, ready for new business models for journal publishing, but why is it so difficult? Fred Dylla’s review of the US experience was perhaps a little more positive. There is a clear policy on the part of the Office of Science and Technology Policy for increasing access to the results of federally funded research. Funding agencies have been asked to come up with proposals for achieving this, due about now. Most agencies have not yet publicly responded, though the National Institutes of Health are ahead of the game with the offer to open up PubMed Central to other agencies.

Eric Merkel-Sobotta from Springer
About 70 publishers, together with CrossRef, have offered CHORUS, a multi-agency, multi-publisher portal and information bridge that identifies articles and provides access, enhances search capabilities and supports long-term preservation, at no cost to the funding agencies. The universities have offered SHARE, an approach scaled up from existing repositories, which offers potential for collaboration with CHORUS.

Fluid is perhaps the best word to describe the state of play in respect of public access policy, with a fairly systematic approach in the USA, some hope in the UK and head scratching in the rest of the EU. Expect another session at this conference a year from now. Meanwhile, keep up to date with posts on the Scholarly Kitchen and elsewhere.

Rapporteur
Kurt Paulus, Bradford-on-Avon

Thursday 6 February 2014

Kurt Paulus on ALPSP International Conference 2013: Part 2 - So what about books?

The Belfry, location of the ALPSP 2013 conference
This is the second in a series of reflections on the 2013 ALPSP International Conference by Kurt Paulus, former Operations Director at the Institute of Physics and long-time supporter of ALPSP. Our thanks go to Kurt for capturing the sessions. If this whets your appetite, save the date for the 2014 conference.

So what about books?

Debates during the last couple of decades have been driven largely by journals and journal-related innovations, with books seeming more like an afterthought at times. They are of course a core component of scholarly publishing, especially in the humanities and social sciences, and it seemed this year that thinking and experimenting about them has shifted more to centre stage. Not only have e-books firmly arrived but so has exploration of open access for books, about a decade after journal publishers first started worrying about it.

After Huw Alexander of Sage entertainingly showed us that the science fiction writers were way ahead of us in their thinking – of course, they don’t need to slavishly follow business models – he led us through some of the uncharted territory. The threats of piracy, Amazon and open access are there, but we are learning quickly about platforms, pricing models, advertising and mixed media, though we lack the standards for sales data that should inform our thinking. However, the ‘age of convergence’ is upon us: devices will align, formats will standardize and new approaches, e.g. selling through content hubs, will emerge.

“The future is already here, it’s just not evenly distributed” William Gibson

Despite the ‘terrorism of short-termism’ – what to do on Monday, the pressure of the bottom line – some signposts are becoming clearer. Is one sold copy preferable to ten usages? Is ownership preferable to access? Is there mileage in subscription or usage-based models? And what about the partners of the future: not just our current peers but Amazon, Google or even coffee shops as digital outlets (look around you next time you pop out for a cuppa).

What is a book, anyway? asked Hazel Newton of Palgrave Macmillan. Current terminology was coined in an age when print technologies were dominant. Digital content does not discriminate by number of pages or screens or total length, especially when memory is cheap. It is also far less limited by the time constraints that print imposes in the form of publication delays, and it allows publishers to stay ahead of the game in rapidly moving fields.

“Constantly question why things are the way they are” Hazel Newton (NB: some quotations are paraphrased though close to the original!)

Breaking the rules, Palgrave’s Pivot series positions itself squarely between the journal article and the full-scale monograph. It publishes within 12 weeks of acceptance and offers titles as digital collections for libraries, individual ebooks for personal use, or digitally produced print editions. Despite the perceived conservatism of academia, Pivot has so far published more than 100 titles; Hazel considers HEFCE now to be more flexible about the formats it will accept as evidence for the Research Excellence Framework.

This way for open access

But open access for books?

I thought you’d never ask. Well, Caren Milloy, head of projects at JISC Collections, is questioning editors, sales and marketing staff, and systems people at about 10 humanities and social science publishers about their views and concerns over open access publishing.

The OAPEN-UK project is still under way, and it is clear that a lot of internal corporate education will be necessary: all current processes will need to be reviewed, and publisher project teams need to start work now, involving all parts of the business. Don’t wait for standards to be developed but think about them now; don’t assume OA for books will follow the journal model; and develop a clear idea of what success would look like.

“Open access is here: the need is to invent and develop sustainable business models” Catherine Candea

Three speakers in a session on ‘Making open pay’, chaired by Catherine Candea of OECD, gave three different approaches designed primarily for books in the social sciences and humanities. Frances Pinter, founder of Knowledge Unlatched, had no doubt that open access business models will be prominent for books, although it will take time for them to take root. The model for Knowledge Unlatched is one of upfront funding of origination costs, complemented by income from usage, licensing, mandates, value-added services and other options yet to emerge.

Unlike the article processing charge (APC) of the journal Gold OA model, the fixed cost in Knowledge Unlatched would be covered by title fees paid by members of a consortium of libraries, thus ‘unlatching’ publication of titles by members of a publisher consortium in a Creative Commons-licensed PDF version. Publishers are then free to exploit other versions of the title, or subsidiary rights, for profit. Knowledge Unlatched provides the link between the library and publisher consortia. A pilot is about to be launched, with 17 publishers so far taking part and a target of 200 libraries to ensure that the title fee is capped at $1,800 per library.
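
The cost-sharing arithmetic behind that cap is simple; here is a back-of-envelope sketch in Python (the collection-wide pool is inferred from the figures quoted above, not stated by Pinter):

```python
# Back-of-envelope for the Knowledge Unlatched model: a fixed pool of
# title fees is split across pledging libraries, so every extra pledge
# lowers the per-library cost. Pool size inferred from the quoted cap.
TITLE_FEE_POOL = 200 * 1800  # 200 libraries at the $1,800 cap = $360,000

def per_library_fee(n_libraries):
    return TITLE_FEE_POOL / n_libraries

for n in (200, 250, 300):
    print(n, per_library_fee(n))  # 1800.0, 1440.0, 1200.0
```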

“It’s a numbers game, so look at the margins: lots of little contributions, not just one big one” Pierre Mounier

‘Freemium’ is the model for the platform OpenEdition Books outlined by its associate director, Pierre Mounier. The platform is run by the Centre for Open Electronic Publishing in Paris and financed by the French national research agency in partnership with, so far, some 30 international publishers. Books are published open access in HTML, while value-added premium services are charged for. These may include other versions such as PDF or ePub, data supplies, dashboards and so on, licensed to libraries. The mix between free and premium may change as library needs change; the most important thing is to keep in touch with the libraries to understand their changing requirements. So far 800 books are included, and there is an ambitious annual launch programme. Currently over 60 libraries subscribe. One-third of revenues goes to the platform and two-thirds to the publisher.

It's a book, but not as you know it.
Also Gold OA in concept, but in a different context, is the publishing of the Nordic Council of Ministers described by Niels Stern. The Council’s publishing model is already OA in the sense that the Council commissions research and is then invoiced for publishing services. That income, however, is not secured for eternity and may be subject to political constraints, so Niels and his colleagues went through a classical business analysis.

They concluded that digital distribution provides the most opportunities for change. It also has the potential to offer the most value to its customers - politicians, researchers and government officials - ensuring the impact of public money, visibility through flexible access, and accountability for money spent. Open access was the key to unlocking these benefits, ensuring future loyalty from the customer base and hence future revenue streams.

The conclusions from their approach will be familiar from different contexts:
  • Keep an open mind: stop copying previous behaviours. 
  • Revisit your arenas constantly. 
  • Zoom in on your target audiences and find new needs by listening. 
  • But don’t cogitate forever; take the courage to act!
The model is true Gold. It pays because it is building an organizational asset, with your customers solidly behind you.

Rapporteur
Kurt Paulus, Bradford-on-Avon

Tuesday 4 February 2014

Access to Research pilot launched

Minister for Universities & Science,
David Willetts, addresses the audience
Last night saw the launch of the Access to Research pilot at The Library at Deptford Lounge in Lewisham, South London. The pilot, a two-year project to provide free access to research via computers in UK public libraries, was launched by the Publishers Licensing Society with guest speaker the Rt Hon David Willetts, Minister of State for Universities and Science.

The two-year pilot makes over 1.5 million articles from 8,400 scholarly and academic journals available in 79 local authority libraries. The initiative is supported by the trade bodies the Publishers Association and the Association of Learned and Professional Society Publishers, as well as the Society of Chief Librarians and technical partner ProQuest.

Janene Cox, President of the Society
of Chief Librarians
The project will allow users to search and read scholarly research articles while in the library. It is anticipated that it will be of particular relevance to small businesses, students and special-interest researchers who don't have access to an institutional library.

Libraries and publishers are being encouraged to sign up, to boost the number of articles included and to increase the number of locations where the content can be accessed.

'The government believes in open access, but understands there is a cost to publication.' David Willetts


David Willetts was joined by Sarah Faulder, Chief Executive of PLS; Janene Cox, President of the Society of Chief Librarians; Richard Mollet, Chief Executive of the Publishers Association; and Phill Hall, project contact at technical partner ProQuest.

Sarah Faulder, PLS Chief Executive
'This is an important initiative and working across organisations in a partnership effort has involved compromise and risks to make this pilot launch.' Janene Cox

ALPSP is delighted to support the project, both by promoting participation to our members and by providing access to our journal Learned Publishing. Further information about the initiative is available on the Access to Research microsite.

News coverage to date includes articles from the BBC, The Bookseller and PR Newswire.

Monday 3 February 2014

Managing the open access data deluge without going grey


Cameron Neylon: the OA data deluge
The final two sessions at ALPSP's Data, the universe and everything seminar reflected on the changing nature of data within an open access context and what needs to be taken into account when trying to cope with data.

Cameron Neylon, Advocacy Director at PLOS, counselled delegates on 'Managing a (Different) Data Deluge'. Publishing is now a different business: customers may look the same, but they act differently, and you have to think differently. Data is core to the value you provide.

There's no sign of the growth trajectory of open access publishers slowing down. PLOS One on its own carries 11% of the funded research paper output of the Wellcome Trust, and 5% of the biomedical literature; it publishes on average 100 papers per day. All the metadata PLOS has comes from the authors, and they don't necessarily have accurate data on who those authors are or where they are based, so it gets complicated. This is happening at large scale across scholarly communication services.

Neylon believes that the business of open access publishing is fundamentally different to subscription publishing. With a traditional subscription business you have a pool of researchers and institutions. Advertising and reprints come from third parties. This is a distribution model and not so much about where the research has come from.

With an APC-funded open access business it is a service, or push, model. The customer is the author at some level, though increasingly (in the UK, for example) the money comes through the funder. This means that suddenly all these players have an interest which they didn't have before. A third model is funders directly funding infrastructure (e.g. eLife, PDB, GenBank, etc.).

The customer is now the institution, the author and the funder. They ask: how much? How many articles have you published? What is the quality of service? Are there compliance guarantees (relatively simple in the UK, but tricky in North America or the EU)? They want repository deposit. And all this has to happen at scale: you need to track who funded the research. This means the market is being commoditised. It also means the market is smaller, and the space in which to make a profit is smaller.

Neylon feels that if we do not do this collectively, the whole system will collapse and we'll be left with one or two big players. Using identifiers, capturing data up front and making it easy for the author to include the correct data up front are key to tackling the data deluge we face. If we don't, we will have lost the opportunity. It's about shared identifiers, and about making them the core of your systems.
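
As a concrete illustration of 'capturing data up front', a submission record might carry validated identifiers from the start (a minimal sketch: the record layout is invented, the ORCID shown is ORCID's own example iD, and the funder DOI is the Open Funder Registry entry for the US NSF):

```python
# Minimal sketch of capturing shared identifiers at submission time, so
# that author, institution and funder can be resolved unambiguously
# downstream. The field layout is illustrative only.
from dataclasses import dataclass

@dataclass
class Submission:
    title: str
    author_orcid: str    # ORCID iD, validated at entry
    institution_id: str  # an organisational identifier, e.g. an ISNI
    funder_doi: str      # Open Funder Registry (FundRef) DOI, 10.13039 prefix
    grant_number: str

sub = Submission(
    title="An example article",
    author_orcid="0000-0002-1825-0097",    # ORCID's published example iD
    institution_id="0000 0000 0000 0000",  # placeholder ISNI
    funder_doi="10.13039/100000001",       # FundRef ID for the US NSF
    grant_number="ABC-0000000",            # placeholder
)
```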

He reflected on the particular challenge smaller publishers face if they are to survive: they need to share infrastructure across multiple organisations. ALPSP is well placed to support them, and to advise suppliers that smaller publishers need ORCID, FundRef and the like built in up front.

Ann Lawson, Senior Director of Publisher Relations and EBSCOAdvantage Europe, focused on the various challenges of managing open access data without going grey. EBSCO sees the impact of data from its own perspective (with 27 million articles in the EBSCO database products) and also from the perspectives of its client publishers and institutions. It has its own ID systems, but also records any partner or publisher IDs, which results in 485 data elements per subscription record.

Ann Lawson: trying not to go grey
In a recent research report drawn from their own data, they noted that large publishers are getting larger: in 1994 the top 10 publishers accounted for 19% of the market by value; in 2009 they represented 50%; and by 2013 they accounted for 68%.

In the immediate future, EBSCO sees a mixed market of Gold, Green and subscriptions within scholarly communications. However, there will be an impact on transactions, from individual journals, to big deals, to small Gold open access APCs. The impact on subscription agents is challenging, as they have to keep doing what they already do while also playing in the open access arena. There is a challenge of scale and transparency for everyone.

What will these market trends mean for data? There is a new cycle for open access which impacts on the need for data. This includes measures of value for money, speed to publication, reach and impact, reporting, funding sources, and the approval process.

There are data issues for the institution: who are active authors? What funding sources are available? Which funders demand what compliance? Which journals are compliant? What happens at school/research group? How much does the APC cost? Who paid what, with what effect? What reporting is needed for whom? Compliance – and deposit in repositories.

The institution workflow is at the heart of the data flow, touching all of the following (a sketch of how these items might combine into a single record follows the list):

  • Policies
  • Advocacy 
  • OA request form 
  • Acceptance email 
  • Funding pot 
  • Copy of invoice 
  • Article and DOI 
  • CC licence 
  • VAT 
  • Approvals 
  • Records 
  • Reporting and analysis.
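
A minimal sketch of how those items might hang together in one per-article record (the stage names and fields are invented for illustration, not drawn from any real system):

```python
# Illustrative per-article record for an institutional OA workflow,
# covering the stages listed above. Stage and field names are invented.
APC_STAGES = ["request", "approval", "payment", "licence", "deposit", "report"]

record = {
    "article_doi": None,       # assigned on acceptance
    "funder": "RCUK",          # funding source named on the OA request form
    "apc_gbp": 1600.00,        # illustrative APC
    "vat_gbp": 320.00,         # VAT at 20%, where applicable
    "cc_licence": "CC-BY",
    "completed_stages": ["request", "approval"],
}

def next_stage(rec):
    """Return the first workflow stage not yet completed."""
    for stage in APC_STAGES:
        if stage not in rec["completed_stages"]:
            return stage
    return None

print(next_stage(record))  # -> "payment"
```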

The reality is that many publishers lack the ability to adapt their systems. Current points of tension include money management, complex workflows and author involvement. Discovery is key, but can be tricky with hybrid journals, so discovery at the article level is essential. NISO is helping, but there is more work to be done in this and many other areas of data.