Thursday, 28 February 2013

Steve Pettifer's break-up letter: Dear Publisher...

Pettifer: with a break-up letter to the journal
Steve Pettifer, from the School of Computer Science at the University of Manchester, offered a break-up letter to the journal for the audience of the Ignore Your Authors at Your Peril seminar today. Pettifer believes there's a simple equation for his career:

Ability to do science = funding × cleverness

Likelihood of funding = (previous ideas × prestige) + new idea

Is it about ego? Possibly. This is where scholarly communications come in. As an academic and researcher, he has to convince funders that he is a good person to take forward an idea that will help change the world. When he started as a researcher, it was the end of the old style of publishing, a time when it still felt weird for things to go online.

The challenge now is that researchers operate in a landscape of measures of quality and importance, and managing the transition from one measure to another without losing research status is tough. Every now and again, something comes along that changes the environment and enables researchers to reach their goal. Open access might just be that disruptive force (for publishers) that works (for researchers).

So what are the things that publishers tell him that they do for him?
  • peer review
  • copyediting
  • typesetting
  • visibility
  • legal support.
Yet for each of these valuable services, there are poor examples that undermine the claim. A recent study of peer review found that referees agreed on the science only about a third of the time: roughly one third signal to two thirds noise. That is not in itself a bad thing, but it is not compelling evidence of the value and efficacy of peer review for the scientific record. (Source: Welch, Ivo, 'Referee Recommendations', Social Science Research, 12 October 2012.)

Pettifer quoted Kent Anderson, who in a recent post on Scholarly Kitchen asked how science can maintain integrity if papers are buried in 'mega journals' and don't have full peer review. In reality, Pettifer says, he has never cited a paper because of its peer-reviewed status. He then mentioned a recent Learned Publishing article by Alan Singleton, in which Singleton paraphrased typical referee feedback as the essence of peer review's definition. It was something along the lines of: 'the methods are OK, I don't agree with the result, so it should be published for others to repeat or test the experiment.'

Pettifer went on to discuss copyediting: it's not always great, and there are numerous examples of poor grammar in headings from the great and good of publishing. Fundamental errors are also made in typesetting, for example, confusing the German ß with the Greek β. These things stand out and undermine claims of a valuable, quality service to authors.

He asked publishers to be clear about the relationship between the two parties, and to understand that sometimes it is almost impossible for authors to comply with even the most basic requirements (with ebooks or epapers, it is impossible to comply with making only one copy: as soon as you download, your devices will sync).

He asked publishers to embrace new things, and called for them to move on from the journal impact factor. Many think it is rubbish. And there are lots of other exciting things coming along that are great measures for a researcher, including Impact Story, DataONE, figshare, Altmetric, ORCID, DOIs and Dryad, which aim to make research outputs more visible and useful.

He also stated that the object of record is for machines. While he is nerdy, and it matters a lot to him, it is not for humans. Make it available for machines to reference and search.
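
As a concrete illustration of what 'available to machines' could mean, here is a minimal sketch using DOI content negotiation to ask for structured metadata rather than the human-facing landing page. The DOI is a placeholder, and this is a sketch of the general technique, not anything Pettifer showed:

    # A sketch of machine-readable access to an article of record via DOI
    # content negotiation. The DOI is a placeholder, for illustration only.
    import requests

    doi = "10.1000/xyz123"  # placeholder DOI

    # Ask doi.org for citation metadata (CSL JSON) instead of a web page.
    response = requests.get(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/vnd.citationstyles.csl+json"},
        timeout=10,
    )
    response.raise_for_status()

    metadata = response.json()  # structured citation data, not HTML
    print(metadata.get("title"), metadata.get("issued"))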

Pettifer wants publishers to recognise that he is an individual and an author, reader, editor and reviewer. Help him communicate his work, and help him make his work better. He believes BioMed Central has not moved on much; PLOS is still quite traditional; eLife is too, but is aiming for higher quality. PeerJ is exciting, and arXiv - while just a way to put pre-publication papers online - works for physicists. PeerJ and eLife have set the technology bar high (readability and downloadable objects) with a combination of good technology and presentation.

What do all publishers need to do? Make publishing a service, not an ordeal.

Authors: The Publisher's Perspective by Timo Hannay


Hannay: authors have more economic power now
Timo Hannay, Managing Director at Digital Science, believes the relationship with authors is a complicated issue with numerous opinions. His view? Economic power is shifting to the author, having historically been with the reader, and publishers should be in the business of getting the right information into the hands of researchers and authors.

What is a publisher? Hannay doesn't feel like one. He sees himself as a scientist who is passionate about technology, who runs a software company within a publishing business. It's interesting to note his company has a portfolio of technology companies that are run like start-ups. This is unusual for publishing.

One of the major purposes of journals is to help researchers learn about discoveries. So the nature of the (traditional) relationship has been with readers and institutions:
  • a reader service
  • highly filtered content, but an inefficient process (high rejection rates and heavy editorial input; peer review adds a lot of value, but takes a lot of time and resource for all involved)
  • the reader has little economic power - beyond the subscription, there are very few ways of getting hold of the content other than asking the author for a copy of the paper.
Open access introduces the concept of the author as a customer paying to publish discoveries. PLOS, BioMed Central and Scientific Reports are examples of organisations that have pioneered this approach. They offer much more of an author service and recognise the author's need to publish, rather than the need to know what is happening in the field. They use lighter peer review (which is still inefficient) and the author has more economic power. For Hannay it is about supply and demand: an increase in the demand to publish, and a decrease in the demand to read.

However, the author experience sucks. Publishers provide clunky interfaces and opaque processes, are slow, slow, slow, and create a Sisyphean experience for their authors. The publishing industry and the people in it are not without innovation, but there is not enough of it and it tends to come from large, established players.

We are starting to see new entrants, start-ups from outside the industry, that display a different kind of relationship with authors.

There are two perspectives to take into account: publishing discoveries (author relationship) and learning about discoveries (reader relationship).

For authors, it's about career path and the development of reputation. Journal publishing isn't everything here; metrics also help. And getting information isn't everything: it's also about showing your institution and peers what you are doing. The different stages are:
  • Gaining a reputation
  • Finding collaborators
  • Finding a job
  • Obtaining funds
  • Planning experiments
  • Learning about discoveries
One of the challenges in the past was that the act of publication was the only thing you could measure. However, companies are now providing tools to show what you are doing. When tools are developed online, activity can be measured more accurately, so credit can be given where it is due. Examples include Altmetric, Symplectic and Thomson Reuters. Finding collaborators has also changed, with projects such as ResearchGate and Frontiers facilitating this.

For the researcher, the cycle is:
  • learning about discoveries
  • planning experiments
  • conducting experiments
  • evaluating results
  • sharing results
  • publishing discoveries
Publishers should be in the business of putting the right information in the hands of researchers and authors. In a networked digital world, much more can be done if we think about it in the right way. Publishers also need to skill up in the right way. If researchers are your main market, there is so much more that can be done.

Publishers are in fear of Google, Apple and Amazon and lump them all together. Yet they are very different businesses: one is a retailer, one is an advertising network and one is a hardware company. Their common success factor is that they are amazing at technology. There is a direct correlation between mastering technology and success.

Publishers should own their technical development for their markets.

Tuesday, 26 February 2013

ASA 2013: Lorraine Estelle on Be Careful What You Wish For

Lorraine Estelle addresses the ASA conference audience
Lorraine Estelle, CEO of JISC Collections gave us a clear reminder: this isn't about quality, this is about cost. We have to consider potential unintended consequences of new models so we don't end up recalling the 'golden age of the big deal.'

In the beginning, we had the 'big deal', and it is still the predominant model for journals. It is not without its critics: even a number of mild-mannered people are very critical of it. The big deal causes so much inflexibility in library budgets that it particularly impacts arts and humanities collections.

The major problem with the big deal is the underlying pricing model, which was based on a print concept and a subscribed/non-subscribed model. When it was introduced, university libraries were required to maintain payments for their print journal subscriptions and pay an extra charge for e-access to gain access to all the non-subscribed titles. It is hard to believe that this base cost of subscribed journals persists in many big deals today - almost 20 years later. This forced maintenance of the base cost (historic print spend) is what makes the big deal so inflexible. To compound the issue, she has yet to find a publisher that can provide metadata to justify that value.

What are the alternatives?
  1. value-based pricing: you pay for what you use
  2. gold open access: you pay for what you publish (at article level)
Value-based pricing is a new digital pricing model directly linked to usage, supposed to enable the move away from the historic print spend. Estelle cited the American Chemical Society, who have had a good go at implementing value-based pricing: they show a price per article on their website of 26 cents, compared to $3-4 from Elsevier, Wiley and other commercial publishers. But what happens at an institutional level when it is implemented? Benedictine University (Source: Inside Higher Ed, 2011) reported a whopping 1,816 percent price jump for 2011 due to increased usage.

While they probably had a really good deal to begin with, this highlights the real problem of winners and losers: the more an institution reads, the more it pays. In essence, the best customers are the ones that will have to pay much more.
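
To make the winners-and-losers arithmetic concrete, here is a toy sketch; the per-article rate is the ACS figure cited above, while the download volumes are invented purely for illustration:

    # Toy illustration (invented volumes): under usage-linked pricing,
    # cost scales linearly with downloads, so the heaviest-reading
    # institutions pay the most.
    PRICE_PER_ARTICLE = 0.26  # dollars per download (the cited ACS figure)

    for downloads in (10_000, 100_000, 500_000):
        cost = downloads * PRICE_PER_ARTICLE
        print(f"{downloads:>7,} downloads -> ${cost:>10,.2f} per year")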

Gold open access is a model where you pay for what you publish. It avoids the difficulties of top slicing (where librarians aren't involved in purchasing decisions). In June 2012, the Finch Group estimated the additional cost for the UK to move to gold open access at £38 million per year. In April 2013, RCUK is introducing block payments to cover APCs for universities and eligible research institutions.

There is an interesting dilemma about winners and losers. UK institutions will be expected to pay the processing charges so that papers by authors in their institutions are freely available around the world. However, those same institutions will still be required (for the time being) to pay for subscriptions to papers published by authors in the rest of the world (c. 94% of all articles). Funds for APCs are most likely to come from existing research budgets - not library budgets, which will need to be maintained at current levels.

Essentially, the more an institution within the UK publishes, the more it pays! Institutions are looking for recognition from UK publishers that the issue needs to be considered at a local (UK) or institutional level. It is certainly not sustainable in the future.

The unintended consequence, and part of the problem, is that an increase in article downloads is associated with an increase in articles authored, which in turn is associated with increases in PhDs granted and grants won. Value-based pricing and gold open access are both models directly linked to usage. To control costs, an institution may need to control use by:
  • restricting the number of articles downloaded
  • restricting the number of articles published.
The argument that research-intensive universities can afford it no longer stands up to scrutiny. University capital budgets have been hit by cuts, and this affects what is available for research. Estelle closed with a stark statistic: the total BIS grant for 2012/13 is £5,311 million, compared to £6,507 million for 2011/12 - an 18.4% cut.

Be careful what you wish for: the 'big deal' may be remembered as the golden age.

ASA 2013: Laura Cox on Pulling Together - Information Flow Throughout the Scholarly Supply Chain

Laura Cox with a messy and complex supply chain
Laura Cox, Chief Marketing Officer at Ringgold Inc, talked through the problems of information flow throughout the scholarly supply chain. If publishers would only use the right identifiers with their content, there is a huge opportunity to improve information, insight and cost efficiency.

What are the things that go wrong? Records are unconnected through the supply chain. Links fail between entities, between internal systems, and between external systems. Renewals are mishandled. Journal transfers, access and authentication are mishandled. Authors and individuals are not linked to their institutions. Open access fees have to be checked manually. Authors are not linked to their research, and funders are not linked to the research they fund.

We need to find a path to using standardized data, and identifiers can help. They provide a proper understanding of customers, whether author, reader or institution. They also provide a simple basis for wider data governance (that is, the processes, policies, standards, organization and technology required to organize, maintain and access/interpret data) by:

  • supporting ongoing data maintenance
  • enforcing uniqueness
  • enabling ongoing data governance
  • ensuring systems work together
  • helping cleanse data for future use.

Cox cited research from Mark Ware and Michael Mabe (The STM Report, 2012) for the wider context:

  • Journals are increasing by 3.5% per annum 
  • There is an increase in the number of articles by 3% per annum
  • The number of researchers is increasing by 3% per annum
  • Growth in China is in double digits
  • There is increasing demand for anytime, anywhere access
  • Library budgets are frozen.

There are a number of identifiers available. For people, there is the International Standard Name Identifier (ISNI), which can apply to authors, playwrights, artists - any creator - and acts as a bridge identifier. The Open Researcher and Contributor ID (ORCID) links research to its authors. It disambiguates names, looking at the different ways in which they can be recorded, and can help remove problems with name changes. These IDs can be embedded into research workflows and the supply chain, linked to altmetrics, and used to integrate systems.
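
As a rough sketch of how such an identifier can be consumed programmatically, the snippet below queries ORCID's public API; the endpoint and version are assumptions (check ORCID's current documentation), and the iD is ORCID's own published example record:

    # A sketch of resolving an ORCID iD to a structured person record.
    # Endpoint/version are assumptions; see ORCID's documentation.
    import requests

    orcid_id = "0000-0002-1825-0097"  # ORCID's published example iD

    response = requests.get(
        f"https://pub.orcid.org/v3.0/{orcid_id}/record",
        headers={"Accept": "application/json"},
        timeout=10,
    )
    response.raise_for_status()
    record = response.json()

    # One unambiguous identifier links names, affiliations and works.
    print(record["person"]["name"])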

Institutional identifiers include Ringgold and ISNI, which map institutions and link data together. This ensures you can identify institutional customers so you can deliver the correct content, and it disambiguates institutional naming conventions.

If you put institutional and author IDs together you gain genuine market intelligence:

  • who's working with whom and where
  • impact of research output on a particular institution - the contribution of their faculty
  • subscription sales or lack thereof
  • where research funding is concentrated
  • the ability to track open access charges (APCs) against fee structures.

With internal linking in your systems, you can use identifiers to connect (see the sketch after this list):

  • customer master file
  • financial system
  • CRM/sales database
  • authentication system
  • fulfilment
  • usage statistics
  • submissions system
  • author information.
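
A minimal sketch of the idea, with invented records: once every internal system keys its data with the same institutional identifier (a hypothetical Ringgold ID here), a single lookup can assemble one view of the customer:

    # Invented records from three internal systems, all keyed by the same
    # (hypothetical) Ringgold institutional identifier.
    crm        = {"12345": {"name": "Example University", "contact": "j.smith@example.edu"}}
    fulfilment = {"12345": {"titles": ["Journal of Examples"], "status": "active"}}
    usage      = {"12345": {"downloads_2012": 48211}}

    def customer_view(ringgold_id):
        """Assemble one view of a customer from every system sharing the ID."""
        return {
            "crm": crm.get(ringgold_id),
            "fulfilment": fulfilment.get(ringgold_id),
            "usage": usage.get(ringgold_id),
        }

    print(customer_view("12345"))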

This enables you to access information from multiple systems in one place, reducing time and cost in locating information, and enabling you to use information to make decisions and inform strategy.

A nice and tidy supply chain
External linking using identifiers will enable you to:

  • ensure accuracy of information
  • speed up data transactions
  • reduce queries
  • reduce costs
  • open data up to new uses
  • provide a seamless supply chain where data flows from one organisation to the next
  • ensure that authors receive credit for the work they produce
  • provide a good service to the community.

We need a forum to discuss and pull together: to engage with the problems in data transfer, generate an industry-wide policy on using identifiers, break down the data silo mentality, and use universal identifiers to enable our systems to communicate with each other accurately on an ongoing basis. This will help serve the author and reader more effectively and strengthen the links in the supply chain.

ASA 2013: Ed Pentz on CrossMark - A New Era for Metadata

Horse burger, anyone?
Ed Pentz, Executive Director at CrossRef, provided an overview of how CrossMark provides information on enhancements and changes to an article - even if it is downloaded as a PDF to your computer.

With a slide showing a horse-shaped burger, Pentz observed that no one knew what was happening in the food supply chain and ingredients were mislabelled. As a consumer, it's hard to know what's verified, and third-party certification such as Fairtrade or the Soil Association mark has arisen to help consumers. This is an important lesson for the scholarly publishing community.

Pentz was not talking about bibliographic metadata, but about some of the things that are changing in broader descriptive metadata. What are users starting to ask? They are interested in the status of the content: what's been done to this content, and what can I do with it?

Good-quality metadata drives discovery. However, there are problems with metadata and identification. This is a challenge for primary and secondary publishers: the existing bibliographic supply chain hasn't been sorted out, new things are being added in, and this could potentially lead to big problems.

Two weeks ago, NISO announced work on standards for open access metadata and indicators. The detail is still to follow, and will include things like licensing; whether an APC has been paid; and, if so, how much and who paid it. These factors will be particularly important in helping to identify open access articles in hybrid journals.

There are a number of new measures that have to be captured via the workflow. These include:

The FundRef Workflow
  • CrossRef has launched the FundRef pilot to provide a standard way of reporting funding sources.
  • Altmetrics allow you to look at what happens after publication: aspects of usage, post-publication peer review, capturing social buzz and getting beyond impact factors.
  • PLOS has article-level metrics, available via APIs.

What about content changes? Historically, the final version of the record has been viewed as something set in stone. We need to get away from this idea because it doesn't recognise the ongoing stewardship publishers have for the content.

Many things happen to the status of content - post-publication - including:

  • errata
  • updates
  • corrigenda
  • enhancements
  • retractions
  • protocol updates.

As we have heard throughout the conference, the number of retractions is on the rise. Pentz referred back to an article in Nature 478 (2011) on the trouble with retractions in science publishing. The case is clear: when content changes, readers need to know, but there is no real system for this.

In a digital world, notification of changes can be done more effectively, and that's what CrossRef is all about. Another challenge is the use of PDFs: once downloaded, there is no way of knowing whether the status has changed. Even online, the correction is often listed below the fold, including in a Google search. The whole issue of institutional repositories is also a factor.

What is CrossMark? It is a logo that identifies a publisher-maintained copy of a piece of content. When you click on the logo, it tells you whether there have been updates, whether the copy is being maintained by the publisher, where it is maintained, what version it is, and other important publication record information.

Taking the example of the PDF sitting on a researcher's hard drive: the document carries the CrossMark logo, and clicking it reports whether the PDF version is current, with a link through to any clarification. It includes a status tab and a publication record tab. The record tab is a flexible area where publishers can add lots of non-bibliographic information that is useful to the reader, for example, peer review, copyright and licensing, FundRef data, location of repository, open access standards, etc.
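
As a rough sketch of the kind of status check this enables, update and funding metadata can be fetched per DOI from the CrossRef REST API; the DOI below is a placeholder and the update-related field names are assumptions, so consult CrossRef's documentation:

    # A sketch of checking a document's update status via the CrossRef
    # REST API. The DOI is a placeholder; field names are assumptions.
    import requests

    doi = "10.1000/xyz123"  # placeholder DOI

    response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=10)
    response.raise_for_status()
    work = response.json()["message"]

    # Corrected or retracted works carry pointers to the updating items;
    # FundRef funding data travels in the same record.
    print("updated by:", work.get("updated-by", "no updates recorded"))
    print("funders:", work.get("funder", []))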

Lots of things can be enabled by this, in tools such as Mendeley. Pentz showed a demo of how a plugin for Google might be written that flags CrossMark results when you search. CrossMark was launched in April 2012 and has been developing slowly, with 50,000 pilot deposits since launch and 400+ updates. They are working with 20+ publishers on CrossMark implementation.

Monday, 25 February 2013

ASA 2013: David Myers on Buying and Selling Content - Don't Forget the Small Publisher

David Myers of DMedia Associates focused on buying and selling content, but from a small publisher perspective. He offered some general industry statistics for context:

  • worldwide there are 25,000 scholarly journal titles from approximately 2,000 publishers
  • only 60% or so are indexed by some database
  • 3% are open access journals, but this will grow
  • historically, small range of well-respected content
  • now, ability to have a comprehensive search experience.

New(er) trends include:

  1. CLOCKSS/PORTICO and others - publishers need to participate in a digital preservation initiative
  2. tablets and mobile - pay attention:
    • 112.5m US adults will own a tablet by 2016 - more than a third of the US population
    • tablet adoption is up by 400% (source: Nielsen)
    • the majority use them for reading information
    • owners spend 14 hours per week with their tablets
  3. ebooks:
    • a $90 billion global market in 2011
    • expected to grow by 18.9% by 2016
    • but in certain parts of the world and for certain customers, the budget for ebooks does not even sit in the same part of the institution - you need to understand the arbitrariness of where money comes from with some customers.
He believes that grassroots direct marketing that leads to trials still remains the primary vehicle for adoptions. Co-marketing of titles with a particular focus, thereby creating a certain level of critical mass, works. Novel or targeted offerings help you stand out. Bespoke collections where the user chooses are also popular.

Myers' advice to small publishers looking to thrive:

  • watch commissions and margins
  • pay attention to small details, figure it out and give it to the customer
  • help those who help you
  • be responsive
  • invest for the future
  • let renewals be handled by experts.

The key take-away points for publishers are about flexibility and remembering who you are dealing with, as personal relationships matter. External and political issues dictate the environment. Administrative issues are important. Last, but not least, focus on the biggest opportunities.

ASA 2013: Janet Peters on Order out of Chaos? Trends and Forecasts in the Scholarly Acquisitions Landscape

Janet Peters, Director of Libraries and University Librarian, at Cardiff University reflected on whether or not there would be order out of the current chaos for libraries.

She asked the question: how predictable are students and researchers?

  • Will the number of applicants to universities continue to fall? December 2012 figures show applications down another 6.3%, hitting universities hard in terms of income.
  • Will there be more demand for distance learning and part time courses - delivered globally?
  • Will research funding be focussed in a smaller number of centres?
  • How well will UK HE compete with global and/or private educational providers?
The marketplace for quality is changing. With student funding, there has been a shift from the government/taxpayer to the student, and there is no more money overall. Student choice is more evidence-based, with the use of Key Information Sets and the National Student Survey. The Research Excellence Framework will have significant levels of funding attached.

Library purchasing trends of note are:

  • more than two thirds of information provision expenditure is now on electronic resources compared to less than half five years ago
  • the average number of books added per FTE user was 1.1 in 2010-2011, which represents 2.0% of total stock, compared to 2.8% in 2000-2001
  • library expenditure as a proportion of total institutional income is going down
  • at Cardiff, ebook purchasing has risen
  • they bought 27,500 books in 2004-05, down to 15,000 in 2011-12
  • expenditure has only dipped a little, but the number of books bought is much lower.

So how do librarians keep everyone happy? They are adopting a 'just in time', not 'just in case' approach. Purchasing is based on evidence. The decision to retain (subscriptions or books) has to be justified. Statistics are essential: how much is x used by whom.

Anytime, anywhere access, for any number of readers, is a priority. Ebooks or scanned articles and chapters for reading lists mean libraries are increasingly spoon-feeding students (particularly first years) and disaggregating material. They are also exploring new models such as patron-driven acquisition.

They are innovating through new services, including federated search (LibrarySearch@Cardiff), rare books management and social media, while taking a rebuild, refresh and promote approach.

At the same time they are trying to improve the value from supply chain through various channels:

  • resistance to price increases
  • RLUK campaign
  • JISC Collections
  • Consortial purchasing (SHEDL, WHEEL) and direct purchase from publishers (OUP has made good progress with country-wide access, while existing institutional subscribers still pay approximately what they did before)
  • Shared services
  • Consortial storage of journals (UKRR)
  • Licensing information - KnowledgeBase+
  • Cataloguing? Shelf ready: specialist hubs; master record
  • Library Management Systems?

Meanwhile, changes in publishing continue apace with open access, including the Finch Report's green self-archiving route and gold article processing charges (APCs). Peters made a plea to publishers to make it as easy as possible to access the transitional funding by getting invoices out as quickly as they can. She noted that the RCUK funding in April should make things easier. There is a growing marketplace for APCs, although this depends on author behaviour, national negotiations and cross-sectoral licensing (e.g. HE/NHS).

So what is next? There are challenges around disaggregated and linked data. Students already see separate bits of knowledge and are not aware of the historic wrapper. This changes the whole issue of guaranteeing quality, so they are having to train students to check provenance, etc. Is there a future for the journal as an entity? Is the future of bundles secure? And what about the future of metadata - will that be open too? They also need to consider who provides access to research data.

Peters closed by considering how subscription agents can help bring order out of chaos:

  • They can broker APCs as intermediaries (see the OAIG report)
  • Join Open Access Key (OAK)
  • Develop standards for data transfer between universities/funders/publishers on APCs
  • Create new business models and provide access to articles for library purchase (find the value-add compared to Google Scholar or the British Library) and curate and provide access to data.

ASA 2013: Robert Kiley on Scholarly Communication - A Funder's Perspective

Robert Kiley, Wellcome Trust

Robert Kiley is Head of Digital Services at the Wellcome Trust. They are a key advocate and funder of open access communication, and as Kiley wryly observed, open access is a topic that just keeps on giving.

The Trust has had an open access policy for several years, and each month they check compliance, currently running at 55%. As open access is not an option but a grant requirement, they introduced sanctions last summer. From 1 April 2013, their policy requires CC-BY, which he believes has created an awful lot of noise, some of it misinformed, some of it mischievous. If CC-BY is not offered, the research won't be able to go down the gold route and will have to go the green self-archived route, which is not their preferred route.

Open access has become mainstream, with support at the highest level of government. HEFCE will launch a consultation on how to measure open access for the REF 2020. There are developments in the EU and US: FASTR proposes a maximum six-month embargo and re-use requirements, and he noted that with Friday's announcement the US has gone for a 12-month embargo, which he believed was a step in the right direction. Kiley is a strong supporter of PLoS One and eLife, but was less enthusiastic about publishers' 'me-too' efforts.

There is a growing recognition that allowing everyone to read 'stuff' is important, but allowing computers to read 'stuff' is even more important. He noted that 48% of the content added to PubMed Central in 2011 was included in the open access subset.

He discussed the rise of intermediaries and noted that, other than funding, one of the biggest problems researchers face when opting for open access is paying the APC. Providers are now stepping in to plug this gap, including Open Access Key, Copyright Clearance Center, EBSCO and others.

The University of Edinburgh puts the average cost of an APC at c. £1,741, while the Wellcome Trust has found an average of c. £1,797 across the APCs it funds. They are investing money in their own publishing activities, currently 1.2% of their total spend, but expect this to rise to 1.5% in the future.

Kiley refuted the assertion that gold is a UK-only phenomenon, citing data showing that 22% of articles published in PLoS One were NIH funded. Hybrid articles are also being funded: 48% of articles routed through the ACS 'AuthorChoice' option were NIH funded. In 2012, the majority of published authors in PLOS were (in order of uptake) from the US, China, Germany and the UK.

One strength of the 'gold' model is transparency. He believes there is a risk that if we move from 'big deal subscriptions' to 'big deal APCs', the transparency benefits will be lost. He also suggested that 'funder provides funding / researcher is not bothered by cost / publisher sets price' might be a dysfunctional market. He senses that authors need to be aware of price, that other funding models (e.g. COPE) need exploring, and that subscriptions need to take account of APCs, perhaps with differential pricing.

Kiley closed by outlining how they still believe dissemination costs are research costs. The cost of 100% open access is probably around 1.5% of their research spend. The move to open access continues to gain momentum through policies, publishing ventures and intermediaries. Gold open access is not just taking hold in the UK and funders need to ensure that the APC model remains competitive.

New to ALPSP - Royal Statistical Society

We are pleased to welcome the Royal Statistical Society as a full member of ALPSP.

ASA 2013: Herman Pabbruwe - Authoring and Publishing - It's Getting Better

Gabriella Karger introducing Herman Pabbruwe, CEO of Brill
Herman Pabbruwe, CEO of Brill, gave a talk that was optimistically titled 'I've got to admit it's getting better.' He provided a view from humanities publishing to contrast with earlier scientific sessions.

He believes that publishers are coming to grips with disruptive technology. They are taking stock and thinking about whether people appreciate their roles, and how sustainable those roles are. They are starting to take advantage of technology, a richer environment and mobility, while developing multiple business models (some of which are revived from before).

There is growing international demand and a revival of unique library collections. There is increasing value in data, so publishers have to reflect on how to bring all the necessary information together to enable appropriate validation. This is increasingly difficult and expensive to achieve, and you need provenance and context: an essential step, otherwise it is just a collection of items. This is the challenge of publication at source.

They are trying to coach a younger generation of researchers, but there is a huge problem with sharing notebooks, as people don't want to share something that isn't right (reflecting what Jeremy Frey said in his earlier talk).

Pabbruwe reflected that some things don't change much in research publishing. There is still tenure. The publish-or-perish attitude still prevails, and you need to build in 'twigging', or developing new researchers. The costs of publication are still there, as is sourcing for the user.

Some things still need work: the perception and acceptance of a 'new generation' of products; workflow and efficiency; copyright (green) and the definition of patrons; access rather than ownership and the effects of pick-and-choose (i.e. patron-driven acquisition); broad digitisation; and library and open access budgets.

Issues in publishing abound, from the chicken-and-egg scenario for e-products to archiving (open access, open data, MOOCs). There are technical hurdles (e.g. Unicode, bandwidth) and challenges to publishers' public image. And copyright concepts continue to present challenges.

A possible direction might be to:

  • focus on key subject areas
  • have a balanced portfolio of journals, books, reference works and primary sources
  • be platform independent (consider price differentiation?)
  • have more empowered commissioning staff working with a bigger and stronger sales force
  • re-enforce the relationship with end user communities
  • focus more on Asia.

Publishers need an increased technical focus on standards, proven technology (including from outside the industry) and XML workflow for rich digital products (e.g. e-humanities). They should consider selected areas of software development (thesauri, proper names, pattern recognition) and think about how to enrich the author experience through services and direct links. Brill has created a proprietary typeface based on Unicode.

He hopes it will get better all the time: that publishers will use their resources to create and make a difference, and develop a multi-faceted and direct relationship with key institutes of higher education. Publishers need a flexible attitude towards copyright, and should be loyal to partners, but professional.

ASA 2013: Jeremy Frey on Reputation, Responsibility, Reproducibility and Reuse - A Scientist's Perspective on the Role of Publications


Jeremy Frey addresses the ASA conference audience
Jeremy Frey is Professor of Physical Chemistry at the University of Southampton. He covered a scientist's perspective on the role of publications and pointed out that the open access debate is really about the economics of who pays, as someone has to. However, he believes the consequences for accessibility differ depending on who does.

The consequence of not having open data is that your work cannot be checked by someone as bright as you, even for errors made in good faith. There are ways of validating data, but they depend on having access to the data and a willingness to host that data. In his opinion, the Royal Society report has come up with the best phrase so far: 'intelligent open access to data'. It's not just access to the data that matters; it's also critical to record somewhere how to get use out of the data.

Some projects need large amounts of data from the literature. Access can be an issue. A digital repository is one answer, but does this lead to subversive and furtive sharing and exploitation of data in a virtual space? Can people understand how to use it?

One of the issues is that people in the scientific community put all emphasis on the answer, not the working out. And yet the process of coming to an answer is the most important part: 'methods are as important as the data'. The BBC 'Climategate' campaign is a good example of the benefit of doing this. Even Faraday in his notebooks included a huge amount of detail about how he went about his experiments.

What are researchers trying to get out of the whole publication process? Dissemination, reputation and advancement. The scientific information supply helix historically ran from researcher to literature to abstraction/search and back to the researcher. Now you can add open notebook science, which makes the flow go from researcher to the web(?) to search to researcher, though this is rarely the only dissemination route. There is a need for validation, traceability and accountability to support recognition, promotion and future funding.

Some journals limit the number of references in an article because they are constrained by page count. This is a real problem in an interdisciplinary world. However, this is shifting as online publication removes the space restriction.

There are a number of problems with interdisciplinary research and dissemination, as the methods and emphases vary with discipline:
  • journal versus conference
  • pure versus applied
  • research versus application
  • authorship and acknowledgements
  • who goes on the paper?
  • who is mentioned in the press release?
  • do you acknowledge the technician?
  • e.g. maths is like the humanities - the student works on their own - while science is teamwork.

He believes that researchers also need to reflect on how they communicate data. The Smart Tea Project was a way to create a harness for recording data as you go along, but it was a bit overbearing for students. They then moved to LabTrove, which has more of a blog style with a highly interlinked system. Other issues to consider are diversity and disability. Another example is the BlogMyData Project. This type of project has an impact on researchers: it can lead to a higher-quality record, easier collaboration, improved planning and better proactive interaction, and can run in a completely open or closed way.

Access to the discussion, and the means to reproduce the analysis, is really important to the way the community will work in the future. Things have to change, and at pace.

ASA 2013: Fiona Godlee on Who is an author and why does it matter?


In the first session at the Association of Subscription Agents and Intermediaries conference in London, Fiona Godlee, Editor-in-Chief of the British Medical Journal, discussed the challenges of authorship credit and the move towards contributorship. Who is an author and why does it matter?

Life and research are far more complicated than they were. We now have multi-centre, multi-author, multi-function research going on. There are questions about who wrote the text, who did the statistics and who did the lab work. Research has always been competitive - even more so today - with researchers fighting for first-author place. Then there is the commercial interest: the use of journals for marketing by the medical and pharmaceutical industries.

The International Committee of Medical Journal Editors has guidelines on authorship credit and has added to these to try to tackle some of the emergent issues. The idea of contributorship revolves around credit and responsibility: being explicit about roles and contributions, and revisiting responsibility for complex studies with a guarantor role.

The things that bother editors are:
  • bias
  • data manipulation or suppression
  • duplicate publication or salami publication
  • fabrication of data
  • falsification of data
  • gift authorship
  • plagiarism
  • self delusion
  • undeclared conflict of interest
  • wrong observations, analysis or references.

Source: Stephen Lock, BMJ
You can map these across to what constitutes errors in good faith, 'trimming and cooking', or fraud. Stephen Lock at the BMJ mapped these issues (see Fiona's slide, right).

The reality is that the lack of open data is creating major problems with verifying research. Questions around Tamiflu are a case in point, with contradictory statements confusing the picture, which was compounded by the lack of data available for checking.

Concealment of data is a very serious offence. In the US, you are required by law to register your trial and publish the data within a year of the trial's end. Yet a large proportion do not. The BMJ is trying to do its bit by insisting that clinical trial data be made available alongside the article.

With the BMJ open data campaign, Tamiflu is the test case. However, industry-funded trials are not the only transgressors; non-commercial trials are just as guilty. But there are signs of compliance. The UK HTA programme is an exemplar: they badger authors, withhold funding and publish on an open access website. This approach has resulted in near 100% compliance.

New to ALPSP - Chinese Laser Press

ALPSP is pleased to welcome Chinese Laser Press as a new Full Member.

Monday, 11 February 2013

ALPSP member strengthens publishing department



The Society for General Microbiology (SGM) is pleased to announce two managerial appointments to its Publishing Department.


Kerry Cole has been appointed as Sales and Marketing Manager. Kerry joins SGM Publishing following previous roles at The Stationery Office and Portland Press.

Rachel Walker has been appointed as Publishing Operations Manager. Rachel transitions to this role from her current position as Managing Editor for Microbiology. 

Kerry and Rachel join the Society’s Publishing Leadership Team headed by Leighton Chipperfield, Head of Publishing.  Leighton said, “I am delighted to welcome Kerry and Rachel to the leadership team. These appointments are key to the execution of our forward publishing strategy."  

-ENDS- 

Notes for Editors 

  • For further information please contact Dariel Burdass, Head of Communications, Society for General Microbiology. Tel +44 (0)118 988 1843 Email d.burdass@sgm.ac.uk
  • The Society for General Microbiology publishes four key journals of international repute: International Journal of Systematic and Evolutionary Microbiology, Journal of General Virology, Microbiology and Journal of Medical Microbiology (all monthly). The journals contain high-quality research papers and topical review articles. The online versions are published with the assistance of HighWire Press, with many added features and functions to aid the reader, and can be accessed via www.sgmjournals.org. 
  • SGM is a membership organisation for scientists who work in all areas of microbiology. It is the largest learned microbiological society in Europe, with a worldwide membership based in universities, industry, hospitals, research institutes and schools. The SGM publishes key academic journals in microbiology and virology, organises international scientific conferences, provides an international forum for communication among microbiologists and supports their professional development. The Society promotes the understanding of microbiology to a diverse range of stakeholders, including policy-makers, students, teachers, journalists and the wider public, through a comprehensive framework of communication activities and resources.

Friday, 8 February 2013

New to ALPSP - IOSH

We're pleased to welcome the Institution of Occupational Safety and Health as a Full Member of ALPSP.

New to ALPSP - Spandidos Publications Ltd

ALPSP is pleased to welcome Spandidos Publications as an Associate Member.

"The 3 Rs: Reach, Readership and Revenues"



There's still time to book for the ASA Conference in London, 25-26 February: http://subscription-agents.org/conferences/asa-conference-2013-booking-form. Non-member rates are only £440 (inclusive of meals); discounts apply for delegates from ASA member companies and librarians.

The programme is organised around four sessions covering the main stages of supply of content from authors to readers: Authoring and Publishing, Buying and Selling Content, Discovery and Metadata, and Content Usage and Quality. Also scheduled for the conference is a panel discussion involving leading figures from the academic content world which will debate both traditional and open access business models, trying to answer the question, "Who pays: government, philanthropists, funders, authors, institutions, libraries or readers?"

The ASA conferences underscore its members' credo: maintaining the highest standards of business practice, providing excellence in service and support, facilitating transfer of knowledge to a growing global audience, and helping sustain the knowledge economy. Covering the hottest topics from publishing, intermediary, library, consortia, and emerging technologies, the conference is set to continue its unique and respected tradition of debating and shaping the future of our industry.

If you require any additional information, please contact Nawin Gupta at info@subscription-agents.org.

PSP 2013: The Professional Book: Past Its Sell-By Date?

Shanahan, MacInnis, Usatine and Grillo
Carrying on the theme from Harrison Coerver's 'are associations dead?' session on day one, Scott Grillo, Vice President and Group Publisher at McGraw-Hill Medical, chaired a session asking whether professional books are due for extinction.

Unsurprisingly, the answer was no. Well, sort of, but you have to think about all devices or technologies now, not just print.

James F. Shanahan, Editor-in-Chief and Associate Publisher at McGraw-Hill Professional, feels you have to take into account the evolution of print products. Medical reference information has evolved from monographs, textbooks and list summaries, to the shelf-bending tomes of the 90s and 00s, through concise print references, to the present: comprehensive databases delivering quick answers at the point of care, married to workflow resources.

If you think hard about how best to solve a problem, you can be rewarded - even in print, although it's not so sexy these days. There is a changing value proposition for medical content, and the future will be based around problem solving. Print does remain viable, but for business reasons (i.e. profit margin, subscription renewal rate) you might find it being dropped.

Professionals face the same problems: time, money, patient safety, quality and outcomes, documentation of procedural skills, licensure and certification. But they also face newer problems: documenting competency training, figuring out how to relate income to outcomes, and filtering the overwhelming amount of new content that has no structure or borders. Publishers also need to consider where folks go to solve their problems: Google, Wikipedia, PubMed, social media, viral media, associations, lecture capture systems, as well as faculty with time and gadgets.

The future of medical content is digital. Anyone who thinks it isn't is fooling themselves. Content in context of everyday work is essential. Medical content will be increasingly oriented towards institutional customers as they move towards institutional or large group practice. The future of medical content is also multi-platform and re-usable. If the title of a new book proposal sounds like a Google search result, it will probably struggle to stand out in the market.

Key questions to consider are:

  • Are we talking literally about print books, or about content that is book length but delivered digitally?
  • Is medical print increasingly archaic, a thing of the past? Is it truly 'past its sell-by date?'
  • Do publishers want to keep investing in medical print?
  • What do they lose if they conclude that print is past its sell-by date?
  • Do health professionals want to keep using medical print?
  • Do technology partners replace medical print, or add to it?

Dr. Richard Usatine, a Family and Community Health Professor, VP of a media app company, and Editor-in-Chief of the Family Medicine Digital Resources Library, has a strong attachment to print books. He contrasted their survival with the demise of vinyl. While he clearly holds a traditional view of the value of print, he also acknowledged that there is a time and a place for all the different types of medical content, ultimately coming across as platform agnostic. He cited Epocrates as the number one application driving doctors to the smartphone. He is a user himself, and finds checking dosing and drug interactions quicker than he could on the internet or in a book.

Usatine observed that students are now given a selection of electronic books from a selection company for their course. They like that the books are light, living on a laptop, iPad or tablet. Students don't necessarily know the names of the authorities when they first come in to the school; that is something they pick up through their course. They want quick access to information, and if they are given a lot of books as part of their fees, the book still has value, but no longer has (literal) weight. A variable set of media is being used. Pedagogy is the art and science of education, and people will always grab the best tool for the job. Even if that is digital, it doesn't mean print will go away. You can build better tools, but there is still value in paper for its look, feel and browsability.

Matt MacInnis, CEO of Inkling, a digital text technology company, was more forthright: he doesn't care if the book is in decline. Remember what publishers do: they curate and produce quality content. The reality is that whatever device you select to communicate with a subject expert, you have to make it trusted. Consider whether the reader is looking for a specific answer to a question or wants to browse. The book may be a better device for a particular question or problem, in which case, go ahead and keep printing them.

Things are going to continue to change. Get used to it. Talking about 'an ebook' is silly. Go online, look at an ebook and ask yourself why it is incapable of handling a table properly. We need a whole new set of methods for this new medium, and ebooks will evolve and develop into a richer experience. (InDesign still asks you to confirm inches and margins when you go to 'new'.) Trust and usability have to carry across to new platforms. His recommendation to editors: whatever the discipline, be really honest with yourself about what problem you are solving, and define it by what that person is trying to do. Then think about format. Be more creative and more aggressive about how the platform you choose solves these problems.

Grillo observed that people often point to pedagogy and authority as the reasons books continue to survive. Does that still hold true? Do students now say 'it's a good answer because I found it in a book', or do they say 'it's a good enough answer because I found it on Wikipedia'? Shanahan disagreed with the idea that pedagogy will keep the book alive; he believes it is third or fourth on the list, and authority is the key thing. When Grillo asked whether pedagogy is a sub-set of usability (it's not so much the death of the book as the redefinition of the book), MacInnis responded that pedagogy is not so much a decision about moving from book to digital; it's more that the technology hasn't moved on yet.

MacInnis made a powerful closing point: it's dangerous to think in terms of a transition from print to digital. This is a transition from thinking about only one kind of media to thinking about many different types. It's a transition in thinking, not a transition from print to digital.

Thursday, 7 February 2013

PSP 2013: The Future Value in the Professional Association

The social media challenge ACS tackled
In the first session at the Professional Scholarly Publishing conference, Harrison Coerver asked if associations still have a future. The following panel outlined the value they provide in the internet age and what services they will have in 2018.

Madeleine Jacobs, CEO of the American Chemical Society (ACS), provided an overview of their membership and services. They have over 163,000 members, with 187 local sections and 32 technical divisions. 87% of members have degrees in chemistry, and more than 60% work in business and industry. 15% of members live outside the US, and the society has two main offices, in Washington DC and Columbus, Ohio. They have a governing board for publishing and run - wait for it - 485 programmes. That's a big association with a lot of programmes.

The ACS has four strategic goals:

  1. Provide authoritative and indispensable information
  2. Advance members' careers
  3. Improve science education
  4. Communicate chemistry's value
How are they tackling the Race for Relevance? One of the key initiatives has been to set up a tech trends roadmap: a group of staff monitor and review, on an ongoing basis, what the latest technology can deliver and what is relevant for members. They identified four challenges and four solutions:

  • embrace social media
  • make information portable
  • expand electronic publishing
  • make ACS national meeting presentations more accessible.

They have launched award winning apps, expanded their electronic publishing, focused on providing information through the Chemical Abstract Service and continue to build more virtual services. With 92-93% of revenue coming from publishing, it is critical that they continue to adapt and deliver value.

Her final piece of advice? Hire the very best tech people you can find. Steal people from other organisations who have done well. And don't forget to empower great people you already have on staff.

James Prendergast, COO of the IEEE, outlined the historic merger between the AIEE and the IRE, driven in the main by the fact that the IRE was moving a lot faster through its international reach and its open welcome to students and young professionals - something Coerver advocates strongly. The IRE was quick to adopt advances; the AIEE recognised this, and the two merged in the 60s.

The IEEE has more than 429,000 members globally. Their Xplore Digital Library holds more than 3.25 million articles, they have 900 active standards and 500+ standards in development. They held 1,396 conferences globally in 2012 and have 572 already in place for 2013. (Cue sharp intake of breath from the audience.) They have interactive HTML full-text articles with unique and active features - not just vanilla HTML.

Jacobs, Prendergast and Dylla with Coerver
The IEEE has completely embraced open access and provides options for all authors. All their technical journals are now hybrid, with subscription and open access options. A number of journals are now fully open access, and in 2012 they announced a new mega-journal that will be fully open access and multi-disciplinary.

They launched OpenStand for their standards in August 2012. Informational webinars provide new ways to deliver content. Pre-university outreach portals such as TryComputing and TryEngineering are proving a big hit: TryEngineering received 12 million hits in 2012, with average time on site running to tens of minutes. They are fostering engagement through social media and have expanded their online impact through an extreme programming competition and IEEE Day.

Prendergast advised delegates to focus on mobile: your services have to arrive at the right time and in the context of when and where members want to receive the information. As an organisation, the IEEE provides too much information; they have to focus on getting it right and making services relevant and timely.

Fred Dylla, CEO of the American Institute of Physics, outlined the history of the association and the particular circumstances that make its situation unique and challenging: it is an umbrella body of ten physics associations. They have a membership of approximately 165,000 scientists, with 75% based in the US and 25% internationally.

In 2008 they undertook a key strategic re-evaluation, which included divesting the offer of publishing services to members. Post-2008 there was a lot of self-reflection. He described their governance structure as akin to the Senate and House of Representatives, which led to the post-2010 governance review they conducted with the support of BoardSource. This led to them recently establishing the wholly owned subsidiary AIP Publishing LLC.