Wednesday, 21 August 2013

Thad McIlroy on Demystifying XML


Thad McIlroy is a publishing consultant and author, currently writing and blogging at The Future of Publishing. Thad, along with Carol Wolfe from the American Society of Health-System Pharmacists, will be hosting a webinar on Demystifying XML next month. 

Here, in a guest post, he reflects on just what it is that makes XML so pesky yet so important, and how they'll demystify it in the webinar.

"XML was created around 1996; the standard XML 1.0 was first defined in 1998. PDF – the Portable Document Format – was introduced by Adobe in 1993 and became an ISO standard in 2008.

I’m sure that like most publishers (and their teams) you don’t give PDF much thought anymore. You easily convert files into PDF and exchange those files over the web without any fuss. You don’t need to read the 756-page PDF 1.7 specification.

Why can’t XML be so straightforward? 

Because it’s complex, very complex. That complexity makes it extremely powerful. It also makes it hugely difficult for the non-technical to get their heads around how it works and what it enables. I’m not trying to suggest that PDF and XML are equivalent. PDF concerns the look of pages and XML their structure. But they’re part of the same publishing universe.

XMLers will tell you that XML is a simplified version of SGML. 

That’s like saying that xxx is a simplified version of yyy. Gee, thanks. They’ll also tell you that the XML DTD syntax is one of several XML schema languages, but that XML Schemas use an XML-based syntax, whereas DTDs have a unique syntax held over from SGML DTDs. Gee, double thanks.
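To give a flavour of what that jargon means in practice, here is a minimal, purely illustrative sketch (the element names are invented for this example) of one simple rule – ‘a book must contain a title’ – written first in DTD syntax and then in XML Schema, which, as the XMLers say, is itself XML:

  <!-- DTD syntax, inherited from SGML: a book element must contain exactly one title -->
  <!ELEMENT book (title)>
  <!ELEMENT title (#PCDATA)>

  <!-- The same rule expressed in XML Schema, using XML syntax throughout -->
  <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
    <xs:element name="book">
      <xs:complexType>
        <xs:sequence>
          <xs:element name="title" type="xs:string"/>
        </xs:sequence>
      </xs:complexType>
    </xs:element>
  </xs:schema>

A document that satisfies either definition looks the same:

  <book>
    <title>Handbook on Injectable Drugs</title>
  </book>

Real publishing DTDs and schemas run to hundreds of such declarations, which is where both the complexity and the power come from.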

Nonetheless, I’ll tell you that if you’ve somehow managed to postpone adopting an XML workflow, you can avoid the challenge no longer. You’re going to have to crawl inside XML, enough to get your bearings so that you can make strategic choices about how important XML will be to your future publishing workflow.

What you can achieve with XML cannot realistically be duplicated any other way. 

And today, when a handful of output formats has become a multitude, the only way to find publishing nirvana (OK, nirvana might be a stretch) is to add XML structure to your publishing mix.

We’re fortunate that joining the webinar will be Carol Wolfe, VP, Publications and Drug Information at the American Society of Health-System Pharmacists (ASHP). Carol can tell it like it is. She explained to me that her view is strategic, not über-technical, and I thought: perfect.

Because more than anything else we all need to recognize that XML is now core to our publishing strategies.

Carol will describe how her core association publications, like the 3,822-page AHFS Drug Information and the 1,280-page Handbook on Injectable Drugs, were destined to migrate online, and why XML structured coding was essential to facilitate the transformation. She’ll describe how she has managed moving these publications online, alongside online journals and ebooks, while also optimizing them for tablets and smartphones.

We’re not going to head too deep into the XML weeds because that would take all month (you might want to Google “introduction to XML” before the webinar if you’d like to brush up). We will introduce XML in non-technical terms so we can describe, in everyday publishing language, why demystifying XML is worth every ounce of effort for you and your staff."

Demystifying XML: A practical guide and case study will be held online on Wednesday 25 September 2013, 11:00–12:30 ET (USA), 4:00–5:30pm UK, 5:00–6:30pm CEST.

Don't miss it, book now

If you can't make the time, you can register and still receive the recording to view at your leisure.

The webinar is sponsored by ALPSP and Copyright Clearance Center.

Tuesday, 20 August 2013

Turpin Distribution Welcomes Adam Marshall

20 August 2013
PRESS RELEASE - FOR IMMEDIATE RELEASE -

Turpin Distribution is delighted to announce that Adam Marshall will be joining their organisation effective August 2013.

Adam's overall responsibility will be to manage, over the next eighteen months, a major business project. Having worked within the publishing industry with Portland Press for 20 years as Group Head of Marketing and Customer Services, Adam brings with him a wealth of knowledge and broad experience in both books and journals.

This appointment is another step forward for Turpin, signifying their commitment to clients and to the growth of their business. They remain fully committed to their current business model, whilst seeking to broaden the range of services they offer to clients and embracing the digital age.

For further information please contact:

Neil Castle
Operations Director
neil.castle@turpin-distribution.com
T: +44 (0) 1767 604 868

www.turpin-distribution.com

Tuesday, 13 August 2013

Kurt Paulus on ALPSP International Conference: Making change happen

2012 ALPSP Awards winners
This is the fourth and final post in a series of reflections on the 2012 ALPSP International Conference by Kurt Paulus, former Operations Director at the Institute of Physics, and long time supporter of ALPSP. Our thanks go to Kurt for capturing the sessions. If this whets your appetite, book for the 2013 conference now.



Making change happen

References to what publishers might do to ride the crest of the wave of change have been scattered throughout the conference, so there is no lack of ideas. Indeed, as Stephen Pinfield noted in the final panel discussion, there is more experimentation now than ever before with technologies, services and business models, as publishers are overcoming their fears of the uncertainties change brings with it. Anita de Waard put it this way:

  • Experiment all over the place.
  • Support scientists working at the forefront of information handling and networking.
  • Join fora where scientists, publishers and librarians cooperate.
  • Form partnerships and alliances where you cannot manage on your own.

But don’t expect your cherished organizational structures to survive the change process. All publishing roles will have to adapt. Close interaction with your customers and users may be hindered by the still prevalent functional set-up of many organizations. Skills – technological, analytical, social – will need to be reinforced. IT professionals will have to become more business aware and outward facing, and the concept of the IT department may disappear. All this requires complete reorientation of organizations.

“Work on your business as well as in your business.” Arend Welmers

Lost? Help is at hand from company doctor Arend Welmers of Quantum90, who gave the second keynote presentation with some simple messages:
  • A common disease in organizations is functional thinking. Your role in your organization is not your job title.
  • The job you may not be doing is helping your company to progress and make more money. You are hired to make the company successful.
  • All organizations, whether for profit or not, are competing in a marketplace.
  • ‘Employee engagement’ is just consultants’ eyewash. Employees must learn to behave as if they own their company and are individually responsible for making it succeed.

Arend’s recipe is called ‘Open book management’, where everyone is involved in running the company and has access to all company information, especially financial information, presented in such a way that they know which levers they have to pull to improve performance and can see the results. And if they are successful, why not let them share in the success by moving towards shared ownership?

Arend is a challenging and enthusiastic speaker, so much so that he was able to devote less time than he would have liked to his case study, the Springfield Remanufacturing Corporation, which lies at the crux of his message. In discussion it emerged that he had been working with the American Institute of Physics for a couple of years, and that might have been a better case study for this conference. The testimonial from Fred Dylla, CEO of the American Institute of Physics, was complimentary!

Endnote

The winners of the ALPSP awards are listed on the ALPSP website, but it is worth noting that the ALPSP Award for Contribution to Scholarly Publishing 2012 went not to an individual but to an organization, CrossRef, with the following citation (abbreviated):

“The Council of ALPSP was delighted to make this award to CrossRef, the independent not-for-profit organization set up and run by publishers to facilitate the linking between scholarly publications using the Digital Object Identifier. Launched in 2000, this system grew rapidly to hold 3 million DOIs in just a year and now holds metadata for 55.5 million conference proceeding articles, book chapters and journal articles right back to the first articles published in Philosophical Transactions in 1665. With over 4,000 participating publishers, CrossRef’s reach is international and it is very well regarded not just amongst publishers, but also the library community and researchers.”

This happily reinforces one of the conference messages, that unity is strength and collaboration helps overcome weaknesses.



Kurt Paulus, 2012
Booking for the 2013 conference is open.


Monday, 12 August 2013

Kurt Paulus on ALPSP International Conference. Data, data, data!

This is the third in a series of reflections on the 2012 ALPSP International Conference by Kurt Paulus, former Operations Director at the Institute of Physics, and long time supporter of ALPSP. Our thanks go to Kurt for capturing the sessions. If this whets your appetite, book now to secure your place at the 2013 conference.

Data, data, data

Though not an explicit theme of the conference, speaker after speaker emphasized the need for scholarly publishers to be on top of the information they have, or should have, about the behaviours of their users and customers. Mark Ware enjoined us to use extreme analytics to make sense of the customer and user data we have, indispensable for business-to-business companies but equally important for the rest of us. Tom Taylor, talking about global consortia and library markets, suggested that consortia deals continue to be healthy and renewed provided we understand the local factors across the world that influence decisions: investment or retrenchment in education and research, general economic climate, open or closed consortia, political stability or lack of it and, of course, direct customer contact and service.

Matt Rampone in the mobile session urged publishers to invest in analysis of their readers. Forcefully, Charlie Rapple encouraged them to get to know their audiences and to articulate the questions they want answered before doing market research: what are users actually doing, what devices do they use, what are their problems? Evidence-based strategy formulation!

And, as Tracy Gardner of Simon Inger Consulting reminded everyone, there is the fourth scholarly publishing survey, currently in progress, that will be required reading when it comes out.

Open access

The Budapest Open Access Initiative is 10 years old. Pressure from researchers, funding bodies and governments on publishers to open up access to their information has been intense and has even spilled over into the daily press. Although open access journals are not new, progress has been slow as publishers struggle with the new business models required to make sense of them. The green OA model – author deposit in institutional or other repositories after an embargo period – has been most in evidence, with funding agencies increasingly mandating this. The main argument here, then, is over the embargo period, ranging from 6 to 18 months.

This is the model chosen by the World Bank for its publications, a relatively small part of the work of the world’s largest development agency, so the business impact of open access is not too severe. For Springer, one of the largest STM publishers, the situation is quite different. Nevertheless, Springer has been moving gradually towards green open access, as Wim van der Stelt explained, helped by the acquisition of BioMed Central in 2008. Springer Open has 100 journals across all subject areas, has 405 members in 46 countries, includes books, and 15% of Springer articles are now open access. The embargo period is 12 months. Price adjustments are made even for selected hybrid journals depending on their OA content. Prices may go up as well as down and they “do make money”.

“Don’t forget: Open Access is just a business model, so keep up with your traditional services.” Wim van der Stelt

Wim advised doubters to be courageous and provide hybrid options, transfer subscription journals to open access, start sister journals and sponsored journals and launch fully open access new journals; also be creative with financial models for society journals, for example reducing or forgiving charges to members for one or more of their papers.

Gold Open Access – fully open access from publication, with authors paying a publication charge – has been slower to get off the ground, although some early gold OA journals are now quite well established. Three things have helped to accelerate development. Firstly, there has been a gradual increase in the publication charge to a level that now makes business sense. Secondly, some funding agencies have agreed to pay the charge out of research grants. Finally, governments have slowly begun to shift policies, most notably the British government, which accepted the recommendations of the Finch report, supporting gold OA, earlier this year. The European Union – green with a six-month embargo, or gold without agreement on funding – has yet to be as bold, and things are not moving very rapidly in the USA either.

Whatever the outcome, the social sciences will move at a different speed from the physical sciences because of different funding arrangements. The transition will also bring new administrative issues for institutions handling author publication charges as the system migrates, particularly for papers with authors from different institutions.

Freemium: a new business model?

Disruptive as the open access model may appear, it has not yet led to fundamental change in the economics of scholarly publishing. As Mary Waltham pointed out, E-Biomed, which morphed into PubMed Central, was launched in 1999 to fierce opposition from publishers. Yet progress towards full open access has been quite slow and, as Thomas Taylor reminded us, the subscription-based consortia model continues to be strong. At the same time, Cameron Neylon cautioned that, in the long term, there isn’t much money in controlling access to content, but there is a lot of money in tools and platforms to enable access. This is an opportunity for publishers to produce premium services.

How this might work was illustrated by Xavier Cazin of Immatériel, drawing on actual e-book sales experiences in France. Immatériel has found that public domain books often sell at a price, OECD books sell at £11.99 through a bookshop even though they are free from the OECD website, and discount offers bring in extra revenue even though the content value does not change. What customers are willing to pay for is ease of access: easy from where I am, readable on my current device, easy navigation through content. They are also prepared to pay for reading rewards: a rich and well designed environment, content and curation appropriate to my current needs, and sharing functionalities. The two combine to give ‘reading comfort’, which relates directly to the price people are prepared to pay. It follows that there is a job to do for publishers: find out exactly, for your content and your customers, what you need to do to produce products and services that provide this reading comfort.

The free + premium = freemium model combines a large, addressable audience, whose needs we identify through analytics, with free versions that offer clear value and premium offers with added value.


Kurt Paulus, 2012
Book now for the 2013 conference.

Wednesday, 7 August 2013

Kurt Paulus on ALPSP International Conference. Key changes, challenges and opportunities

This is the second in a series of reflections on the 2012 ALPSP International Conference by Kurt Paulus, former Operations Director at the Institute of Physics, and long time supporter of ALPSP. Our thanks go to Kurt for capturing the sessions. If this whets your appetite, book by Friday 9 August to avoid the late booking fee.

Key changes, challenges and opportunities

Change is all around us yet the constants traditionally underpinning scholarly publishing largely remain in place: quality control, trusted brands, citability, even mainstream business models like the big deal. Perhaps reassured by that, we can afford to be less fearful when confronting the many changes that are coming our way. There is pressure from governments to review copyright legislation. Governments and funding agencies want to see much more open access than we have been willing to entertain until recently. Technical changes, not necessarily designed with publishers’ needs in mind, are racing ahead and we are forced to go with the flow.


We sometimes forget that our customers also face change. Librarians confront technical and resource constraints and increasingly have to demonstrate how they add value. Authors and readers suffer severe time constraints and have to change their workflows to be most effective. Both need our help. Change is now a constant yet publishers’ reactions can be coloured by fear of it, causing them to cling to outdated models and overlook the undoubted opportunities that change brings with it. ‘Disruptive technologies’ can have a positive impact.

Mobile

Mobile technologies and social networking techniques are ubiquitous. The growth of consumer-focused mobile technologies such as tablets and smartphones did not initially alert us to the benefits they might bring to scholarly publishing. Researchers are also consumers, however, and increasingly demand access to their information wherever they might be.

“The scholarly article in 140 characters? No!” Plenary 2 title

While publishers may have been slow off the mark, mobile access is increasing and ‘Apps’ are springing up apace. Matt Rampone of HighWire gave an overview: in the UK in August this year, over 13% of total online use was via mobile – smartphones, tablets and also e-readers. iOS is the dominant operating system with Android coming up fast. PDF is the format of choice for readers. Average time spent on a site is about 50s (a little longer for a selection of Apps). Mobile usage has a doubling time of 12 months! Rampone’s advice:

  • Invest in mobile.
  • Invest in analytics (know your readers).
  • Create good experiences.

It is fairly clear that publishing on mobile platforms does not yet require a rebirth of the scholarly article but is essential if publishers are to keep pace with readers’ changing behaviours and does give the opportunity to organize and present content in new ways. We mustn’t forget, Tom Reding of BBC Worldwide reminded us, that we are in the business of insight, not journals. The carrot for embracing change is that mobile users are more willing to pay for additional services than desktop users.

Mark Ware’s presentation, based on a series of case studies carried out for Outsell, confirmed that publishing to mobile platforms is increasingly settling in with STM publishers, starting with the medical and healthcare areas but spreading more widely. The publishers involved included BMJ, Elsevier, Nature, OUP, Wiley-Blackwell and others. Beyond the technical and presentational issues are business ones, such as authentication of a user who wants to access a website from a mobile device from within their institution or while on the move. The solutions for RSC Mobile and Oxford Journals Mobile are slightly different, for example, and no doubt evolving in response to reader reaction.

Discoverability

There is a data deluge in scholarly publishing, suggested Sophia Ananiadou, chairing one session: information overload – but is that the right metaphor? The deluge is a feature of a system where increased research funding leads to increased publishable output and, even with the most rigorous peer review, to a growing mountain of stored information. How do we retrieve the stuff we need to progress research further?

“The problem is not information overload, nor filter failure; it is discovery deficit.” Cameron Neylon following Clay Shirky

The issue is not a new one but the scale is, according to Harry Kaplanian, who took us back to card catalogues and OPACs before moving on to more contemporary techniques of finding and retrieving information. We have since moved on to web search, standard references, author-generated keywords, metadata, A&I services and aggregators to make searches more exhaustive and reliable. A lot is being learned and there is increasing convergence. Publishers need to be prepared to serve a variety of reader types looking for information through a number of different ‘discovery channels’ (Simon Inger), and need to be on top of their user statistics so that they can better serve their readers and also share their user data with librarians to support their case (value added again).

Oxford University Press has its own Director of the Discoverability Programme, Robert Faber, charged with increasing discoverability of all OUP content across subject areas. This includes, among other things:

  • Free content outside the paywall – abstracts, keywords etc
  • Improved MARC programmes
  • Enhanced linking between products and services
  • New mobile sites for Oxford Journals
  • Optimized search engines
  • New approaches to library discovery services
  • Analytic tools for tracking user behaviour.

At the top of the tree is the Oxford Index: a standardized description of every item of content in one place, an evolving Oxford interface, and a way to create links and relationships between content elements.

We are beginning to move towards text and data mining, from a single publisher’s output towards the whole growing corpus of accumulated information, i.e. from looking for a single needle in one haystack to finding a collection of compatible needles in a whole field of haystacks. Anita de Waard, Director of Disruptive Technologies at Elsevier, set the scene by talking about working with biologists on how scientists read, how computers read and how they might come together to discover relevant and reliable information rather than just isolated research papers. She sees an evolving future of research communication where researchers compile data, add metadata, overlay the whole thing with workflow tools, then create papers from this material in Google Docs accessible to editors and reviewers, all in the ‘cloud’. Publishers? – we provide the tools!

John McNaught from the National Centre for Text Mining illustrated some of the techniques for adding more value to discovery: natural language searching, searching for acronyms, checking for language nuances, looking for associations, all designed to peel away layer upon layer of increasing complexity to turn unstructured text into structured content linked to other knowledge.

“The value of a network is proportional to the square of the number of connected members.” Metcalfe’s law

Networking and the semantic web continue to be buzzwords. Information is dispersed in many different places. To make sense of it we need structure and context, a resource description framework that identifies objects and connects them, and the appropriate vocabulary. Knowledge networks – associated communities that work on specific topics, linking to move on to a different level – are the next level up from networks of individual papers and reports, according to Stefan Decker.

So what’s the problem?

All this is fascinating work going on at the moment in academic or similar institutions. So will all our problems be solved soon? Not in a hurry, according to Cameron Neylon of PLoS, unless publishers change their ways. The majority of publishers behave as broadcasters of information and are still not thinking of networks of people and tools. The tools publishers have provided are not adequate, and often licences prohibit text mining of material to which the reader already has access. This is an opportunity for publishers to produce premium services. The hardware tools – mobile devices – are already available and are powerful.

“Publishers are too focused on controlling access.” Cameron Neylon

The bottom line for all these speakers is open access – that is, having metadata and full text freely accessible with licensing arrangements that permit text mining – if the dream of improved discoverability for researchers is to be achieved. If there is a straw in the wind pointing to how publishers’ policies may develop, it is the recent agreement by ALPSP, STM and the Pharma Documentation Ring (PDR) on a new clause for the PDR Model Licence:

“Text and Data Mining (TDM): download, extract and index information from the Publisher’s Content to which the Subscriber has access under this Subscription Agreement. Where required, mount, load and integrate the results on a server used for the Subscriber’s text mining system and evaluate and interpret the TDM Output for access and use by Authorized Users. The Subscriber shall ensure compliance with Publisher's Usage policies, including security and technical access requirements. Text and data mining may be undertaken on either locally loaded Publisher Content or as mutually agreed.”

Similar sentiments, though in the context of mobile delivery, were expressed by Charlie Rapple later in the conference. Quite used to trying to shake audiences out of complacency, Charlie claimed that our users are not happy with us: we have not evolved our products and services in line with how they have evolved. New ways of creating, evaluating, curating and distributing information are all around us. We need to win our users back or we will go out of business: if we don’t, someone else will take over!

“Our audience and their needs should direct our strategy.” Charlie Rapple

Critically, this means starting with the audience rather than with content or the devices on which content is delivered. Find out what users need; look into and understand their workflows; deconstruct our content and integrate it with these workflows, making it interactive, relevant and ‘friction-free’.

Kurt Paulus, 2012
Book by Friday 9 August to avoid the late booking fee.

Turpin Distribution – Open Access Services

6 August 2013
PRESS RELEASE - FOR IMMEDIATE RELEASE -

Turpin Distribution is delighted to announce the introduction of its new Open Access service for managing Article Processing Charges (APCs).

A major vendor in the journal fulfilment market for over 40 years, Turpin has supported its scholarly publishing clients with new services as the journal market has evolved, embracing new technologies.

The Turpin system can manage all the complex administration surrounding processing charges and payment collection, including: single authors; multiple authors with allocations; third-party advanced payment reconciliation; and copyright status sharing – to name just some of the major servicing capabilities. All transactional information is reported back to the publisher via the Turpin Web Reports Portal, providing up-to-date information 24/7.

Lorna Summers, Managing Director of Turpin, said: “I’m extremely pleased that we are able to continue supporting our client publishers with new services to meet the changing demands of the market. We are already working closely with our clients to provide solutions for the efficient management of Article Processing Charges (APCs). Turpin looks forward to assisting publishers in controlling their OA administrative burdens, such as billing and reporting, so clients can continue to focus on the creation and dissemination of intellectual property.”

About Turpin Distribution
Turpin Distribution is an international fulfilment and distribution company providing services to the academic, scholarly and professional publishing industry. It provides custom solutions for print and digital publishers handling both books and journals worldwide.

For further information contact:
Neil Castle
Operations Director
neil.castle@turpin-distribution.com
T: +44 (0) 1767 604 868
www.turpin-distribution.com

Friday, 2 August 2013

Kurt Paulus on ALPSP International Conference. Isn't change fun?

This is the first in a series of four reflections on the 2012 ALPSP International Conference by Kurt Paulus, now retired, formerly Operations Director at the Institute of Physics, and long time supporter of ALPSP. Our thanks go to Kurt for capturing the sessions. If this whets your appetite for the 2013 conference, book by Friday 9 August to avoid the late booking fee.

“Isn’t change fun?” Toby Green 

Toby Green, the ALPSP Chair, summarized the three key themes of the conference as “New, Change and Open”. Another formulation might be “Change and how to cope”. In uncertain times it is helpful to have some experienced guides, and Mary Waltham and Mark Ware were volunteered for these roles.

After a wide-ranging review of scholarly publishing since ALPSP was formed 40 years ago, Mary pointed to some emerging factors which will strongly influence the next 40, or even the next 10:
  • The geographical distribution of scholarly research and publishing is changing rapidly: in chemistry, China’s output has already overtaken the USA’s, and other ‘developing’ countries are snapping at their heels.
  • Technology has enabled the big deal and allowed the deployment of metrics that have assisted both publishers and customers to make better-informed decisions and have facilitated access to information from developing countries. Technical solutions to remaining problems are never far away.
  • “Publishers behaving badly”: publishers have been their own worst enemies in ploughing their own furrow, e.g. over journal pricing, and not taking care to keep their library customers on board by communication and engagement and by not helping them to make a better funding case.
  • There are some underlying factors that have not changed: the continuing need for validation through peer review, the desire to find information quickly and accurately, the specific needs of authors for help with their professional work and for peer recognition, often provided by scholarly publishing. In Fred Dylla’s words: authors will still give their firstborn to get their papers into Nature or Science.

“By 2020, all services will be digital, mobile, customizable, intelligent, interoperable with multiple revenue streams.” Mark Ware


Mark Ware has engineered many surveys of the developing publishing scene for employers and clients over the years and shared his views of some of the economic and technological factors that will drive scholarly publishing in the coming years:
  • We are told that 25% of waking hours are spent on mobile devices: think about it!
  • Understanding your users through ‘extreme analytics’ is crucial.
  • Mergers and acquisitions are becoming smaller and more focused, possibly threatening smaller niche publishers.
  • Aggregation, e.g. of different journals or of books and journals, leads to more value-added products but also a need for more user-aware curation.
  • Sales growth will be based on value added rather than just the number of articles in the database.
  • An in-depth understanding of user workflows will be critical to the ability to add value.
  • Change and competition will force us to move from product-centric to service-focused behaviour.
  • Open (access, data, standards, platforms) will be the watchword.


Kurt Paulus, 2012
Booking for the 2013 conference is open.