Thursday 27 September 2012

David Sommer: COUNTER - New Measures for Scholarly Publishing

David Sommer is a consultant working with a range of publishing clients to grow products and businesses. He has also been a contributor to the COUNTER project and rounded off the morning session at the To Measure or Not to Measure - Driving Usage seminar.

He provided an overview of the latest COUNTER Release 4. The main objectives for the update were to provide a single, unified Code covering all e-resources, including journals, databases, books, reference works, multimedia content, etc. They wanted to improve the database reports and the reporting of archive usage. The update will enable the reporting of mobile usage separately, expand the categories of 'Access Denied' covered, improve the application of XML and SUSHI in the design of the usage reports, and collect metadata that facilitates the linking of usage statistics to other datasets such as subscription information.

The main features of the update are:
  • a single, integrated Code of Practice covering all e-resources
  • an expanded list of definitions, including Gold OA, Multimedia Full Content Unit, Record View, etc.
  • an improved database report that includes reporting of result clicks and record views in addition to searches (sessions removed)
  • enhancement of the SUSHI (Standardized Usage Statistics Harvesting Initiative) protocol, designed to facilitate its implementation by vendors and its use by librarians
  • a requirement that Institutional Identifiers, Journal DOI and Book DOI be included in the usage reports, to facilitate not only the management of usage data, but also the linking of usage data to other data relevant to collections of online content
  • a requirement that usage of Gold OA articles within journals be reported separately in a new report - Journal Report 1 GOA
  • a requirement that Journal Report 5 be provided (an archive usage report broken down by year, so customers can relate what they are paying to how the journal's archive is used)
  • modified Database Reports in which the previous requirement to report Session counts has been dropped, and new requirements to report Record Views and Result Clicks have been added (Database Report 3 has also been renamed Platform Report 1)
  • a new, optional Multimedia Report 1, which covers the usage of non-textual multimedia resources (audio, video and images), reporting the number of successful requests for full multimedia content units
  • new optional reports covering usage on mobile devices
  • description of the relative advantages of logfiles and page tags as the basis for counting online usage
  • flexibility in the usage reporting period that allows customers to specify a date range for their usage reports
Sommer posed an interesting question: what is a mobile device? They have used the WURFL list to define this. The timetable for implementation includes a deadline date of 31st December.
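
To make the shape of such reports concrete, here is a minimal sketch (not from the talk) of how a library might total full-text requests per journal from a COUNTER Journal Report 1 style CSV. The file name and column layout are assumptions for illustration; real JR1 files begin with a standard header block and carry fixed title columns followed by monthly counts.

```python
import csv
from collections import defaultdict

def total_fulltext_requests(jr1_csv_path, header_rows=7):
    """Sum monthly full-text requests per journal from a JR1-style CSV.

    Assumed layout (for illustration only): the first `header_rows` lines are
    the COUNTER header block, the next line holds column names, and each data
    row is Journal, Publisher, Platform, Journal DOI, Proprietary ID,
    Print ISSN, Online ISSN, three reporting-period totals, then one column
    per month of the reporting period.
    """
    totals = defaultdict(int)
    with open(jr1_csv_path, newline="", encoding="utf-8") as fh:
        rows = list(csv.reader(fh))
    column_names = rows[header_rows]
    month_columns = range(10, len(column_names))  # monthly counts follow the fixed columns
    for row in rows[header_rows + 1:]:
        if not row or not row[0].strip():
            continue
        journal, doi = row[0], row[3]
        totals[(journal, doi)] += sum(int(row[i] or 0) for i in month_columns if i < len(row))
    return dict(totals)

if __name__ == "__main__":
    # "JR1_2012.csv" is a hypothetical file name
    ranked = sorted(total_fulltext_requests("JR1_2012.csv").items(),
                    key=lambda kv: kv[1], reverse=True)
    for (journal, doi), count in ranked[:10]:
        print(f"{count:>8}  {journal}  ({doi})")
```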

Sommer then provided a useful background to Usage Factor (UF). It is designed to be a complement to citation-based measures. While Impact Factors, based on citation data, have become generally accepted as a valid measure of the impact and status of scholarly journals and are widely used by publishers, authors, funding agencies and librarians, there are misgivings about an over-reliance on them. The idea is not to try to kill them off, but to provide other measures to use alongside them, particularly as Impact Factors don't work so well for non-STM disciplines.

Usage Factor provides a new perspective: a complementary measure that will compensate for the weaknesses of Impact Factors in several important ways:
  • UFs will be available for a much larger number of journals
  • coverage of all fields of scholarship that have online journals
  • impact of practitioner-oriented journals better reflected in usage
  • authors welcoming it as a way to build their profile.
Four major groups will benefit: authors (especially in practitioner-based fields) without reliable global measures; publishers; librarians; and research funding agencies seeking a wider range of credible, consistent quantitative measures of the value and impact of the research output that they fund.

The aims and objectives of the project have been to assess whether UF will be statistically meaningful, will be accepted, and is robust and credible, and to identify what the organisational and economic model will be. They started in 2007-2008 with market research including 29 face-to-face interviews from across interest groups as well as 155 librarian and 1,400 author web survey responses.

Stage two focused on modelling and analysis and involved relevant bodies, publishers and journals. The recommendations included:
  • UF should be calculated using the median rather than the arithmetic mean
  • range of UF should ideally be published for each journal: comprehensive UF plus supplementary factors for selected items
  • UF should be published as integers - no decimal places
  • UF should be published with appropriate confidence levels around the average to guide their interpretation
  • UF should be calculated initially on the basis of a maximum usage time window of 24 months
  • UF is not directly comparable across subject groups and should therefore be published and interpreted only within appropriate subject groupings
  • UF should be calculated using a publication window of two years.
There seems to be no reason why ranked lists of journals by usage factor should not gain acceptance. However, small journals and titles with fewer than 100 downloads per item are unsuitable candidates for UF as they are likely to be unreliable.
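
As a rough, unofficial illustration of these recommendations, the sketch below computes a usage factor as the median downloads per item, counting only usage within a 24-month window from online publication, publishing the result as an integer, and withholding it where downloads per item are too low to be reliable. The data structure, threshold and toy numbers are assumptions, not the project's specification.

```python
from datetime import date
from statistics import median

def usage_factor(items, usage_window_months=24, min_median_downloads=100):
    """Illustrative, unofficial usage-factor sketch.

    `items` are the items published within the chosen publication window;
    each is a dict with 'published' (a date) and 'downloads', a list of
    (date, count) events. Returns the median downloads per item within the
    usage window, rounded to an integer, or None when usage is too low to
    report reliably.
    """
    per_item = []
    for item in items:
        start = item["published"]
        months = lambda d: (d.year - start.year) * 12 + (d.month - start.month)
        # count only usage inside the maximum usage time window
        per_item.append(sum(c for d, c in item["downloads"]
                            if 0 <= months(d) < usage_window_months))
    if not per_item:
        return None
    uf = median(per_item)
    if uf < min_median_downloads:   # small / lightly used titles: do not publish a UF
        return None
    return round(uf)                # published as an integer, no decimal places

# Toy data (hypothetical numbers):
items = [
    {"published": date(2011, 3, 1),
     "downloads": [(date(2011, 4, 1), 60), (date(2012, 1, 1), 80)]},
    {"published": date(2011, 9, 1),
     "downloads": [(date(2011, 10, 1), 150), (date(2013, 11, 1), 999)]},  # outside window, ignored
]
print(usage_factor(items))  # -> 145 (median of 140 and 150)
```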

Stage three involves testing. The usage data will be used to investigate the following:
  • effect of using online publication data versus date of first successful request on UF
  • calculation and testing UF for subject fields not covered
  • test further gaming scenarios and assess how these can be detected
  • test stability of UF for low UF journals and confirm level below which it shouldn't be provided.
This will deliver a Code of Practice which will include definitions, the methodology for calculation, specifications for reporting and independent auditing, as well as a description of the role of the Central Registry for UF and the funding model.

David closed with a summary of PIRUS2, whose mission is to develop a global standard to enable the recording, reporting and consolidation of online usage statistics for individual journal articles hosted by institutional repositories. Further information is available online.

Vanja Merrild: Marketing Channels for the World's Largest Open Access Publisher

The second session at our To Measure or Not to Measure seminar was presented by Vanja Merrild, a digital marketing specialist working with BioMed Central. Now part of Springer, BioMed Central has 243 open access journals across biology, medicine and chemistry; 52 journals are society affiliated and 121 have Impact Factors.

Their network of sites and users has:
  • 32 million page views a month
  • over 5 million registered users
  • over 380,000 recipients of the fortnightly BioMed Central newsletter
  • 14,000 new registrants a month
  • Google page rank of 8
As an open access publisher their focus is on submissions: they are author driven, use their own in-house submission system and concentrate on author data.

Vanja focused on providing best practice advice on how to drive traffic and usage to content from their experience. Her suggestions for email best practice are:

  • build your lists
  • have a good welcome programme - what follows after they've signed up
  • make it easy for recipients to add your email to their address book
  • make it easy to sign up
  • maintain consistency in 'From' lines - builds recognition of a trusted source.

She spoke candidly of the action they took recently to boost results for a newsletter whose performance was dipping:

  1. they made it easier to access the interesting, relevant content (three rather than five clicks through to the website)
  2. noted they had lots of forwards so added a recommend button
  3. captured the forward emails
  4. made it easier to sign up on website and from within the email

Segmentation is hugely important: the same message is not relevant for everyone. Editors, authors, members, librarians, scientific interests: they all need their own message. You should test and measure your hypotheses, check spam filters, and look at how emails appear on mobile devices and in different email clients (particularly the ones your audience is using).

Vanja suggested a range of factors to test and measure. Keep track of your reputation with services such as Return Path's senderscore.org. Understand what works where. Check out your 'sleeping beauties' - contacts who need waking up - and find out at what point they leave your email.

She then went on to provide some best practice guidelines for tweeting and Facebook posts:

  • share more about others than you do about yourself
  • promote others and build relationships
  • have a distinct voice
  • images are important.

Your social media strategy should focus on increased ROI for your business and your time. Create safe tests to experiment and don't make this only one person's role. Be iterative: plan, execute, measure, adjust, repeat. Understand which metrics matter and which are your goals. Track metrics before, during, and after to show return on investment and consider benchmarks to better understand what they actually mean. Use analytics not only as a reactive tool to see how you did, but as a proactive tool to hone your branding messages. She closed with suggested tools to measure your activity, including Twitter Counter, TweetReach and TweetStats.

Breda Corish: From 20 Million Pieces of Content to a New Clinical Insight Engine: ClinicalKey

The first session at To Measure or Not to Measure: Driving Usage to Content - Marketing, Measurement and Metrics seminar was presented by Breda Corish, Head of Clinical Reference for UK and Northern Europe at Elsevier.

The focus was on publisher products that drive users to content. ClinicalKey is Elsevier's 'clinical insight engine', designed to think like a physician and provide information for diagnosis at the point of care, the ability to share relevant answers and a resource with which to maintain knowledge.


It took three years of development work to build a product platform to answer questions posed within a clinical care context. The scale of information overload is immense: back in the early 90s, the challenge for doctors was that medical knowledge doubled every 19 years. In 2008 this was down to 18 months. Now the forecast is that by 2020, it will double every 73 days.


This creates the doctor's dilemma: how to access trusted information quickly with a seamless experience. The challenge for Elsevier was how to unite 20 million pieces of content into one seamless experience, including existing products such as MD Consult, Journals Consult, Procedures Consult, etc.

They started by understanding users. The 'mechanics' are driven by visual, procedural content. Doctors are extremely time-pressed and require pre- and post-procedural care resources, with well-defined, fairly narrow, but deep information requirements.

They focused on understanding the patient care management workflow: from diagnosis to creating a care plan, from medical treatment to after-treatment care plans and patient education and compliance. They also identified collateral workflows for doctors on keeping current, sharing information and not working in isolation, but as part of a multi-disciplinary team.

Using this knowledge they moved from unstructured to structured content, turned it into smart content and made it work in the clinical setting. They created the Elsevier Merged Medical Taxonomy (EMMeT) using 250,000 core clinical concepts, 1 million+ synonyms, 1 million+ hierarchical relationships and 1 million+ ontological relationships.

Through concept mapping they focused on making vast amounts of content easily discoverable using speciality-specific navigation, dynamic clinical summary creation and meaningful related content recommendations. The semantic taxonomy was adapted for the clinical setting and semantic relationships are used to suggest other content (e.g. clinical condition, procedures, etc). Weighted tags are their 'secret sauce for better search'.
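
The talk stayed at the conceptual level, but a toy sketch may help make the idea of weighted concept tags and synonym mapping concrete: documents carry taxonomy concepts with weights, query terms are normalised to preferred concepts, and results are ranked by the summed weights of matching concepts. Every name, concept and weight below is invented for illustration; EMMeT itself is far richer.

```python
# Toy illustration of weighted concept tags driving search ranking.
# The taxonomy, synonyms and weights are invented; a real system such as
# the EMMeT-based one described above would be far richer.

SYNONYMS = {
    "heart attack": "myocardial infarction",
    "mi": "myocardial infarction",
    "high blood pressure": "hypertension",
}

DOCUMENTS = {
    "doc-1": {"myocardial infarction": 0.9, "chest pain": 0.4},
    "doc-2": {"hypertension": 0.8, "myocardial infarction": 0.2},
    "doc-3": {"diabetes mellitus": 0.7},
}

def normalise(term: str) -> str:
    """Map a query term onto its preferred taxonomy concept."""
    term = term.lower().strip()
    return SYNONYMS.get(term, term)

def rank(query_terms):
    """Score each document by the summed weights of the concepts it shares with the query."""
    concepts = {normalise(t) for t in query_terms}
    scores = {
        doc_id: sum(weight for concept, weight in tags.items() if concept in concepts)
        for doc_id, tags in DOCUMENTS.items()
    }
    return sorted(((s, d) for d, s in scores.items() if s > 0), reverse=True)

print(rank(["heart attack"]))   # doc-1 outranks doc-2 because its tag weight is higher
```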

The smart content infrastructure is based on four areas:

Product development and enhancement

  • more accurate search results
  • faceted navigation
  • improved content discoverability

Content analytics

  • greater insights into what we publish
  • identification of co-occurring terms
  • link to related external content and data

Personalization

  • individual content recommendations
  • targeted individual marketing
  • contextual advertising

Editorial productivity

  • flexible product types - new collections, image banks, etc.
  • increased speed to market

Usage tracking is based on usage events rather than page views. They have COUNTER-compliant content reports and monthly institution reports based on COUNTER filtering rules. They produce usage reports for different content types e.g. books, full-text articles, FirstConsult, Medline, Guidelines, etc. Every piece of content is tagged so they can produce usage reports. Usage event reports by month include analysis of: discovery (search, browse); content usage; and Advanced Tools usage.

Among their performance metrics, they want to keep searches per content view low, as this is key to delivering relevant content quickly. They take insight from the usage reports on what search terms people are using and feed them back into the product.
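
As a hedged sketch of the kind of calculation implied here (not Elsevier's actual schema), searches per content view can be derived straightforwardly from a stream of usage events:

```python
from collections import defaultdict

def searches_per_content_view(events):
    """Compute searches-per-content-view by month from a list of usage events.

    Each event is assumed (for illustration) to be a dict like
    {"month": "2012-08", "type": "search"}, where type might also be
    "content_view", "browse", "advanced_tool", and so on.
    A lower ratio suggests users are reaching relevant content with fewer searches.
    """
    searches = defaultdict(int)
    views = defaultdict(int)
    for event in events:
        if event["type"] == "search":
            searches[event["month"]] += 1
        elif event["type"] == "content_view":
            views[event["month"]] += 1
    return {
        month: (searches[month] / views[month]) if views[month] else None
        for month in sorted(set(searches) | set(views))
    }

# Hypothetical month of events:
sample = [{"month": "2012-08", "type": "search"}] * 130 + \
         [{"month": "2012-08", "type": "content_view"}] * 400
print(searches_per_content_view(sample))   # {'2012-08': 0.325}
```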

A recent Outsell report identified that the 'development of such taxonomies and their use to power the semantic enrichment of collections and aggregations of content will increasingly need to become core competencies for publishers further along the digital transition to higher value-added services.' This is something that Elsevier have engaged with directly.


Their plans for the future involve adapting for international markets using the same content and powerful functionality, but adding in geography-specific publication content. They are also looking at developing different interfaces for local languages. Even in cash-strapped health care systems around the world, there is still investment in IT. There is potential for mobile devices and tablets to be used in hospitals. Doctors need information when they are on the move and not desk-bound.

They want to do more with content, more with features and functionality for doctors and end users, and develop the product further for use any time, any place, anywhere. They are looking at how they can integrate with clinical or patient record portals or systems so it's not disruptive to the experience, e.g. developing query buttons to ping off to the other database. Something like 80% of hospitals in the UK are still in the early stages of developing these services so there's great potential.

They are currently selling to institutions, but have recently launched a service for individuals which focuses on their particular clinical specialism, with the option to add on. Overall, their aim is to hit the 'sweet spot' of being comprehensive, trusted and quick to answer.

Thursday 20 September 2012

ALPSP Conference Day 2: The scholarly article in 140 characters? Are you a denial-o-saur?

Leon Heward-Mills introduces the session
Digital technology, particularly mobile, is changing the way we access and read information - perhaps even in the way we use different parts of our brains - for problem solving rather than 'deep-reading'. The Scholarly Article in 140 Characters session explored implications of this for consumers, publishers and the research community.

Matt Rampone from HighWire Press kicked off with a summary of mobile trends, data and the impact on STM publishing from their own user data. The information comes from 12 months of data from a wide range of publishers they work with and provides a snapshot of an aggregate study.

Mobile usage as a percentage of total online usage was 10.58% in the US and 13.16% in the UK as of August 2012, based on smartphones, tablets and eReaders, with the iPad taking 85% of tablet market share. Site use by page views can be broken down as:
  • 82% content
  • 9% home page
  • 4% search
  • 3% current issue
  • 2% ahead of print.
Visits by operating system are split 69% iOS, 30% Android and 1% BlackBerry. Mobile website usage trends are going up. People are finding the sites and coming back. Where smaller publishers - who publish once a month - have mobile apps, they found that users were coming back, showing real engagement with a mobile platform. The median time for mobile website browsing is 90 seconds.

People spend more time reading with a tablet. With a smartphone they spend less time reading, perhaps because it is not as comfortable a reading environment? PDF usage has lately started to trend downwards and - unsurprisingly - it is non-existent on the smartphone.

Matt's key recommendations:
  1. invest in mobile (mobile optimised websites first)
  2. invest in analytics
  3. focus on creating good experiences: more tools and better end experience.

Charlie Rapple from TBI Communications - who wins the prize for best slide of the conference - provided a wake-up call for those who won't rethink their content strategy in light of new digital technologies. Content still tends to be long form, text based and not very dynamic. As an industry we've got into the mindset of “if it ain’t broke, don’t fix it”.

Slide of the conference: denial-o-saurs
But are users happy? What they want is changing, but we’re not really recognising and responding to this. One reason is that we haven’t evolved our products or services to the extent that user needs have evolved. The risk is that we will lose market share to disruptors. Does that mean that as an industry we are denial-o-saurs?

We need to consider how long we can win the battle of mind share. We need to win our users back and get better at meeting user needs by reconnecting, re-engaging, and rebuilding relationships with readers and stakeholders. The transition to online wasn't a revolution, but mobile and social may well be. So let's go back to basic principles - the audience and their need - to direct strategy and rethink core value proposition. We need to make better use of mobile technology.

Howard Rheingold in his book Smart Mobs claimed that mobile technology is “not just a way to do old things while moving... a way to do things that couldn’t be done before.”

We should think about what we can gain from deconstructing the article in terms of information and workflow. There is potential to meet reader/customer needs in the right place, at the right time with interactive, relevant and friction-free content. Charlie has a simple equation to work to:

right information + right place + right time = value

What she wants to see - and is starting to see - is a less passive user experience, where users have a better sense of how content contributes to their work and therefore its value. Mobile is really contributing to this. For publishers, the key is to get close enough to users to understand and get to know audiences. It doesn't have to be difficult or expensive: you can do desk research and analyse data - you don't need a huge budget. Having a crack in-house is better than not doing anything. An initial bit of work will help you really articulate the questions you need answers to.

You can combine general information from desk research with more focused research you do yourself. Take the general information (e.g. smartphone take-up) together with your own data (user analysis of device type) and also ask really specific questions (e.g. problems that preclude smartphone use: security, confidentiality, connectivity?). Then look at what is actually happening with your site and your users to compare perception with actual behaviour. Consider user observation: it is useful for probing and exploring the initial findings of your survey in more detail.

Citing an example from one of TBI's recent clients, one interesting finding from this approach was that much use of professional information happens outside work, while commuting or at home. This had implications for the structure and presentation of content, with unexpected insights and inspiration. For example, expect the previously unexpected: medics can now use mobiles in the air in the US. This resulted in a medical app being used to save a life on a plane.

Nick Bilton in his latest book I Live in the Future & Here’s How it Works suggests we need to make time and space to make innovation happen. Another contemporary theorist, Steven Johnson talks about the “collision of slow hunches”. It's about focusing your efforts, setting and prioritising objectives, as well as embracing and planning for change. Move away from thinking about content to thinking about customers and it will have an impact on every department.

Rapple's vision comprises a roadmap:
  1. audience research and segmentation
  2. brainstorm problems and solutions
  3. prioritise audiences and objectives
  4. prepare staff and processes for change.
Despite having to follow the awesome denial-o-saurs, Tom Reding from BBC Worldwide charmed and informed the audience in equal measure with multimedia highlights from key BBC brands (Top Gear and Doctor Who, anyone?) as well as some fabulous merchandise giveaways. His was a view from outside the scholarly publishing industry, one that could provide fresh insight or some new ideas.

BBC Worldwide drives revenue to go back into the creative sector and BBC.com has over 50 million users a month. Digital is important because it’s about the money. New revenue streams are driven through new digital channels, new production and cost savings. Digital also opens up new markets. It gives them a closer relationship with the audience and engages fans across platforms. But one of the biggest reasons they focus on digital is to stay relevant.

Their teenage audience watch their content exclusively online so they need to unlock that content and seek new value. Areas to consider are:
  • the business of free
  • getting closer to your users
  • new partnerships
  • gamification and serious games (new trend)
In one of the more interesting comparisons of the day, Reding observed that the cross-platform, chunked-up-and-down content for Top Gear isn't too dissimilar to what you can do with an article. He advised us to step away from our content. We’re in the business of insight NOT journals. Remember that mobile users are more willing to pay for services than desktop users as they value convenience and immediacy.

A move to digital means more than re-versioning. Sweat your assets, distribute as far as you can, but get to the core of your offer which is insight. Explore new media platforms. Think about the business of free: with marginal cost of production and market factors pushing to ‘free’ can you adopt the free + premium = freemium model? If you scrutinise your sales, you’ll likely find you make your money from 10-20% of your audience. For BBC Worldwide, the rest get it free. It's a classic Pareto Principle model. So consider if freemium will work for you.

He suggested that with a large addressable audience (250k knowledge workers) and a free version with clear value (the abstract), a premium offer with added value using user analytics (build or partner) and an optimised conversion funnel (build or partner) can demonstrate your confidence in your product.

If 13% of paying users are providing 51% of revenue, create products that cater to this audience and allow them to spend their money. Closing insights included:

  • dig deep beneath the surface of Google Analytics as it is a free and powerful way to inform your marketing budget
  • measure results and iterate
  • try things out e.g. work with companies such as Semantico, Mendeley etc
  • content still remains king - make the best content and you will get the best users
  • look at gamification: take a gaming approach and apply it to non-game situations e.g. FoldIt - Solve Puzzles for Science.

Reding closed with a charming illustration of how gamification and playfulness can inspire mission-type behaviour: the Bottle Bank Arcade from The Fun Theory. It's pretty inspiring stuff.

Tuesday 18 September 2012

ALPSP Conference Day 2: Apportunity Knocks - advice from Semantico's Rob Virkar-Yates

Rob Virkar-Yates shares Semantico wisdom
Rob Virkar-Yates provided advice on mobile digital strategy from an industry supplier perspective for delegates at last week's ALPSP conference.

He questioned whether apps are the VHS or Betamax of the academic world, asked whether apps represent an opportunity for the scholarly market, and whether you can make any money out of them. He presented Semantico statistics to illustrate some of the issues to consider:
  • 68% of 15-24 year olds (UK) use the internet on mobile phones
  • 11% of UK households have a tablet
  • Younger audience are heavy users of mobile
  • Only 5% of sessions served in a 12-month period for some big publishers have been from the mobile site
  • 50% of people have downloaded an application and actually used it.
The type of apps that tend to be most successful are those that deliver micro-experiences. These are experiences that do not overwhelm or perplex the customer; they are relevant, small and beautifully formed. He cited good micro-experiences as the Tools of Change app, his bank's fast balance app and Spotify (amongst others): they are focused, they do one thing, but they do it well. With the arrival of the iPad comes the macro-experience, which adds significant value and richness and provides experiences that go beyond the text. A good macro-experience is The Waste Land app.

He went on to define the recipe for a good micro-experience as:
  1. take on a sharpened proposition - do one thing really well, fight one battle
  2. reduce content aggressively - content should be ‘glanceable’
  3. think small - architect from the ground up
  4. push back - resist pressure to add more
  5. be agile - not in the software sense: move fast, iterate.
Consider content that is location specific or for quick reference. Think about discovery and bookmarking, cost and value. Understand your options and don't forget the cost of reach through an app or via a mobile website. If you already have a mobile site, what can you do? Make an ‘app’ with a responsive design solution: a web app! This can serve a single source of content and be laid out so as to be easy to read and navigate, with a minimum of resizing, panning and scrolling, on any device. When you get it right, you get a really nice, simple app-like site, much like a micro-experience app. While this can be seen as a defensive move, the pros are that it is low risk and high reach, has comprehensive browser support, is future proof and has a single code base. The cons: it's not a specific content set, it front-loads the process, is network dependent and isn't an ‘app’!

Is there an opportunity? Probably. But micro- or macro-experiences will depend on the nature of your content. Can you make money? The low cost and potential reach of web apps might be the best option for the majority.

Monday 17 September 2012

ALPSP Conference Day 2: Apportunity Knocks - Will Russell with the RSC Publishing view

Will Russell showcases ChemGoggles

In Wednesday’s parallel session Apportunity Knocks, Will Russell, Manager of Innovation and Technical Development at RSC Publishing, provided an overview of their approach to developing mobile apps for their members. They started their mobile timeline in 2010 with Chemistry World (mark 1), following in the same year with the Publishing Platform and ChemSpider mobile sites. In 2011 they launched the RSCMobile, ChemSpider and Chemistry World (mark 2) mobile apps.

They believe the future is multi-device so their aim is to be device independent. The mobile market is large and still growing with app usage time exceeding mobile website usage. In this landscape, browser-less discovery is key.

They aim to keep customer needs at the heart of what they do.
Chemistry World now has a full mobile app to respond to reader demand. They have optimised content for smaller handsets, but it is more like reading the magazine on a tablet. ChemSpider had a mobile-optimised website in 2010 but has now moved to the ChemSpider Mobile App. Why? Because of the mobility of scientists: whether in a lab or at a seminar, you can put content in their hands. They have received great anecdotal feedback at conferences: people speaking positively about ChemSpider because of the app.

They keep the strategy and products under constant review. Developments to consider include APIs that will generate the most useful apps with your/our content, and the use of information available free of charge, such as UK Government data, to create apps.

Russell reflected on some of the challenges they have faced:

  • authentication
  • number of new devices and variety of functionality set on each one
  • testing on devices (availability)
  • supporting devices and staff training
  • app store approval
  • on-going future maintenance - more technologies than just a hosted website.

He concluded with some advice. Make your approach to apps part of your digital strategy - not separate. Think about what an app offers over a mobile website. Release on one platform first and limit the initial approach on delivery: wait for customer information to help you prioritise what is being asked for and what works. Keep a close eye on KPIs, ROI and analytics. Understand that separate mobile websites may not be scalable. Think about how many devices and screen sizes you are going to need to support in the future. Other issues to consider include:

  • for usage: it’s not as simple as reproducing the website 
  • how can you support others to build apps?
  • impact of support for app development
  • traditional channels still serve the highest usage
  • we will never have such a long period with the same interface again - in two years we could be talking glasses in the mainstream.

ALPSP Conference Day 2: Changing the game - Arend Welmers

Arend Welmers challenges the way organisations are run

Day two of last week’s ALPSP conference saw a new departure for the programme with a session focusing on how to radically improve the way your organization operates. Speaker Arend Welmers specializes in helping organizations execute their strategy.

He provided an outline of the Open Book Management and Quantum90 approach to use the power of people to rapidly transform organization performance.  Here are the highlights from his talk – essential for anyone considering organizational change or transformation.

Most people, when asked what their role is in an organization, will probably describe it in terms of their function or their job description. This is what is called functional thinking and is a disease in most organizations. So marketing = campaigns; finance = count money; technical = build wonderful technical platforms. In fact, this is only partly right.

There’s a second job most people are also – or should be – doing: a role in helping the organization to make more money, to grow and be more profitable.

Whether you are not for profit or for profit or mission driven, you are competing in a market place. There are people out there trying to come up with better ways of doing what you do. You are responsible for the fees your organization generates. You can’t just participate; you need to compete in your market. And don’t be under any illusions: there is a market for the area you are working in.

Everyone in an organization has two roles: functional expertise and the competitive inclination of the whole team. This is about participation in the business of business to make the company more successful.

Most organizations start with somebody who has a great idea. They bring people together and provide the drive to work on a passion. This excitement, the energy, focus and purpose is what helps the company start to perform.

As an organization starts to be successful, the founders that have the passion need to recruit specialists to help them develop. But the specialists are not about the company; they are focused on and passionate about what they do, e.g. setting up a great appraisal system. They take a functional approach.

How often do founders sit down with new recruits to tell them why they should join and help the organization make money and be successful? It doesn't happen. You need to get everyone in the organization thinking along these lines from the moment they start. If not, the growth curve will flatten out.

Many businesses are on the point of going downhill. What happens at the bottom? There are drives for employee engagement: and while the intention is right, the results tend to be poor. Nobody will take accountability where they don’t own what they are acting on.

Do you want all employees acting, thinking and feeling like they own the business? Isn't that the sort of company you’d invest in? Why don’t all businesses run like that?

Often, people come into an organization and want to do a good job, but can’t help change or influence because they are not allowed to: management don’t know how to tap into the innate need to win. Open Book Management is an approach that creates a high involvement culture in the business.

How do we create an organization that performs in this way?

1) Identify weaknesses or opportunities in running the organization to get everybody participating. This is a great way to get those who know a function or job well to work out the challenge and figure out how to fix it. People get really creative if you give them the opportunity.

Tip: Set a seven-day challenge. Generate huge excitement that will generate change. The Quantum90 challenge is also a model to consider.

2) Build a high involvement culture in the business. This takes a little longer. It starts with sharing much more information in your business with everyone. Share information on how the business is performing and make sure the team understand what it means.

3) Consolidate this work by giving the team the skills, knowledge and information to empower them so they make better decisions. This in turn makes it easy to manage them and make change.

If you then have the courage to entertain the notion of spreading the wealth that you create among the people who generate that wealth, you can move from participation to engagement to a terrific outcome. It only seems fair that the outcome should be shared.

Further thinking on this approach is available here.

Friday 14 September 2012

ALPSP Conference Day 2: Apportunity knocks? Mobile as part of your digital strategy

Apportunity Knocks was a session focusing on apps as part of a mobile development strategy for publishers. This is a summary of the data insight into mobile traffic and usage from Outsell's Mark Ware.

Tablet sales exceed smartphones and PC sales are dropping off. Mobile traffic is doubling year on year because of i) adoption of devices and ii) improvement of mobile networks. The rise of the fourth screen is here. A quarter of devices in the workplace are now tablets or mobile devices.

Ware mentioned some interesting industry developments and useful articles, including the following case study.

Case study: Epocrates is a point-of-care drug information and pill identification app. It has built up more than 1 million users, with half of all US doctors as users. They have a 'freemium' version supported by advertising. Most users use it for 30 seconds at a time, on average five times a day.

Mobile can be used for discovery and as a sales channel. There is renewed interest in individual purchasers. But should we consider a cynic's view of mobile apps? Are they the CD-ROMs of mobile? And what about native app versus web app? The FT withdrew its iOS app as it didn’t want to lose a third of its revenue, or to have its marketing controlled by Apple.

He summarised the evolution of apps as:
  • looking up
  • keeping up
  • long-form reading (tablet)
  • self-study, CME, educational
  • swivel apps (sharing information, e.g. doctors share information with patient to help them influence their own outcomes)
  • interacting (with research information) - still in its infancy for annotating, tagging and adding abstractions, but only if synched through the cloud
  • and true workflow integration for STM, which is still to come.
What do users want? And do they know? Is it mobility, social media computing, cloud storage or multiple use? It has been a commonly held view for several years that knowledge workers are overloaded. The top obstacle to getting information in 2008 was not enough budget. In 2011 it was not enough time.

The key challenges for STM publishers include re-imagining content for the use case versus repackaging products as digital facsimiles, and supporting widespread adoption in institutions and hospitals. Business models to consider are: adding value to subscriptions; improving customer experience; advertising and sponsorship; the 'freemium' model; individual tablet subscriptions; and paid-for apps and ebooks. He closed with a list of essential actions for STM publishers:
  • mobile development is a core competence 
  • adopt multi-channel content systems
  • mobile optimisation is critical
  • experiment with business models
  • manage the costs.


Thursday 13 September 2012

ALPSP Conference Day 1: News of the World - Science Journalism Demystified

Sian Harris: science journalism insight 

Sian Harris, editor of Research Information, provided an introduction to the work of - and case for - science journalists.

Specialist journalists act as a bridge between research and readers. They aid public understanding and help researchers to deal with information overload. But there are challenges with the flow of information. Expert journalists play an essential role in decoding science and finding out what it means for everyone. That’s not to say that it can’t go wrong (e.g. MMR vaccination controversy), but there are good examples including extensive coverage of the Higgs Boson experiments.

What kind of people are science and technology journalists? They generally have a science or engineering background. They are pedantic about grammar and scientific accuracy. They work to tight deadlines. There are some differences between journalists in the mainstream press and specialist press.

What are their sources? Press releases are a common primary starting point. These might come direct from journal or publisher PR and marketing departments, from research institutions or companies, or be distributed by PR news services.

Scientific papers are often the source of stories and these can be used in two main ways: 1) for a news story about a new piece of research, or 2) as a source of background and researcher contacts for a bigger article about a research area.

Other sources include conferences – a brilliant resource for a whole range of material and contacts - personal communication, patents, corporate news or other journalist articles.

There are limitations with press releases. They may be subject to spin or a particular bias. Often, they are not comprehensive enough and there can be a reluctance to share further detail behind the story.

Another limitation can be access, where a particular journal’s content is not available to the journalist. An Athens-type login would be ideal to get beyond seeing only the abstract of research papers. Unfortunately, this can inadvertently bring in a level of bias towards research that is freely available.

Discoverability can also be a major barrier to accessing research for articles. Citation databases behind paywalls present a challenge and while free search engines can be a great source, you can miss things or bring up spurious results.

To a certain extent there is a geographical bias: it is easiest to find stories from the USA, UK, rest of Europe (especially Germany), and other English-speaking countries.

There are limitations with researcher communication skills.  They may be inexperienced with dealing with journalists, not reply to emails or calls or may not be able to explain clearly.

Sometimes journalists struggle with media embargoes: they are fine when they give you time to interview and prepare the piece, but they are rarely relevant. Often it should be a case of either release the information or don’t bother.

It was interesting to hear that the challenge of information overload applies to journalists as much as researchers, publishers or librarians. While Sian acknowledged that news journalists use Twitter and other social media channels widely, the specialist journalists she spoke to didn’t, and most felt that perhaps they should use them more. They felt that it wasn’t the most efficient use of time, but acknowledged that it has provided some great leads.

Sian’s suggestions for publishers when dealing with journalists are:

  • distribute more releases
  • send to all who expressed interest
  • make it clear who to contact
  • make headline clear in subject
  • enable easier access
  • simple discovery
  • make it possible to search corporate info on the publisher's website
  • provide images without lots of red tape.

Her suggestions for universities and researchers are:

  • provide training in communication
  • don’t be frightened of journalists
  • feel free to ask if you can check facts and quotes (but bear in mind they may say no due to deadlines)
  • use opportunity to talk at conferences.

She believes that there is a need to work together and to have better communication and understanding. If the red tape around press access can be broken down and some sort of authenticated content and access tool for accredited journalists can be developed, it will lead to better coverage for research.

ALPSP Conference Day 3: Giving away the farm


'Giving away the farm' was the penultimate session from this morning's programme. Chaired by Catherine Candea from OECD, the panel reflected on the pressure scholarly publishers are under to give away content.

With six month embargoes being demanded and some publishers moving fully to giving away their output as soon as it’s published, what effect is this going to have on the industry?

Wim van der Stelt from Springer provided a relatively upbeat assessment of where they are. It’s good that the Finch group recommended the gold OA route and acknowledged that content needs to be paid for. Meanwhile, Europe, as always, took the wrong decision and has come down on the side of green with a six-month embargo, or gold but with the funding still insecure.

Springer pioneered the hybrid journal and now have price adjustments for selected hybrid journals every year. They moved into real/full open access when they bought BioMed Central in 2008. Not only did it provide them with an open access portfolio, but it also helped their systems and processes to be author oriented. Most importantly, it brought in a new culture, enabling them to shift from the legacy culture. Oh, and it also makes money.

Springer Open now has over 405 members in 46 countries. Their Open Access journals are in all areas, including economics, social sciences, humanities, etc., and 15% of all their articles are already open access. Springer Open now includes books!

His advice to publishers on what they can do included:
  • provide hybrid option
  • start full open access
  • transfer subscription journals to open access
  • look at sister journals
  • sponsored journals
  • new journals
  • be creative with financial models for society journals
But don’t forget... open access is just a business model so keep on providing the service of validation, structuring and dissemination and don’t forget to develop new services.

Jose de Buerba from the World Bank took us through their transition from charging for publications to making everything free. They are the largest development agency worldwide, with $35.2 billion in projects approved during FY12. With 10,000+ staff and 120 field offices (mostly in developing countries), their annual publishing outputs in 2011 were: 367 journal articles (75% published externally), 438 working papers, 146 books, and 500+ other pieces of analytical work on very diverse subjects.

The World Bank mission is to help generate a world free of poverty, but their conflict was between mission and money. They have embraced green OA so they can be open about what they know (data and knowledge), what they do (operations and results) and how they work (partnerships for openness), and have open government (transparency, accountability).

The new currency is development impact, and their Open Knowledge Repository has had 140,000 downloads since September. The business model comprises: a budget from the institution, publishing service fees, and revenues from commercial activity around value-added products.

Xavier Cazin from Immateriel.fr considered paid content vs paid comfort. They are one of the four ebook distributors in France. Distribution is still pertinent for their ebook market and it involves seeing all kinds of publishing offers based on the same content.

The certainties of print costs have gone and with digital content it has become difficult to understand what people are expecting on pricing. One trend they’ve seen as a distributor is that of people making a business out of refurbishing public domain content. Why does it sell? And why at that price?

He considered what customers are willing to pay for: 
  • ease of access (where they are now)
  • handy (readable with my current device/app)
  • easy navigation through the content
And what are the reading rewards? A rich and well designed environment, content/curation appropriate to current needs and sharing functionalities. In his view, the product unit price should work to the equation: ease of access + reading rewards = value. For the reader, content quality + author's reputation = content reputation. On this basis, total revenue should be: product unit price x number of customers/readers, with the former related to reading comfort and the latter to content reputation.

For anyone not clear on the definitions of OA, there’s an overview on Wikipedia: http://en.wikipedia.org/wiki/Open_access

Wednesday 12 September 2012

ALPSP Conference Day 2: Discovering the needle in a haystack

Ann Lawson introduces the panel

Chaired by Ann Lawson from EBSCO, this session was designed to help publishers understand how they can help academics and professionals to navigate quickly and seamlessly to the trustworthy content they need.

Ann's colleague Harry Kaplanian, Director of Discovery Services at EBSCO Publishing, kicked off with an overview of discovery services as well as the features and benefits for the publishers. 

He began by reminding us that the first discovery system was the library catalogue in the early 1900s. He then went on to outline the pros and cons of subsequent systems.

Pros: users can search the entire physical collection quickly; tight ILS integration; one place to search. 
Cons: users can only search catalogue; metadata searching only.

In the 1990s, the first electronic databases began to appear. They aren’t part of the physical collection; they change often; multiple tools are needed for searching content; students and faculty no longer know where to look; and the e-content just keeps on coming...

Federated search
Pros: single search box for all content; currency of content.
Cons: speed; many indexes; multiple ranking algorithms; larger but incomplete result sets; internet traffic; content provider traffic.

Web scale discovery
Pros: single search box; searches all content; single index and complete result sets; single relevance ranking; speed and bandwidth; no local hardware or software to install; eliminates traditional list problems; drives usage and lowers traffic.
Cons: not tightly integrated with ILS.

Usage impact: an example institution of 5,000 students saw a 205% increase in full-text usage from 2009/10 to 2010/11.

He classifies content providers as:
  • Primary publishers
  • Aggregators
  • Subject index providers
For each, what do they need to do, and what should they watch out for?
Primary publishers - journals & books:
It is now standard to provide full text and metadata to discovery services for searching. In most cases content is presented by the publisher. The user is guided to the full text by a link resolver. You have to make sure the highest quality metadata and content are provided and that databases are actively updated.
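
As a concrete, hypothetical illustration of the link-resolver hand-off described above: many library link resolvers accept OpenURL-style requests, so a discovery layer can construct a resolver URL from the metadata it has indexed. The resolver endpoint below is invented; the parameter names follow the common OpenURL key/value convention.

```python
from urllib.parse import urlencode

def build_openurl(resolver_base, *, title, journal, issn=None, doi=None, year=None):
    """Build an OpenURL-style link-resolver URL from article metadata.

    `resolver_base` is the library's resolver endpoint (hypothetical here);
    the parameter names follow the widely used OpenURL 1.0 key/value convention.
    """
    params = {
        "url_ver": "Z39.88-2004",
        "rft_val_fmt": "info:ofi/fmt:kev:mtx:journal",
        "rft.atitle": title,
        "rft.jtitle": journal,
    }
    if issn:
        params["rft.issn"] = issn
    if year:
        params["rft.date"] = str(year)
    if doi:
        params["rft_id"] = f"info:doi/{doi}"
    return f"{resolver_base}?{urlencode(params)}"

print(build_openurl(
    "https://resolver.example-library.org/openurl",   # hypothetical endpoint
    title="An example article",
    journal="Journal of Examples",
    issn="1234-5678",
    doi="10.1000/example.2012.001",                    # made-up DOI for illustration
    year=2012,
))
```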

Aggregators - full journal databases:
Most don’t have the right to submit the full text, so content is presented by the aggregator or publishers. The user is guided to the full text by a link resolver. Make sure the highest quality metadata and content are provided, that the discovery vendor accurately states whether the aggregator is actively taking part or not, and that databases are actively updated.

Subject index providers:
Powerful subject indexing based on controlled vocabularies; no full text, but a big impact on discovery. Make sure the discovery service is capable of properly searching, merging, ranking and securing entitled access. Consider what happens when a customer cancels a subject index subscription, renews it, adds it, or doesn’t have it at all. Make sure the discovery vendor accurately states whether the subject index provider is actively taking part or not - check the vendor’s claims.


Simon Inger provided an overview of key findings from the Survey on Reader Navigation, which is due to be published shortly. Read about the project here.

Main recommendations for publishers are:
  • Publishers need to support all of the discovery channels that their clients (libraries and readers) want to use
  • Publishers need to understand how different reader types discover and access their content so that they can target readers and authors more effectively
  • Potential to expose more sophisticated discovery information to key channels
  • Potential to differentiate through which discovery channels to make subscriber offers.
The summary report will be published for free later this month. The full report will be published at the same time. All supporting publishers receive these items in return for their help. Full results set and analysis framework will be available for a fee shortly afterwards. 


Robert Faber, Director, Discoverability Program at OUP, concluded with an overview introducing the Oxford Index.


Why does discoverability matter to publishers and librarians? Traffic and use are the lifeblood of digital scholarship. Use of subscriptions shows the value of the content. Discovery reveals interest and demand for new content. Customer and user behaviour is changing. If you can’t find it, you won’t use it. People are searching for a topic, not a book - 80% of traffic to Oxford Journals goes direct to the article. There are many search systems and rapid evolution. For some products it's about free content outside the paywall - abstracts, keywords, Oxford Journals, Oxford Scholarship.
The MARC program has been improved and expanded. Linking: some is in place, mainly between closely related content, plus some editorial linking between products. Some partnerships with other publishers and institutions have been set up. New mobile sites for Oxford Journals and future products are set up. Library discovery services have been developing new partnerships. SEO is at the centre of product evolution.


Discovery happens in Open Web Search, library services, research hubs, through content links, opt-in services, viral awareness and via OUP web features.

What is the Oxford Index? It’s free discovery from OUP: a standardised description of every item of content, in one place. It incorporates external search partners and an Oxford interface - landing pages giving quick pathways to full text, web-searchable and cross-searchable - though they recognise that the website itself might not continue to be very important. It provides a way to create links and relationships across content, with meaningful links that add value and traffic. There are overview pages for a quick view of a topic, with links embedded in products. This is a free service integrated with existing products.

What does this mean for search rankings and usage?
  • No change to existing SEO or rankings
  • OI is supplemental route to primary full-text content
  • OI gives Google a super-site map across OUP content
  • Highly-trusted network reinforces destination full-text sites
  • Aim is additional traffic, monitored and reported

What does it mean for library integration and library visibility? It brings in users from general web search. OI can interact directly with library search. OI identifies the library’s provision of full content, which highlights the benefits of library services. Visibility through other library, A&I and research services? OI metadata is routinely supplied to library systems.

The benefits include i) traffic: sustaining and widening sales, ii) consistent methods - for users and systems, and iii) evolving grid of options to connect content.

Faber finished with trends and predictions that included:
  • importance of India, China and the non-Western world
  • differences between journal, book and reference content defined by role/task within the research journey
  • sites that can qualify general users will become a larger focal point of discoverability activity e.g. Google Books
  • shift from focus on entrance-point to linking: related content, related services
  • scholarly/research communities play bigger role in academic discoverability.

ALPSP Conference Day 1: Global Consortia and Library Markets overview

News of the World panel

Thomas Taylor from Dragonfly Information Services provided a packed overview of global consortia and library markets as part of the News of the World session on day one of the conference.

He observed that access to research is becoming a ‘right’. The Open Access movement is a disruptive business model and there is an evolution of traditional subscription models for consortia.

The advantages for publishers include protection of subscription revenue, exploding access and usage, a rising impact factor, and the ability to integrate Open Access journals into consortia deals.

Global trends in consortia markets include:
  • Library budget constraints in all economies – from dire (in Ireland) to selective (China)
  • Large national closed consortia with less/no central funding (DFG, NSTL, CRKN, CAPES)
  • Mature (penetrated) consortia acquiring less
  • Budgets plus time constraints (renewals) (CRKN, China, JISC)
  • Newer (less penetrated) consortia proceeding cautiously and in increments (Mexico, Japan)
  • Renewals still strong, the default setting
  • Consortia as purchasing model still healthy


China
Has a strong economy showing signs of slowing down. There is a commitment to invest in education in the short and long term, and to indigenous publishing (including journals). Local marketing, sales and brand are a necessity because of language, laws and social media (they have their own versions of Twitter and Google). Consortia had aggressive purchasing until two years ago, but are now more selective and subject-area specific. They are showing the signs of a mature consortia market and purchasing has been slowed down by politics and bureaucracy. There are two very well-known consortia: NSTL (closed, centrally funded, 600 institutions with a 9-member board) and CALIS (open with no central funding, 700 to 1,400 institutions).

India
They have a strong economy with no signs of slowing down, as well as the third largest HE system in the world. Major investment in recent years appears to be continuing in new institutions, R&D spending and new consortia. With 10 to 15 major consortia this is still an immature market.

Middle East/North Africa
This is a diverse region with many markets. There has been an economic slowdown since 2009. They are thinking about how to invest in a post-oil economy, and education is one of the key factors. The Arab Spring has created political uncertainty and instability in some countries (but hope as well!). There are immature consortia which are increasing spending on e-journals, databases and ebooks, especially in STM. It is anticipated that 2013-2015 library budgets will be flat or increasing modestly.

Europe
Economies continue to be challenged and uncertain. Europe is a diverse market (north vs south?). Library budgets are decreasing or remain flat. Very little new content is being purchased. Renewal of deals is still the default in most cases. The main policy impact is through the Open Access mandate from the EU in support of UK mandates.

UK
The economy is in recession. JISC is one of the oldest and most mature consortia (open). There is little acquisition of new content (ISPG purchased this year). Renewal of deals is the default (two big deals were renewed last year amid much noise). Government-mandated Open Access is the most aggressive in the world.

Brazil
Strong economy (6th largest in the world, overtaking Britain). CAPES, a national, closed, mature consortium, is one of the oldest. It has deals with most commercial publishers. There are 350 libraries (all of Brazil's academic market and then some). The new government has created an uncertain future and there has been very little new content acquisition in the last two years.

Mexico
This is a growing economy with an unstable political situation, with a new government and drug wars. Mexico has a three-year-old, government-funded consortium.

Canada
The economy is growing. CRKN is a mature national closed consortium that no longer has government funding, but is funded by member libraries now. Budgets for new content are uncertain. Renewals only in 2012 for 2013. There are regional consortia (four, open) buying some new content, but budgets will continue to be uncertain until CRKN renewals are confirmed.

United States
Elections in the US could affect consortia in a big way. If re-elected, Obama has promised to invest in education and research. EBSCO Community Plus is a new open North American consortium. There are 250+ research libraries and 50 ARL libraries. ISPG and several other content providers are participating, with tentative plans to expand into Europe.

In summary:
  • Consortia deals continue to be healthy and renewed
  • Most consortia adding little new content when compared to the past
  • Centrally funded consortia are challenged
  • Addendum: scientific research is increasingly global
  • Implications for libraries, consortia, publishers and purchasing models
  • Addendum 2: Open Access mandates and how they affect the consortia world.
  • If you aren't a large commercial publisher, join forces with other like-minded publishers.

ALPSP Conference Day 1: News of the World


Adrian Stanley, CEO of The Charlesworth Group, chaired the last session of day one of the ALPSP International Conference, titled ‘News of the World’.

It was billed as ‘a satirical and informative overview of important and dramatic global, technological and social news impacting the future of professional and scholarly publishing’. Adrian was joined by Sian Harris, Editor of Research Information, Mark Ware from Outsell and Thomas N Taylor from Dragonfly Sales and Marketing Consultancy.

The session was jam packed with statistics, facts, advice and recommendations. Highlights from Mark Ware's session are below.

Mark Ware presented data from Outsell on the sector:
  • The information industry was worth $462 billion in 2011, roughly the same as the GDP of Malaysia
  • STM information is worth nearly $30 billion in total
  • STM is much less cyclical, so it has been affected much less by the economic slowdown than the rest of the information industry
  • Core library market very flat but some bright spots including emerging markets, mobile, ebooks, databases and tools
  • World’s biggest market still the US, but not for much longer
  • Largest growing market is Asia
  • Transition to fully digital revenues


10 trends that are shaping the information industry:
  1. mobile, social and the rise of GAFA (Google, Amazon, Facebook, Apple) - 25% spent on mobile devices
  2. big data and extreme analytics
  3. focused scale – companies that are dominant and compete on scale but do so with a deep emphasis on a topical area or areas (e.g. McGraw Hill, Pearson, Gartner, Thomson Reuters, etc) – they bump up against their own clients and suppliers. More competition is becoming the norm.
  4. M&A, value chains and convergence
  5. Aggregation’s persistent call
  6. Fun, gamification, philanthropy
  7. Human assistance – service and culture – the more digital our world becomes, the more human experience matters.
  8. Sales 2.0
  9. Open vs closed
  10. New interfaces


We are moving from a product-centric to a service-focused approach. Publishers will become more like platform providers. Publisher strategies should include:
  • An in-depth understanding of user workflows (critical to adding value)
  • Building a mobile strategy
  • Simplifying users’ lives
  • Getting closer to end-users
  • Re-aligning the sales and marketing organization
  • Avoiding disruption of their own markets


Mark's final insights for the future in 2020 include:
  • All services are digital, mobile, customizable, 
  • High in-built intelligence factors
  • Relevant at point of reference/care/decision
  • Multiple revenue sources
  • Interoperable