Monday 15 October 2012

IOP Publishing announces the launch of its ebook programme and a new partnership


IOP Publishing (IOP) has today announced its entrance into the book market as part of a long-term strategy to expand its support to the research community by offering the broadest range of publishing services to authors and readers.

The IOP ebook programme will provide an additional content channel for authors looking to publish with a society publisher. The programme has been designed to meet the demands of today’s research community by delivering authoritative, high-quality books in physics.

The new portfolio will cover topics across the physical sciences and will support the needs of physicists as well as researchers from other disciplines who are working in interdisciplinary areas of research.

IOP has a long history in book publishing and was one of the first publishers to experiment with ebooks. The company sold its books division in 2005 to focus on its journals portfolio; however, since then the landscape and demand for ebooks have changed dramatically. IOP is now excellently positioned to serve the needs of the community.
 
This innovative e-first book programme is built with the future in mind. As born-digital content, free of legacy issues, it is designed to meet the needs of the research community in a changing publishing environment.

IOP is also delighted to announce a strategic partnership with Morgan and Claypool Publishers (M&C) to build a dedicated collection as part of the overall ebook programme. M&C have combined an innovative editorial approach with an e-first model, which has made them one of the leading innovators in high-quality book publishing.

M&C’s publishing model – which already serves the engineering and computer science communities with its Synthesis Digital Library, and the life sciences community with its Colloquium Digital Library – will be adopted to create a new physics collection.

M&C’s unique publishing model emphasizes the rapid digital publication of short books on key research topics that can be incrementally revised and expanded over time. The model makes it possible to publish tutorial works on active research areas much earlier than traditional print book models allow. Its books make advanced research far more accessible than the original research literature and provide extensive linked bibliographies for readers who wish to go further.

Olaf Ernst, Commercial Director of IOP Publishing said: “It is a natural fit for IOP, through its close relationship with the physics community, to launch a high-quality ebook programme that will complement its existing journals portfolio. We have a shared vision with M&C to serve the research community by delivering high-quality content through an innovative publishing approach. This will form the basis of a strong partnership and I very much look forward to working together on this.”

Michael Morgan, President and CEO of Morgan and Claypool, said: “We are very enthusiastic to collaborate with IOP and to benefit from its deep knowledge and commitment to the physics discipline in jointly developing a resource that we both believe will be of significant value.”

Commissioning content will begin immediately.

ENDS

Notes to Editors

For further information, contact Karen Watts, Public Relations Manager at IOP Publishing on +44 (0)117 930 1110 or e-mail karen.watts@iop.org.


Tuesday 9 October 2012

Tools of Change Frankfurt: Mission Publishing

After a brief spell setting up the ALPSP stand in Hall 4.2 over lunch, I popped into the Publishing Innovators session back at Tools of Change chaired by Sophie Rochester from The Literary Platform.

This last session in the stream for the day was an interesting dip into the waters of 'Mission Publishing'. Here's a brief summary of some of the projects that were showcased. They may - or may not - provide some inspiration on how to drive engagement and get funding.

Jesse Potash from Pubslush described the company's global crowdfunding platform for books. They believe that crowdfunding is ideal for books because, compared with other creative industries, publishing has advantages in:

  • Cost - a low cost barrier compared with films, for example
  • Components - a film or music album has many more contributors (the credits are long); a book needs far fewer
  • Skill - according to the NYT, 85% of Americans think they can (and will) write a book in their lifetime (whether or not it's any good is another matter)
  • Product - with other media you may not get what you expected (e.g. let down by an X Factor winner's album, so you don't buy it); with a book you can read a few pages - try before you buy.


His advice for effective crowdfunding is to focus on four things:
  1. fundraising
  2. market analytics
  3. services
  4. guidance

Fundraising can help make a book project happen by resourcing a high-quality product (editorial, design) and other publishing services (e.g. marketing, translation, publicity), replacing advances, and providing a tangible measure of demand (40k likes on Facebook may not equate to $500).

Matthew Crockatt from And Other Stories outlined their philosophy, which is about the combined intelligence of editors, readers, translators, critics, literary promoters and academics. They are a not-for-profit community interest company who make decisions based on what they think is good writing and a good way of working. Their supporters can take part in determining the direction they go in, and they try to be as ecologically minded as possible (e.g. using a local printer). Profits are re-invested in the work, allowing them to pay translators as well, for example. They also operate a subscription model: their first four books had 120 subscribers. Subscribers pay money up front without knowing what will be published and are sent the books in the post when available.

They include a lot more on the website than just the books, including deeper information on the authors. Their mission is not simply about them publishing great writers: if they bring a great writer to the fore and a bigger publisher wants to publish them, that's great. On an international note, they have reading groups arranged around a particular language.

Eric Hellman from Gluejar, Inc. outlined the vision for Unglue.it, which is about creating the Public Sector for eBooks. They want to provide a platform whereby you can give the whole world a book you love. For print, the agent of the public sector is libraries: they convert content into a public good. However, when print transitions into the digital space that becomes much harder to do, and when libraries try, it ends up conflicting with the sales channel. In the US the big six publishers are very hesitant to lend their ebooks. That is why Gluejar believes there is a need for a new public sector for digital books.

There are a multitude of initiatives, including the Internet Archive, ebook vendors, PLoS, Project Gutenberg, Europeana, DPLA, BookShare and WorldCat, to name but a few. However, there isn't a good way to bring in-copyright books that might not be selling into the public sphere.

The Unglue.it business model (a minimal sketch follows the list):

  • library distribution = zero marginal cost
  • run a crowdfunding pledge drive for every book published to cover the fixed costs of producing it
  • readers can choose the books they love
  • rights holders can set the price they need to cover their costs
  • so there is no need to wait 150 years for the book to enter the public domain
  • you can launch out to networks who may be interested
  • when the funding level is reached, the book is released under a CC licence.
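To make the threshold mechanism concrete, here is a minimal Python sketch of the pledge-drive logic described above. The names, amounts and licence choice are illustrative assumptions, not details from Unglue.it itself.

    # Illustrative sketch of a threshold pledge drive; values and licence are assumptions.
    def pledge_drive(target_amount, pledges):
        # Collect pledges for one book and 'unglue' it only if the rights holder's target is met.
        total = sum(pledges.values())
        if total >= target_amount:
            return {"status": "unglued", "licence": "CC BY-NC-ND", "raised": total}
        return {"status": "still fundraising", "raised": total, "remaining": target_amount - total}

    print(pledge_drive(7500, {"alice": 50, "bob": 25, "library_consortium": 7500}))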

Tools of Change Frankfurt: Metadata Futures

Karina Luke from BIC introduced a panel on metadata for the future. Graham Bell from EDItEUR began with an observation about how uncomfortable book publishers are with the concept of metadata. He provided an explanation of the fundamentals of metadata for the industry and how it can enable you to begin to discover new metadata within the data.

He went on to describe in more detail what metadata and linked data are. Linked data expresses metadata as a collection of triples. It uses URIs to represent things and the relations between them, and prefers persistent HTTP URIs so they can be 'looked up' to get further details. This lets the data be 'self-describing'. He warned about Linked Open Data: this adds a further requirement that the data be free and accessible, and he counselled bearing this in mind as it may - or may not - be what you want.
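As a concrete illustration of the triple model, the short Python sketch below builds two triples with rdflib and serialises them as Turtle. The URIs and property choices are illustrative assumptions, not examples from the talk.

    from rdflib import Graph, URIRef, Literal
    from rdflib.namespace import DCTERMS

    g = Graph()
    # Hypothetical identifiers: a book and its author, each named by an HTTP URI.
    book = URIRef("http://example.org/book/an-example-title")
    author = URIRef("http://example.org/person/jane-doe")

    # Each triple is (subject, predicate, object); the predicate is also a URI.
    g.add((book, DCTERMS.title, Literal("An Example Title")))
    g.add((book, DCTERMS.creator, author))

    # Turtle is one common serialisation of the same underlying triples.
    print(g.serialize(format="turtle"))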

Linked data is just another way of expressing the same data, and some practitioners take a loose view of semantics, so it is not always best suited to the supply chain. You need to be selective about data sources, as the system is based on trust and expectations of persistence. There is a need for common entities, a shared vocabulary and a standard approach.

George Lossius' presentation was called 'Navigating the Semantic Web'. He covered definitions of linked data and the semantic web, why we need it, who is using it now, and the business benefits for the trade. Working in the semantic web isn't a scary thing: it brings you closer to the original, scientific viewpoint, and it's fun.

The semantic web takes the web solution further by providing:
  • a web of linked data vs a web of documents
  • a framework of emerging standards (W3C)
  • structured content - a standard way of describing things
  • ontologies
  • inference and relationships
  • interoperability
  • the combination of data from diverse sources

'The semantic web is a little bit about us: it uses deductive reasoning and inference to do things you ask it to do.'

An example of a semantic website is Breathing Space, a pilot project that aims to explore the value to researchers of compiling and mining a critical mass of data within a discipline. Another example is GSE Research, which aims to provide a bridge between scholarly research and practice in the fields of governance, environment and sustainability. It was interesting to hear him note that the BBC website for Olympic athletes was populated by a semantic search.

Is it relevant to the publishing industry or to trade books? Yes. Your consumers are becoming more demanding, time-poor and intolerant of waiting, so the job in the publishing supply chain is to make things easy and interesting so you don't lose your readers. What the semantic web gives you is the opportunity to create compelling, relevant and interesting material that creates value for them and your business.

'The semantic web is about fulfilment: the fulfilment of books and the fulfilment of the right content to consumers at the right time.'

Beat Barblan from Bowker provided an illustration of how identification can be difficult online and how the ISNI helps. The ISNI is an ISO standard which uniquely and authoritatively identifies Public Identities across multiple fields of creative activity. For a full definition of ISNI, see the website.

It will help with discovery, search ranking, identifying rights holders and distribution. It is the tool that can link unique content to its creator. It is a bridge identifier: it links records while exposing just enough information to disambiguate, with the rich content found elsewhere. There are just under 1.5 million ISNIs assigned and around 15.5 million provisional records.
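For a flavour of how the identifier itself works, here is a minimal Python sketch of the ISO 7064 MOD 11-2 check character that ISNI (like ORCID) uses over its first 15 digits. The sample digits are made up, not a real ISNI.

    def isni_check_character(base15: str) -> str:
        # ISO 7064 MOD 11-2 over the first 15 digits; the result is the 16th character.
        total = 0
        for ch in base15:
            total = (total + int(ch)) * 2
        remainder = total % 11
        result = (12 - remainder) % 11
        return "X" if result == 10 else str(result)

    # Hypothetical 15-digit base; append the returned character to form the full identifier.
    print(isni_check_character("000000012345678"))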

Valla Vakili from Small Demons focused on the great chain of narrative in his talk, using V for Vendetta as an extreme example of a great way for a narrative to break out into the world. It references so many aspects of history and life, including Guy Fawkes. Data collected included:
  • the book
  • a character in the book
  • the character's role
  • the character's clothing
  • the fact that the character's clothing was inspired by the historical figure of Guy Fawkes
  • where to get the mask (which is also the best-selling mask on Amazon)

Howard Willows at Nielsen BookData closed with an overview of moving toward a single subject classification scheme for the global market. He drew a comparison with the Tower of Babel: there is still a range of schemes designed for local markets and languages (e.g. BIC, BISAC, SAB, RVM, YSO) and confusion reigns. This fragmentation undermines the overarching goal and introduces inefficiency into the supply chain.

There is a gap in the metadata for trading partners across national borders, even between divisions of multinational companies. The traditional fix is mapping and, while this works, it only works up to a point. The problems with mapping are:
  • it's not a complete solution
  • there are often competing versions of varying quality with different outcomes
  • mappings tend to be either simple and inaccurate or accurate and complex.
Overall there is a degradation of quality and a loss of discoverability, which results in a poor experience and lost sales. Mapping has been pushed to breaking point by the growth of digital publishing and online trading and has outrun interim solutions.
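To illustrate why mapping between schemes is lossy, here is a small Python sketch of a many-to-one subject mapping. The code values shown are purely illustrative and should not be read as real entries from the BIC or BISAC schemes.

    # Illustrative only: these mappings are invented, not taken from the published schemes.
    bisac_to_bic = {
        "FIC000000": "FA",   # general fiction            -> modern & contemporary fiction
        "FIC022000": "FF",   # mystery & detective        -> crime & mystery
        "FIC022040": "FF",   # a finer mystery subgenre   -> also "FF": the distinction is lost
    }

    def map_subject(bisac_code: str) -> str:
        # Unknown codes fall back to a generic value, a further source of quality loss.
        return bisac_to_bic.get(bisac_code, "F")

    print(map_subject("FIC022040"))  # "FF": two source codes collapse to one target code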

'A global market needs globally understood metadata.'

The best and only viable long-term solution is a single universal subject classification scheme. Who will benefit? Publishers, through greater control over product data; aggregators, through less data manipulation; and retailers and consumers, through a clearer and much simpler supply chain.

As a result of this need, a new organisational structure has been put together, independent of BIC and any other existing company. THEMA has been born to ensure the global subject classification scheme stays free to use and truly international.

Tools of Change Frankfurt - Pricing Digital Content: Publishers and Consumer Perspectives

Ed Nawotka introduces the pricing panel
Ed Nawotka from Publishing Perspectives introduced the Pricing Digital Content panel, including Ann Betts from Nielsen Books, Ashleigh Gardner from Kobo and Timo Boezeman from A. W. Bruna. He jumped straight in and posed the question 'whatever happened to free?'

Ann Betts provided research insight: people are more amenable to paying for content, and in established markets such as the US the price they will pay is getting higher. Timo Boezeman believes that no one can live on free: you have to make money; you can use free content, but it somehow needs to lead to sales.

Picking up on this theme, Nawotka then asked where the industry can go from the position that no one can survive on free. What impact does self-publishing have on large publishers? Can they compete? Boezeman feels you don't have to compete with self-published titles; you compete on quality. Quality has a price and digital has its own price. At Bruna they use a pricing matrix, but it varies from title to title and they like to experiment. The Netherlands is not there yet when compared with the US.

Ashleigh Gardner said that Kobo would like to see more price optimisation: the price is not printed on the back any more, so you need more experimentation with changing it. Price ranges are all over the place, but $9.99 is still a sweet spot. However, a customer willing to pay that will still pay $12.99, and you can move to $14.99 without losing too many sales. When someone is looking for something specific, those titles can hold a higher price. She cited JK Rowling's publisher experimenting with pricing to find that sweet spot.

Betts, Gardner and Boezeman: the pricing panel
Betts believes that people are not as concerned with price as we think they are, explaining that 15% of US consumers think prices are too low. In terms of price points, $9.99 has been the most popular, taking c.20% share, but higher price points are becoming stronger: $12.99 is now the second most popular price point for driving volume sales and $9.99 has dropped to fifth place. There is a shift away from looking for the cheapest thing and buying in quantity; the market is becoming more sophisticated and looking for quality.

Boezeman finished with the view that if Amazon comes to the Netherlands with ebooks he can see a price war on digital starting. Their ebook prices are 13-15 euros, approximately 70% of the paperback price (which upholds Jo Henry's data from the opening keynote).

Tools of Change Frankfurt: Digital Textbooks, Online Learning and the Future of Educational Publishing

Sheila Bounford introduces the session
The first breakout session in the appropriately named Megabyte room was chaired by Sheila Bounford. She marshalled an insightful discussion on digital textbooks. Sheila was joined by Amir Winer from the Open University of Israel, Michael Cairns from SharedBook and William Chesser from VitalSource Technologies.

Winer provided insight into institutional developments on digital textbooks. They are moving from linear delivery to fragmented, modular texts. Their vision is a study guide with visual, audio and textual content, with editable academic text, video lectures, links to papers and interactive coursework.

Cairns observed that the growth in custom textbooks in the US has been driven by the big publishers, but institutions are starting to exert more influence over content and price, and are forging content distribution deals with publishers.

Key highlights from Chesser's talk include:
  • VitalSource Technologies delivered 5 million e-textbooks in 2011, with 2.5 million users worldwide on 6,000 campuses, working with more than 200 publishers, adding 10,000 new users a week, with 100,000 titles in 17 languages across 180 countries.
  • Chesser reported that the typical number of pages per visit was c.30 and the average visit duration around 22 minutes, indicating more in-depth reading and studying going on.
  • Success criteria for digital texts include: use (sell-through); mission (transition); value (price point). Successful characteristics of digital texts include ease of distribution (for students and faculty) and being an organic part of the course.

Chesser went on to observe that, based on these criteria, he would grade straight B2C as the least successful approach. Rental is next, but beware that students have reacted negatively to it. The print + electronic model (from 10 years ago?) is next, but as publishers didn't charge extra they were effectively telling the market that the value of digital is zero.

Selling one chapter at a time (not reconfiguring products) is more successful. However, in most textbooks the chapter isn't as much of a standalone product as you might hope. At the other, more successful end of the spectrum, hardware is pre-loaded with digital textbooks, sold as a curriculum sale or tuition-inclusive within a school-wide programme. This is the most successful model at the moment.

When they surveyed students who had actually tried both, if price and availability are the same, 47% said they would take the e-textbook, 35% said they would take both, and 18% print only. The market is potentially sizeable, but publisher content is not always available or optimised for delivery in a contextualised or consumer-friendly way.

The most powerful observation from the session for me was that many publishers already have appropriately formatted content in their journals programme. Just imagine what could be achieved to meet this nascent demand if you applied journal-programme workflows to your textbook programme. Unfortunately, despite journal publishers having worked on this for years, book departments often don't work in that way.

Tools of Change Keynote 2: Andrew Bud - Mobile Content and Commerce: A Global Commerce

In the second keynote session at the Frankfurt Tools of Change conference, Andrew Bud from MEF - the global community for mobile content and commerce - presented early results of their global research.

It is a study of 10,000 mobile media users across ten countries - these tend to be mobile-first economies - providing insight into the state of the market and how it is evolving. What makes mobile special?
  • Consumer's relationship to device
  • Ability to engage the consumer
  • 100% payment reach
  • Multiple, fragmented centres of commercial power
  • Technical fragmentation and constant change
The consumer relationship is key: the mobile is always there and is instantly accessible for immediate gratification. It's becoming the first screen due to its immediacy and personal nature, and that is becoming the case everywhere.

When consuming content on the phone, there is a significant long tail of genres catering for different segments of the market. Games are the primary form of content at 59%. They are followed by social networks (49%), music (47%), photos (41%), news (31%), weather (25%), sport (21%) and navigation (20%); books are further down the ranking (17%).

The mobile is the ultimate engagement engine. It has the ability to drive the cycle through SMS and push, and it has known context through location, network and behaviour. There are accessible calls to action, such as opening an app, and you are only three clicks from desire to purchase: a feature that has driven the $30bn mobile content market.

88% of mobile media users now use mobile commerce and it is ubiquitous in the mobile-first markets. The reason? The proximity, convenience and immediacy of having that particular device, so you can go from desire to gratification in the shortest possible time.

Mobile carrier billing gives a retailer the ability to charge a consumer up to c.$15 immediately without needing any verification. The price is steep - mobile operators in the value chain take 20-30% in charges - but it helps establish new retail channels.
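As an illustrative calculation (my numbers, not figures from the talk): on a $15 carrier-billed sale with a 25% operator charge, the retailer would net about $11.25.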

There are multiple and fragmented shifting centres of power:
  1. mobile operators (with banks coming into this)
  2. handset vendors (social networks)
  3. O/S controllers
  4. online retailers (local providers)
Purchase location is shifting: consumers no longer buy primarily via the mobile web but from places and brands they know and trust, including app stores (29%), in-app/in-game purchase (27%), the mobile web (26%) and, increasingly, retailer mobile storefronts (15%).

Technical fragmentation is key: not all smartphones are equal. It is also interesting to note that, despite the lower use of books on mobile, the kind of people who buy smartphones are the kind of people who are likely to buy books.

Mobile is fast becoming the central ecosystem for digital content everywhere. There are opportunities to create new retail channels, powered by engagement, mobile billing and the ecosystem's fluidity. Books are a small part of what is happening on mobile media, and the mobile ecosystem is a small part of books today, but that is temporary and presents a big opportunity.

Tools of Change Frankfurt Keynote 1: Jo Henry on Consumer eBook Monitor Data

The (ebook) world according to Bowker
For the first session at Tools of Change conference on the eve of the Frankfurt Book Fair, Jo Henry from Bowker presented an overview of their Consumer eBook Monitor data.

Who is downloading ebooks? In general, they are male and a third to a half are under 35. The majority live in urban/suburban areas and half to three quarters are in work. A third have a degree (although this rises to 90% in India).

Future trends include: i) a move towards a more even male/female split; ii) an older profile, with more over-35s in most markets; and iii) an increasingly suburban one. India is a massive market; however, the US is still the biggest. There is a long tail across to New Zealand, with its small population and low growth.

Other interesting trends include:

  • engagement in ebooks is not slowing print purchases
  • heavy book buyers are usually promiscuous and buy/borrow from all channels
  • 10% of people who were not buying print books are buying ebooks

What is the role of free in the digital world? Free is driving engagement with paid digital content. If you are a free downloader you are two and a half times more likely to buy print while still downloading, unless you are in India or South Africa. The survey also asked about piracy: India and Canada score low on 'would never download illegally', and some consumers are conflicted and might consider downloading illegally if they couldn't get a legitimate copy.

Where there is a young market, the device most used for e-reading is a PC. The markets that have adopted e-reading devices most enthusiastically are Canada and the UK. There is also a significant number who read on mobile. Amazon has the strongest market share in the UK, US and New Zealand, while Canada has a higher proportion from Kobo.

Attitudes to pricing are fairly consistent: consumers say ebooks should be cheaper than print books. In most markets they think the ebook should be 50% of the hardback price and c.70% of the paperback price for adult fiction, except in India, where ebooks are perceived to be worth more than print. For debut authors, prices need to be cheaper.
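As an illustrative worked example (my numbers, not Bowker's): on those ratios, a £20 hardback would imply a c.£10 ebook, and an £8 paperback a c.£5.60 ebook.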

She concluded with the following observations:

  • growth rates are fast - particularly in emerging markets
  • engagement with ebooks doesn't always reduce print spend
  • free is driving the ebook market
  • the biggest players don't always dominate the market
  • 'E' is regarded as less valuable than 'P'.

Monday 1 October 2012

Sarah Price: Library Technology and Metadata - Measuring Impact

The afternoon session at To Measure or Not To Measure: Driving Usage seminar included a session from Sarah Price who is E-Resources and Serials Coordinator at the University of Birmingham and Co-Chair of KBART.

One of the key things librarians are interested in is ensuring that the content they buy is easy to use, discoverable and accessible for their students. She provided a candid and compelling story of how the University had got to grips with critical feedback from students on the eLibrary provision, and how they instigated a major review and development programme to address the issue.

Traditionally, there were two access points to content: the library catalogue (mainly for print collections) and the elibrary service. Both were accessed via the home page, but didn't take into account special collections and other services they had. The user interface was very text-heavy, old-fashioned and not very user-friendly, and you had to search separately for ejournals and ebooks, making the experience confusing, unattractive and a source of dissatisfaction.

As a result, the University has invested in a Resource Discovery Service which provides:
  • single search interface and search box (with a Google-like interface)
  • harvesting of collections across institutions
  • much faster search and results retrieval
  • discovery at article and chapter level
  • post search filtering and refinement.
The service is publicly available - with no (upfront) authentication - as a taster for potential students and academics. However, if you want to access in-depth content you have to sign in with your university account. It is designed to have no dead ends and is integrated with other web services such as the University portal. They worked with Ex Libris to develop the product and included embedded searching as a function.

They added the Primo Central Index to this product, which is a very important part of the discovery service, delivering article-level searching. A user can also narrow a search from 'everything' to specific collections, or use advanced search. You can log in with your own personal account, which then provides access to the full set of content and lifts restrictions. When using a search term, the results indicate what type of resource each item is (e.g. articles, books, etc.). Where it is a book, it will show the stock and location of copies on a site-specific basis, even including a map of the location in the library. Print and electronic resources are listed side by side in the discovery tool. You can see where your search terms appear, to check relevance, and you can also facet or post-filter (e.g. by article, book, library site, date range, author, language, electronic database, etc.); the tool will also attempt to group similar records.
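To make the faceting idea concrete, here is a minimal Python sketch of post-search filtering over a result set. It is a generic illustration using assumed record fields, not Primo's actual API.

    from collections import Counter

    # Hypothetical result records; a real discovery index returns far richer metadata.
    results = [
        {"title": "Quantum Optics", "type": "book", "year": 2011, "library_site": "Main"},
        {"title": "Optics Letters 36(4)", "type": "article", "year": 2011, "library_site": None},
        {"title": "Photonics Handbook", "type": "book", "year": 2008, "library_site": "Physics"},
    ]

    def facet_counts(records, field):
        # The counts shown next to each facet in the interface.
        return Counter(r[field] for r in records if r[field] is not None)

    def post_filter(records, resource_type=None, year_range=None):
        # Narrow an 'everything' result set after the search, as a post-search facet would.
        out = records
        if resource_type:
            out = [r for r in out if r["type"] == resource_type]
        if year_range:
            lo, hi = year_range
            out = [r for r in out if lo <= r["year"] <= hi]
        return out

    print(facet_counts(results, "type"))
    print(post_filter(results, resource_type="book", year_range=(2010, 2012)))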

Another interesting feature for scholarly publishers is the link to the in-house reading list management system on each textbook. This is flagged at the foot of the entry and you can click through to see full reading list and then continue through to other titles and services. Crucially, this will be helpful in checking against your records whether an academic has added a title to a reading list or not after receiving an inspection copy.

The resource is embedded in the university portal my.bham within a MyLibrary tab. This is a primary source of driving usage to the site. It's early days for analytics, but at the start of term they saw the same amount of traffic from the my.bham university portal as from Google Scholar. In addition, index-based searching is generating a lot of traffic from their users.

During the implementation they decided to:
  • still provide a database-level link to the native interface
  • provide a library-catalogue-only search, but within FindIt@Bham
  • set 'everything' as the search default but enable limiting of the scope
  • link the SFX component of MetaLib and the library catalogue to the reading list management system and the University of Birmingham Research Archive (UBIRA).
They dispensed with the A-Z list and pre-search limiters and now rely on post-filtering facets. They also dispensed with ebook MARC records as metadata input and now harvest directly from SFX. It was a bold decision, but they have found that it works for them. There has also been integration of the single search into the portal and the library services homepage.

Price flagged the importance of metadata for discovery. It supports linking to the appropriate copy; allows an appropriate set of links to be presented in a single place; allows the library to accurately and comprehensively display its entire portfolio; accurately depicts the entitled coverage for each user; and allows users to find keywords in full text, not just abstracts.
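As a concrete example of 'linking to the appropriate copy', here is a minimal Python sketch that builds an OpenURL-style link for a link resolver to interpret. The resolver base URL and the article details are hypothetical, and real deployments vary in the exact parameters they expect.

    from urllib.parse import urlencode

    # Hypothetical resolver base URL; each library exposes its own.
    RESOLVER = "https://findit.example.ac.uk/resolver"

    def article_openurl(issn, volume, issue, spage, atitle):
        # The resolver uses these citation details plus its knowledge base
        # to route the user to the copy their library is entitled to.
        params = {
            "genre": "article",
            "issn": issn,
            "volume": volume,
            "issue": issue,
            "spage": spage,
            "atitle": atitle,
        }
        return f"{RESOLVER}?{urlencode(params)}"

    print(article_openurl("1234-5678", "12", "3", "45", "An Example Article"))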

As 'Resource Discovery Service' isn't the most exciting or engaging title, they ran a competition among staff for the new brand name. There were 80 suggestions, but the winner - FindIt@Bham - was felt to tie in well with the overall university brand. They thought long and hard about integrating the Birmingham brand and used pictures of the distinctive campus to customise the out-of-the-box product. They have integrated it with the University portal and VLE and embedded it in the library Facebook page. Other marketing and promotion included:
  • social media
  • lots of work with the Student Guild
  • postcards/bookmarks
  • university staff and student newsletters
  • focus groups, training and briefing sessions
  • integration and prominent website advertising
  • university-wide plasma screens.
It's early days in terms of measuring impact, but they are assessing reviews of user feedback post-launch and have a continuous improvement strategy and a post-launch authority group in place. They will analyse future quality measures, service and resource usage, and benefits realisation. They are expecting to see a big rise in full-text usage, are anticipating a massive impact on their ratings, and anticipate seeing value added throughout the supply chain.

It has been interesting to compare results with Google Scholar for certain specific searches: Scholar only gives generic results, not library entitlements, and the top result for one title was a JSTOR PDF of a similar book to the one searched for. Their system is much more precise.

When addressing concerns about wider access to content, a demonstration showed that while Google will present the results, it won't present the full text unless it is free to access or the viewer can log in with an entitlement through the library system. Google doesn't embed authentication without library intervention via the link resolver.

Already, in comparison with Google Scholar searches, library discovery is more context-sensitive and the results are more focused. Library discovery also adds value, with resources grouped by subject and scholarly recommender services.

Her advice to publishers on how to integrate titles into the system includes:
  • send your title-level metadata to link resolvers (KBART; a minimal example follows this list)
  • keep it up-to-date (cessations, title changes, etc)
  • provide your deep linking algorithm
  • allow discovery platforms to harvest your metadata
  • don't be exclusive, be promiscuous!
  • assess usage patterns following integration.
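As a minimal sketch of what 'title-level metadata to link resolvers' looks like in practice, the Python snippet below writes a single holdings row to a tab-delimited file using a few of the KBART-recommended column headings. The title, identifiers, URL and file name are illustrative assumptions, and the full recommended practice defines more columns than are shown here.

    import csv

    # A subset of the KBART column headings; the recommended practice defines more.
    FIELDS = ["publication_title", "print_identifier", "online_identifier",
              "date_first_issue_online", "date_last_issue_online",
              "title_url", "title_id", "coverage_depth", "publisher_name"]

    holdings = [{
        "publication_title": "Journal of Example Physics",   # hypothetical title
        "print_identifier": "1234-5678",
        "online_identifier": "8765-4321",
        "date_first_issue_online": "1995-01-01",
        "date_last_issue_online": "",                         # blank = coverage runs to the present
        "title_url": "https://journals.example.org/jep",
        "title_id": "JEP",
        "coverage_depth": "fulltext",
        "publisher_name": "Example Publishing",
    }]

    # File name loosely follows a provider_region_date pattern; exact conventions vary.
    with open("ExamplePublisher_Global_2012-10-01.txt", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS, delimiter="\t")
        writer.writeheader()
        writer.writerows(holdings)
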
She concluded by saying that integration with library discovery tools is essential to drive usage. This needs to be based on industry good practice, and there is a growing body of evidence of usage increasing (and decreasing) depending on RDS integration.