
Thursday, 30 March 2017

'Just do it': highlights from the ALPSP Open Access seminar



Martyn Lawrence attended last month's ALPSP seminar 'How to Build a Successful Open Access Books Programme', chaired by Frances Pinter. He offers his thoughts on the day.

This one day seminar on Open Access monographs brought together a mixed – and refreshingly perky – group of publishers, librarians, funders and authors.

On the heels of the R2R conference, held on the preceding days, chair Frances Pinter set the scene in a room full of industry heavyweights, traditional presses, societies and start-ups. She had briefed the wide range of speakers to talk about challenges overcome and how their offer could be scaled up, not just to showcase their companies.

Here, rather than a blow-by-blow account of each presentation, I’m offering the top ten takeaways from a thoroughly enjoyable day.

1. Monographs are important

The tone was set from the outset. There’s an intangible thing with books: even though you can read on a device, there’s something about a printed book that provokes different emotions from a printed journal. Yes, chapters in edited collections are akin to journal articles (scholarly ‘stuff’ to use the language preferred by Toby Green and Tom Clark) but monographs, by and large, arouse different responses. That’s partly because of their dominance in the humanities and social sciences: because HSS research is so often about the idea, rather than the data, the venue for that idea is venerated – as is the means of expressing it. As the Crossick Report stated: ‘The writing of the long-form publication IS the research process’.

2. Books are under pressure

The problems are hardly new: low sales, declining library budgets, tough distribution, pressure to make publicly-funded work freely available and a changing environment in a platform-led world.

For some disciplines, it's a relevance issue in the fake-news, barriers-first world of Trump and Brexit. If STM creates new drugs and builds planes, HSS needs to explain what it offers. Indeed, as Rupert Gatti of Open Book Publishers so eloquently argued, it's time to re-evaluate the entire publishing model. If opening access to your title results in a 300:1 ratio in favour of the open version (based on data from his presentation), it takes a lot of effort to justify prioritising the single digit. We should be able to communicate in more ways, not fewer.

3. HEFCE monograph policy

Funder attention and OA policies have hitherto focused on journal publishing, because of the desire to kick-start innovation and drive new business models. It's also been driven by academic priorities in the big-money STM areas.

Ben Johnson (HEFCE) explained why HEFCE is interested in OA for all published outputs:

  • it leads to greater efficiency when university finances are stretched
  • it improves quality of research
  • it leads to impact and reach outside big institutions

A diverse system means that people can choose how they communicate. In STM, 98% of REF returns were journal articles. In HSS, by contrast, the monograph dominated.

The REF after next will require OA monographs, and pilots are being put into place for that. In ten years, there will be a significant percentage of OA books. The equivalent REF value isn’t yet given to e-monographs but that will change.

4. We’re going to play nice

The journey to OA for journals was heated and not always constructive. HEFCE hopes to avoid a repeat for monographs (which, given the expected length of the journey, is a blessing), and it’s worth emphasising that the atmosphere in the room was considerably different from the ALPSP OA event in June 2016 which focused predominantly on journals. There was precious little mention here of ‘drive your APCs’ or ‘milk the P&L’. HEFCE set the tone and subsequent speakers reinforced it: all parties should respect that the pace of change will be up for debate.

5. University presses may be the future melting pot for OA

Perhaps the most interesting news was that the initiative for change is less likely to come from the legacy publishers, or even the start-ups, than from the growing cohort of university presses. Often housed within university libraries (and therefore with a strong mandate to champion OA), they are far nimbler than the legacy publishers. Two careful presentations from CUP and Taylor & Francis bore this out: progress is cautious in the global publishing houses, partly because agitation from the author community is not high, and partly because of varying geographical and disciplinary opinions about open research.

In the UPs, by contrast, commissioning can be driven by ‘what’s the story?’ not ‘where’s the money?’. The rationale for editorial excellence is as strong as ever, but removing the pressure of profit margins means OA books can be more eclectic, more interesting, more exciting than ever before. ‘The value to the university is in profile and reputation, not in income’, said Sue White of University of Huddersfield Press. No one is doing this by half-measures, either. As Lara Speicher (UCL Press) noted, authors are watching closely and they’ll quiz publishers over their sales and marketing plans for a title. Having said all of that, the (small) list of OA books published by CUP was notable for its breadth and quality: there’s no indication that OA diminishes the value proposition for readers.

6. Systems really stink

Publishers don’t build systems to give away books for free. OK, so there’s a wisecrack hiding there, but try as you might, it’s really difficult to convince a legacy e-commerce system to offer an article or an entire book with a zero price tag. They simply weren’t built with OA in mind, and rescaffolding sites is one of these things that everyone assumes is easy until they try it. Time and again, this issue emerged as a remarkable stumbling block.

7. Discoverability ain’t great either

Three kinds of metadata are needed to make an OA monograph fully discoverable, and they are non-negotiable, functional essentials:

  • content (eg keywords and BIC codes)
  • digital (eg DOIs, ORCiDs, ISBNs)
  • OA-specific (eg the specific CC licence for both articles and images, embargo period, funders, location of the Version of Record)

Without this, scalability of OA programmes will prove tricky. It doesn’t help if third-party vendors don’t make it clear that a print book is digitally OA, or if elements of the metadata drop out on the book’s journey through the post-publication environment. (I was reminded at this point of a recent Scholarly Kitchen piece by Jill O’Neill, in which she described the convoluted process of tracking down what she called ‘an OA monograph in the wild’.)
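To make the three layers concrete, here is a minimal sketch of the kind of completeness check an aggregator or publisher could run over a monograph record. The field names are purely illustrative assumptions, not a real schema such as ONIX or Crossref's deposit format:

```python
# Hypothetical OA monograph metadata record combining the three layers above.
# All field names are illustrative only; real systems use schemas such as
# ONIX for Books or Crossref metadata, which differ in naming and structure.

REQUIRED_FIELDS = {
    # content metadata
    "keywords", "bic_codes",
    # digital metadata
    "doi", "isbn", "orcids",
    # OA-specific metadata
    "cc_licence", "funders", "version_of_record_url",
}

def missing_metadata(record: dict) -> set:
    """Return the required fields that are absent or empty in a record."""
    return {field for field in REQUIRED_FIELDS if not record.get(field)}

book = {
    "keywords": ["urban energy", "Mozambique"],
    "bic_codes": ["RND"],
    "doi": "10.0000/example-doi",          # placeholder value
    "isbn": "978-0-00-000000-0",           # placeholder value
    "orcids": ["0000-0000-0000-0000"],
    "cc_licence": "CC BY 4.0",
    "funders": ["(funder name)"],
    # "version_of_record_url" deliberately omitted
}

print(missing_metadata(book))  # → {'version_of_record_url'}
```

Even one missing field (here, the location of the Version of Record) is the sort of gap that lets a book drop out of discovery systems downstream.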

Simon Bains, Head of Research Services at the University of Manchester, reinforced this point. Unless metadata is strong, Manchester doesn't give OA books the same priority. In Bains' view, JSTOR discoverability is good; OAPEN and DOAB are poor; HathiTrust and Internet Archive are non-existent. Manchester also prioritises books that appear on reading lists.

As Euan Adie said, ‘metadata is a love-letter to the future’. Without it, OA founders.

8. OA encourages audience-first publishing

Some of the most fascinating presentations came from researchers. Vanesa Castán Broto, Senior Lecturer at UCL, made the forceful point that if academics are not inspired to produce something, they will drag their heels. Castán Broto was adamant that she didn't want to produce something held only by an elite group in the English-speaking global north. Her OA research on Mozambique, published by UCL Press in English and Portuguese, has seen downloads in 152 countries: 'it's a massive incentive for me to publish open and in a language other than English', she said. This motivation, she added, trumped any accusations of OA vanity publishing.

Broto’s conviction raised an important issue. The bigger publishers are ploughing time and money into an OA monograph programme as a business need: they’re packaging it as part of a wider author services offer. By contrast, the authors are taking risks because they are in the business of communicating their research discoveries to the widest possible audience. In one sense, these factors are symbiotic: authors need publications to be widely available, and publishers are in the business of making that happen. But it’s intriguing to see how these two different rationales will converge, given the issues of scalability and sustainability. For the most part, publishing ‘closed’ in the right journals is still more important than publishing ‘open’ in smaller journals.

9. OA enables innovation

Book launches kill budgets, but authors love them. So in a platform-driven world, what's the alternative? Online parties, says Xinyuan Wang of UCL, who reported on a campaign supporting her OA book with a MOOC and YouTube videos translated into multiple languages. Its greatest impact was attracting new and wider audiences to the work. It's an audience-first model that legacy publishers will struggle to match.

The larger point seemed to be that OA publishers, particularly those without legacy models to protect, are potential incubators of innovation. Without a cumbersome legacy model to restrict format or dictate price, they can engage more fully with the long tail of high quality titles. Diversity, said Andrew Lockett of University of Westminster Press, has much greater value once you’re not obsessed with the US library market.

10. Print isn’t going away

Despite everything, physical books still make a difference; ultimately, that's why the transition to OA monographs has taken so much longer than it did for journals. Lots of university presses are offering books as short-run print-on-demand editions (often 100 copies) to cover demand: OA isn't replacing print. This was the funders' message too: academic choice is a big part of the HEFCE approach. Data from Brill and UCL suggests that print sales are not decimated by OA (it's the effect on ebooks that is more notable).

And this is what’s so interesting – the mix keeps us going. When it comes to OA monographs, what do we want? Everything.

Martyn Lawrence is Publishing Manager at the Royal Armouries Museum, with oversight of the books programme at the museum's three sites (Leeds, the Tower of London and Fort Nelson). He is a frequent contributor to international publishing workshops and training events, including seminars for ALPSP and London Book Fair, and he has chaired numerous conference sessions around the world.
@martynlawrence

ALPSP organises a full professional development programme of seminars, training workshops and webinars. See www.alpsp.org for details.


Thursday, 26 January 2017

The Challenges of Outsourcing Part Four: Supplier selection (top tips 6-10)

In this post, Lorraine Ellery Matthews continues to share feedback from leading scholarly publishing professionals focusing on practical advice and further top tips (Tips 6-10) to consider when selecting a supplier to outsource your product or service.

The first post in the series identified 10 key drivers to outsource and the second post outlined Stakeholder Engagement.

Part three identified five top tips for selecting a supplier:

1. What impressions do you have of a supplier?
2. Ensure effective communication
3. Forge positive relationships
4. Engage in early collaboration
5. Look under the hood

In this post, she outlines top tips 6-10:

6. Consider time and resources

"When moving suppliers you need to have built in sufficient post-implementation time to ensure that the quality of the core service is at least as good as it was before the change. The call on time and resources and the resultant loss of momentum should not be underestimated - you want to be able to reassure your customers, and ideally involve them in the ‘live’ test phase." Daniel Smith, semi-retired Publisher and Consultant, previously Head of Academic Publishing at The IET

7. Obtain recommendations and feedback

Requesting references from a supplier should be a given. Once received, ask for feedback from people in different roles within each referee organization, including those who handle the day-to-day communications with the supplier. Ask the supplier's customers how their transition went and what their current experience is.

Obtaining recommendations from a trusted source of your own can also provide more in-depth insight than relying on the supplied references alone.

In conversation about the switch to a new eCommerce supplier, Simon Laurenson, Operations Manager at Bioscientifica, shared some helpful advice:
"It's important to speak to other publishers and get as much advice as you can. With lots of society-owned publishers our size, we have the opportunity to exchange notes."

A word of caution, however: when talking to another publisher or organization, bear in mind that they may have been feeling optimistic at the time, so it's best not to rely on this feedback alone.

All the publishers I spoke to will provide clear feedback to suppliers who ask for it once the RFP/tender process has ended, recognizing the time and effort the supplier has invested and aiding continued improvement. At the end of the selection process, do also ask suppliers for feedback from their perspective. For example: how did your RFP compare to others they have seen, and what other questions could you have asked?


8. Expect the unexpected 



Even with good planning, it's hard to know exactly what to expect back from a supplier.

Ove Kähler, Director Product Management & Global Distribution at Brill, speaks about the proposals he received from suppliers:

"We didn't expect to get responses back that were close to 150 pages long. Also, the diversity of proposals and different aspects of pricing made it difficult to compare them. We tried to prevent this by providing vendors with the Excel version of our requirements. Even though they filled in the sheet and highlighted what was in scope and what wasn’t, it was still a challenge to get a good easy overview of how the proposals compared."


9. Quality assessment

In discussion with Jeremy MacDonald, Director of Technology at Pharmaceutical Press, quality assessment was raised as a key undertaking when developing an app, and he emphasised the importance of selecting a supplier that can get the data right first time:

"There are multiple internal challenges that arise when trying to present data that needs to be clinically correct. People are using this data to make important decisions about other people, including children, so there is a high level of responsibility for you as the publisher to get it right. When developing an app, you are having to create your content set in different media, so you have to ensure quality assessment has been undertaken before presenting the data.

Getting an app to work across different platforms is a challenge with the different versioning of devices, e.g. iOS 5 and iOS 10. Things evolve forward and therefore you need to ensure the app can continue to work with different devices and platforms. Upgrades are open to error and we have to continually test and modify the code base."

Testing partners
It's not always possible to test every version of the iPhone, for example, within your organization, so the publisher found test partners who were able to undertake that testing for them. Everyone in-house at the publisher runs Windows 7, so a SaaS partner was also needed to test their browsers and apps on different desktops.

10. Planning

Finally, if things do NOT work out it can be really painful, so it is important to agree a project plan with your supplier that is prepared for every contingency. A good project plan will allow you to monitor the stages of the project's development, make adjustments where necessary and maintain momentum with your supplier through to implementation and beyond.

Daniel Smith, discussing hosting services, noted: "A properly formulated project plan would allow at least 6-12 months of post-implementation activity to ensure that the service is fit for purpose and, where it is not, there is time and resource to put it right."

It is important to be realistic with your timelines. If you have to change supplier at short notice, you need to be accommodating in terms of what you expect of the new supplier. If you are pushing a tight deadline on them, you need to be aware this may cause longer term problems.

Effective communication and forging good relationships will go a long way to ensuring a successful project and outsource partnership. However, to avoid frustration, don't forget also to plan for the expected, as one publisher suggested: "Everything can stop for Christmas!"


Do you have any further tips or thoughts on this topic that you would like to share?

Join Lorraine Ellery Matthews who will be chairing the Outsourcing Challenges workshop at the forthcoming Research to Reader Conference in London on 20 and 21 February 2017.  The workshop poses the following question to the wider community:

Which aspects of the scholarly communications process can be outsourced, how can risks be mitigated and how can outsourcing be most effectively managed?

Register here for the 2017 Research to Reader Conference.




Friday, 20 January 2017

The Challenges of Outsourcing Part Three: Supplier selection (top tips 1-5)

In this post, part three of the Challenges of Outsourcing, Lorraine Ellery continues to share feedback from leading scholarly publishing professionals focusing on practical advice and tips to consider when selecting a supplier to outsource your product or service.

This follows the first post in the series identifying 10 key drivers to outsource and the second post outlining Stakeholder Engagement.

1. What impressions do you have of a supplier?

You may already have issued an early RFI (request for information) to suppliers, carried out your own research or commissioned a competitor analysis of the supplier landscape. However, to really get a good impression of a supplier, you need to consider how they are positioned and what they can bring to your business. You may find you need to engage in a more formal RFP (request for proposal) or tender process.

The RFP/Tender process should provide a framework that allows you to obtain a clear impression of the supplier; the product and service offering they provide; how they will address and meet your requirements and a clear indication of how they stand out from their competitors. The following key questions need to be answered before reaching an outsourcing agreement with a supplier:
  • Can they understand my business? 
  • Can they meet my requirements? 
  • Can they offer service reliability? 
  • Customer service, can they carry through on their promises?
  • Budget and ongoing costs, are these visible and agreeable? 
  • Have we agreed on critical value objectives?
  • What is their value proposition, is this measurable?
  • What is their track record in servicing similar organizations?
  • How will their location affect me and my organization? (onshore or offshore) 
  • Do they have the ability to communicate effectively? 
  • Will I have insight into their product/service roadmap? 
  • Are there signs of instability? 
  • Will our external auditors accept them?

If you are happy with your current supplier, you may not wish to spend time going out to tender (unless you are obliged to do so) or progressing with a formal RFP process. It is, therefore, advisable to consider the opportunity cost first!

2. Ensure effective communication

It is very important to mitigate misunderstanding and incorrect interpretation, which are usually caused by poor communication.


In my last post I asked, "Is stakeholder engagement key to effective communication?" I would add that it is advisable to agree a clear project scope and selection criteria with the relevant stakeholders before you reach out to suppliers. Whether you are embarking on an informal or a formal tender process, this preparation will take time, but the investment will ensure your objectives, goals and requirements are clear, help you manage expectations, provide a frame of reference and take some of the emotion out of the supplier selection decision.

Clear expectations
Caroline Burley, Journals Operations Manager, Publishing Services & Production at the Royal Society of Chemistry shares her thoughts:

"Suppliers may say “yes we can do that” straight away without taking the time to fully understand what we are asking them to do. To ensure the supplier fully appreciates our requirements we try to make the documentation we provide as clear as possible and work through examples with them, providing feedback so that they can understand exactly what we are asking for.

If everyone is clear on the expectations and in agreement, then you should be able to work with the supplier to agree a realistic timeline to deliver the service. If they are not clear on what you are asking them to do, they may have to do additional last minute work that they were not anticipating which may affect the delivery time. Or if, as the customer, you are pushing for a fast delivery, it may be the case that they can't do all of the preparatory work before you go live.

You may need to be accommodating in terms of what you expect of the new supplier if you have a tight deadline, but be aware this may cause longer term problems once the motivation to finish the preparative work is removed.”

Agree to a SOW (statement of work)
The Head of the Journal program for a large medical society, which completed several transitions to new vendors in 2016, recommends:

"Publishers and the selected vendor should agree in a SOW (and in great detail) what will be delivered and when with compensation or penalties for non-delivery. This is especially important for key workflow processes and functionalities.

Ideally, the consultant and client staff should document or record meetings with vendors during the RFP process and make sure that all parties are in agreement on what was presented and discussed to minimize disagreements and differences in interpretation at agreement and transition stage."


3. Forge positive relationships

"There is a lot to say for a relationship that is a positive one, and where you and the supplier are clear about your expectations both ways. Not just in a contract or agreement, but in terms of the actual interaction with the people you are working with. As the sales cliché suggests 'people buy from people first', there will be relationships that are particular to the organization and not every publisher is looking for the same thing." Director of Publishing, leading UK society publisher

4. Engage in early collaboration

Collaborating with consultants:
Collaborate with consultants to help distil information and facilitate discussion, allowing stakeholders to talk through their frustrations - this feedback can strengthen the value of the final offering.

A consultant can ensure you select and retain the partners that best fit your organizational goals, provide advice and manage all or specific components of the RFP/Tender process, including the following:
  • Early research e.g. Competitor analysis 
  • Project management and advice
  • Support and assist in management of the selection process including creation of the RFP/tender documentation; evaluation of responses received, selection and brokering of agreements
  • Create and manage the implementation of contracts and SLAs 
Matthew Cianfarani, Director of International Business Development at Mark Allen Group, collaborated with a consultant when outsourcing their hosting platform. Matthew suggests:
"Use a consultant early to shape the whole thing – if you don’t have a large IT team who understand academic publishing you need an outside view of what you do, to communicate to potential vendors."

Early involvement of technical experts  
Sometimes you will find that you are dealing mainly with the commercial staff at the supplier's end, and it is not until an agreement is signed that the handover to the technical experts begins.

"Involvement of a technical person earlier in discussion provides continuity." Helen King, Digital Strategy Lead at BMJ

5. Look under the hood



Before buying a car you would look under the hood; before buying a new outfit you would try it on. So why not apply the same common sense when outsourcing?

It can take a commitment of both time and money to try a service before you buy, but it can certainly be worth the investment, as Helen King of BMJ found: "Working with a potential supplier to test their services before you decide to move platform can provide you with a large amount of qualitative information about the service and whether it will be a good fit."

Sign up using your email to receive blog posts in this series; post four will highlight top tips 6-10.

Lorraine will be chairing the Outsourcing Challenges workshop at the forthcoming Research to Reader Conference in London on 20-21 February 2017: Which aspects of the scholarly communications process can be outsourced, how can risks be mitigated and how can outsourcing be most effectively managed?

Register here for the 2017 Research to Reader Conference

Monday, 5 December 2016

The Challenges of Outsourcing Part Two: Stakeholder Engagement

In this post, part two of the Challenges of Outsourcing series, Lorraine Ellery Matthews continues to share feedback from leading scholarly publishing professionals. In her interviews she has asked about the involvement and engagement of stakeholders in the decision making process of outsourcing, what was planned and what was unexpected!

This follows the first post outlining the 10 key drivers to outsource.

Formalizing the process

One organization was lucky enough to have a specialist, qualified procurement team in place reporting to the Director of Operations. The team is responsible for all contracts with a lifetime value of £150,000 and over; this includes everything in terms of production work, including typesetting and printing.

However, outsourcing hadn't always been managed in this way: it started with just one person, albeit someone who came to the organization with a high level of industry experience working with both off-shore and on-shore suppliers. Fast forward several years, and with contracts for all suppliers and best practice in place from start to end, the organization has certainly seen the benefit of having the process formalized, even if there are occasions when it has been challenged internally about the costs involved!

Should it be left to one team to reach decisions?

Many organizations will not have the benefit of a dedicated procurement team and will instead use members of the relevant department or create a temporary cross-departmental project team to manage the process. Decision making varies, and much depends on the complexity of the required outsourced service. Some teams had the relevant skills and resource to reach a decision without having to reach out to others in their organization. Others had clear policies in place to ensure the decision is overseen and agreed by a committee, a management team or both.

Although the final decisions may be made by a small group, input from start to finish was in most cases (although not always) widely sought.

Is stakeholder engagement key to effective communication?

Accountability is, of course, important: despite good intentions and the best-laid plans put in place to ensure that a partnership is successful (supplier relations will be discussed in my next post), there is always a risk that things don't go to plan. It is therefore important to approach internal stakeholders outside your team early and invite them to input their thoughts and provide feedback, as these individuals are then more likely to support your decisions going forward (partners in crime). Obtaining buy-in from others allows you to share not only the project's success but also accountability in the event of any problems down the line, reducing the risk of "if only you had asked me first" comments when it is far too late.

Facilitating discussions to allow stakeholders to talk through their frustrations can strengthen the value of the final offering.

The feedback that derives from engagement also helps you consider whether you have taken all the risks into account, and to stop and ask if you are making the right decisions.



One platform manager took a unique and possibly risky approach to external engagement, which worked in his favour:

"My biggest success was not to have found the right platform vendor but to get the management board on side."

The manager had been approached by a scholarly publishing organization's committee looking for a keynote speaker for a conference themed around "how to manage different content types". The manager had a solution: a vision for his own organization that had developed into a wider vision in parallel with the request for proposal (RFP) process underway at the time.

The keynote presentation took place before the developing vision had been fully approved internally. The manager received positive comments and support for his vision from the industry delegates attending, and these comments also reached the organization's management team directly. The approach may have been risky, but it was not regretted: the internal buy-in was significantly strengthened by the external buy-in generated by the delegates present.

To engage or not with customers?

The interviewees were asked about the level of engagement with their communities and customers during the outsourcing process. The response was mixed and some key factors for consideration emerged:

1. Whether people feel attached to the system or not!

Quality feedback is more likely if people feel an attachment to the system. Attachment is less likely for production systems than it is for, say, peer review or hosting solutions.

For example, editorial board members may have to interact with the system on a regular basis, acting as academic editors as well as overseeing the peer review process. The authors and reviewers will have a lot of experience using various systems too, especially given that there are only a handful of major suppliers in the market.

The same is true of hosting platforms: even though people may not realize they are using them, they often have strong views about how things are presented online.

2. Concerns over alerting customers to change

Some said that although they engage with their customers throughout the year, they would not necessarily inform them before the decision to change suppliers was made, and in some instances not until after implementation. They felt that through ongoing engagement they had already built up a good picture of their customers' requirements over time, and saw no need to alert them and risk causing unnecessary concern.

3. Is the supplier set up to engage with the customer?

The customer is a critical partner, and large publishers are likely to have user groups, focus groups and advisory boards in place to support ongoing engagement throughout the year. A smaller publisher may not yet have these forums in place; as one interviewee recounted, they were able to benefit from the supplier's ability to engage their customers directly for feedback on the system.

4. Engage only with a select group of customers

To ensure quality feedback you may decide to take the middle ground and obtain feedback from a select group of customers you know you can rely on. As one publisher interviewee stated: "if you are brave in the process you may also want to invite one or two customers in the testing of the system."



Supplier engagement

A key aim when considering outsourcing or moving to a new supplier is to establish a long term partnership.

Forge relationships early so that when you are looking to outsource or move supplier you already have trusted relationships in place.

  • Suppliers may develop new technologies, or form strategic partnerships, to provide an improved or new solution that you could benefit from, so it is advisable to keep abreast of their developments.
  • Understand the competitor landscape and what options are available to you, even from those outside of your own industry.
  • Communication is much improved if you match relationships at various levels within both the supplier and your own organization.
  • Feedback will help the supplier develop their solution further and is important at all stages of engagement, including following an RFP whether the supplier is selected as your partner or not.

Sign up with your email to receive blog posts in this series; the next will focus on The RFP Process & Supplier Evaluation.

Lorraine Ellery Matthews will be presenting The Challenges of Outsourcing, sharing further recommendations from leading publishing professionals, on Wednesday 7 December at 2.15 p.m. on Stage 1 at the London Info International exhibition. Attend and join in the discussion – booking available here. Exhibition visitors can register for free.

Tuesday, 27 September 2016

The Changing Role of Society Publishing

Some in our industry have publicly and privately opined that society publishers suffer from low business acumen. A “can’t see the forest for the trees” myopia impedes their competitiveness in a market dominated by deep-pocketed commercial publishers who have the “W” (WIN) gene embedded in their organizational DNA.
The big-revenue commercial and university press publishers get the lion’s share of library budgets, submissions, citations, APCs, and media coverage. A common perception is that they innovate better and faster and make smart, bolt-on acquisitions to strengthen their market-leading positions and to even reshape the market while society publishers increasingly struggle to compete because of declining revenues from member dues and publications and slow-to-decide, risk averse staff and governance structures. Are these perceptions accurate? Is future success for society publishers tied to commercial publisher partnerships and a quest for size and scale?

David Sampson, Vice President and Publisher for Journals at the American Society of Clinical Oncology (ASCO), chaired the penultimate panel at the ALPSP Conference. He believes that culture determines and limits strategy. We need to understand the organizational structure of non-profits: directors have the power, not shareholders. Strategic planning involves the creation of vision and mission statements, initiatives, financials and metrics. Revenue forecasting often forgets that customers are in control of revenues. You need unparalleled customer service. Don't be afraid to kill failing programmes, and don't be afraid to innovate.

A key element of ASCO's culture is to connect internally and externally. They have joint clinical guidelines to help identify cross-disciplinary work, and connect with other associations for events on cancer care. Embracing the disruption of societal change, technology and partnerships is key to the future success of a society. Readers and researchers are becoming increasingly connected with each other; we must connect with them.
Leighton Chipperfield is Director of Publishing and Income Diversification at the Microbiology Society, which publishes six journals with £3.3m annual turnover, combining in-house staff and outsourcing. He noted that commercial publishers filled the gap created by society publishers' failure to adapt to contemporary conditions. He believes being second to market is fine when it comes to technology: why would he risk society income on it? They work with technology partners so they can take advantage of services developed across many publishers.

They love initiatives that can be applied in a cross-organizational way, such as ORCID (Highly Commended in the ALPSP Awards). Things are changing; society publishers are modernising. They tried collecting APCs themselves, but it didn't work, so they partnered with Copyright Clearance Center. Chipperfield believes the power of societies' collective knowledge is huge. Stick to what you are good at. They have some fantastic assets: high-profile expert trustees, journal editorial boards, conferences, and expert members.
Kathleen Fitzpatrick, Associate Executive Director and Director of Scholarly Communication at the Modern Language Association, was inspired to join the panel to debunk the perception that societies are risk averse. Member needs must outweigh business needs, and that tension puts them in an interesting position. In 2013 they launched MLA Commons, a social network for members, allowing conversations to build beyond conferences. It is an open platform with a repository at its core.
Simon Inger closed the session by providing some anonymous society publisher case studies, mapping the journeys of organizations that adopted different strategies. One of the most common mistakes societies make is to stop worrying about content when they partner with commercial publishers. You need to keep a strategic overview and management watch, but these are not always easy. With declining incomes, a society is reluctant to invest in improving its own staffing, which can in turn lead to other issues. He has seen a lot of badly negotiated contracts.


The Changing Role of Society Publishing was the final plenary session at the ALPSP Conference 2016. You can view the video on the ALPSP YouTube channel.

Monday, 26 September 2016

What does academic engagement mean now?

Isabel Thompson, Market Research Analyst at Oxford University Press, chaired the morning plenary on Thursday at the ALPSP conference with a session focusing on the changes in publishers' engagement with academia and researchers. She noted that academics don't care how publishing works, they just want it to work. Researchers are readers, authors, peer reviewers and editorial board members. As a publisher, you have to find the right voice for each one. Without academic engagement there is no publishing.

Dr Philippa Matthews is a Wellcome Trust Research Fellow based at the Nuffield Department of Medicine at the University of Oxford. She is also Honorary Consultant in Clinical Infection at Oxford University Hospitals NHS Trust. She talked through the results of a survey she conducted in advance of the conference. She is particularly interested in engagement with schools and in infographics, and wants to share results and resources. As a researcher, life is complicated, and a simpler publishing process would be preferable. There are significant penalties imposed if her work isn't open access. She outlined a few gripes around the publishing process:
  • we don't accept pre-submission enquiries
  • hard copy signed conflict of interest statements are required before submission - can be a very long-winded process!
  • COI statements need original signatures from all authors... on six continents... at submission!
  • multiple revisions before rejection for incorrect trial format
  • new reviewers introduced after rounds of revision
  • length of time between submission and publication.
Matthews sent a survey to colleagues and received over 100 responses. Results showed that researchers are happy with peer review, but not with the timeline and available support. 47% felt the publication process didn't support innovation or allow creativity. People obsess over the Impact Factor, but it's broken. She closed on a more optimistic note: there is a willingness from all parties to discuss this.
Dr Emma Wilson is Director of Publishing at the Royal Society of Chemistry. She outlined how much effort they put into maintaining a two-way dialogue with their community: attending many scientific conferences, engaging internationally, and building in-house teams in other countries. They support students and early career researchers via poster prizes and emerging investigator issues of journals. They use social media, but mainly for broadcasting information about themselves; however, it is growing in importance through initiatives such as the Twitter-based poster conference.
Dr Sacha Noukhovitch is Executive Director and Editor in Chief at the STEM Fellowship/STEM Fellowship Journal. He feels that with open access, an unexpected, uninvited readership appeared spontaneously: students. A new generation of data-native students is tapping directly into research papers alongside professionals. These students lack the background knowledge, but they use their data skills to understand and interpret the world. If one student finds a paper interesting, others swarm to it, creating a real buzz, and students use academic communities to help them understand complex concepts. They approach parts of the editorial process in a very different way, something publishers need to follow and engage with.


The ALPSP Conference was held at Park Inn Heathrow London on 14-16 September 2016. View the videos of the session on the ALPSP YouTube channel.

Wednesday, 14 September 2016

Plenary 1: The Conversation: Research and Scholarly Publishing in the Age of Big Data

Ziyad Marar is Global Publishing Director at SAGE Publishing. Chairing the first plenary session of the ALPSP conference, he engaged his colleague Ian Mulvany, Head of Product Innovation, and Fran Bennett, CEO and co-founder of a big data company Mastodon C in a conversation about publishing in the age of big data.

Is big data hype and nonsense, just an exciting term that lets an agency sell its services? Fran Bennett believes some fundamental things have changed that make it much more than that. It can help companies open up new insights, generate additional income and lower barriers to entry for technology. As the technology improves it can support different applications. There is more data, and cheaper processing.


Mastodon C are working with the UK Government department responsible for animals and farming, collecting data on dead livestock. The department doesn't have enough staff, so patterns sometimes get missed; computers are used to identify threads for post-mortem analysis. They can take messy structured data and sort it out so that expert humans can use their time more effectively and in a targeted way.

Ian Mulvany thinks high-quality content is what we do as an industry, but it is all digitally mediated. All publishing organizations need to be technologically competent. We're in a mixed world where software solutions are beginning to be commodified, but the variety of services around them still lives in a handwritten world: a dilemma he finds endlessly fascinating.

Corporate applications of big data can transfer to publishing in market projections, customer retention, internal SWOT analysis and hiring. Mulvany asked how many publishers had tried to re-analyse their entire corpus using big data techniques; not many hands went up, so there are lots of opportunities here. Bennett observed that a good data scientist is a statistician who can code and understand the context of their data, and warned against tracking things purely because you can: the risk is you create 'data exhaust' that you can't do anything with.

Mulvany noted that some fields have long worked with big data and have good standards and procedures to deal with it. He is particularly interested in working with researchers that have realised they have a whole load of data and don't know what to do with it. There is a 'data under the desk' problem. Data is collected sporadically, is not necessarily kept well, and isn't large scale.

Caution was called for by delegates in the audience and on Twitter when using algorithms for peer review: it can and will be exploited by researchers. The panellists all agreed that machines can do the dirty work for us, but not all the work.

Marar outlined the work of the Berkeley sociologist, Nick Adams, who is using crowdsourcing and algorithms to look at reports on the Occupy movements in nine cities. Analysis that would normally have taken 15 years has actually taken one year, and is finding interesting patterns. He also cited the work of Gary King, a Harvard social scientist who is developing and applying empirical methods in many areas of social science research, focusing on innovations that span statistical theory to practical application.

Social researchers are coming more slowly to big data analysis, but are doing some unusual work with it. SAGE Publishing has conducted a massive survey into the area of data and social science with over 13,000 responses. It's something they are focusing on as a priority.

An interesting side issue when looking at social data is that its quality is sometimes not what it might be, with the potential to lead to data protection breaches on a grand scale. There are differences between ethical and legal behaviour concerning datasets. It may be cheap to capture and hold data, but expensive to extract, clean and deliver it.

Mulvany closed with the observation that there are researcher needs and potential development tools, but why should the industry care? Because at our heart we are about democratising knowledge and finding the right solutions and people around that knowledge. If we look purely at that purpose, we will see how to make it happen. The tools are becoming cheaper to experiment and innovate with, so we should do so.

Ziyad Marar is Global Publishing Director at SAGE Publishing where Ian Mulvany is Head of Product Innovation. Fran Bennett is CEO and Co-Founder of Mastodon C. They took part in a panel discussion at the ALPSP Conference 2016.

Thursday, 7 July 2016

10 ways to do database marketing badly (and how to avoid them)

There's nothing quite like a summer birthday, is there? ALPSP member DataSalon are celebrating 10 years of helping publishers with the challenges of data quality and customer insight.

We spoke to their Managing Director, Nick Andrews, who shared a little bit of wisdom gleaned from all those years' experience.

"We've learnt a lot over the years about the wonderful world of database marketing, and how things can sometimes go a little wrong if the right tools and processes aren't in place. You'd be amazed at what gets through internal quality checks: some of it embarrassing, some of it downright cringeworthy.

As we reflect on ten years helping publishers avoid making mistakes, here are 10 ways to do database marketing badly (and how to avoid them)...

1. Call your customer "Ms Ass"

Or "Ms Ass Librarian" to be precise. Yes, this really happened. Somehow the job title of "Ass Librarian" ended up in a customer's first/last name fields, leading to a very unfortunate address label. Some basic checking and clean-up could have avoided this particular mistake.

Yes. Really.


2. Get their name (and gender) wrong

Unfortunately, overly vigorous data cleansing can also be a problem in its own right. Our Communications Director Jillian (female) regularly receives post addressed to "Julian" (male), presumably due to a software rule deciding that her real name must be a typo, and unhelpfully "correcting" it. Moral of the story: do clean your data, but try not to make it worse.

3. Try to sell something they've already bought 

With the complex world of package and consortia deals, this probably happens to unfortunate sales staff way more than it should. You send prospects a tempting deal... only to discover they've already bought the product in question. Properly getting to grips with your sales data isn't always easy, but it is the only sure way to avoid this type of embarrassment.

4. Try to sell something they've absolutely no interest in

Another awkward sales scenario: alienating your (potential) customers by trying to sell them products which don't match their interests. The "hey, let's just include everyone!" mailshot is a great way to do this. And the "hey, let's get our data together and do some proper segmentation!" project is a great way to avoid it.

5. Don't respect opt-outs

Ah yes. There is perhaps no greater way to turn a potential customer into an angry ball of rage, than to keep marketing to them after they've opted out. Companies don't do this intentionally of course, but plenty do it by mistake - often when opt-out requests aren't properly consolidated across different customer databases behind the scenes.
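As a rough sketch of that consolidation step (the system names, field handling and addresses here are all hypothetical), the fix is essentially a union of normalised opt-out lists applied before every send:

```python
def normalise(email):
    """Canonicalise an address so 'Jo@Example.com ' and 'jo@example.com' match."""
    return email.strip().lower()

def consolidated_optouts(*sources):
    """Union of opt-out lists pulled from every customer database."""
    return {normalise(e) for source in sources for e in source}

def safe_send_list(recipients, optouts):
    """Drop anyone who has opted out in *any* system."""
    return [e for e in recipients if normalise(e) not in optouts]

# Hypothetical example: opt-outs recorded separately in a CRM and a web shop.
crm_optouts = ["Jo@example.com"]
shop_optouts = ["sam@example.org "]
optouts = consolidated_optouts(crm_optouts, shop_optouts)
print(safe_send_list(["jo@example.com", "kim@example.net"], optouts))
# → ['kim@example.net']
```

The point is that the suppression check runs against the merged set, so an opt-out captured in one system protects the contact everywhere.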

6. Don't communicate with opt-INs

Not respecting opt-outs definitely annoys customers, but so does neglecting to communicate with customers who are interested. If John Smith has taken the trouble to tick the box and opt in to your news and offers, you’d better send him some. Asking customers to opt in sets the expectation you'll have something useful and interesting to send their way.

7. Send far too much email

Many people are perfectly happy to receive relevant promotional messages from time to time, but nobody wants to feel bombarded on a daily basis. This can often happen if different departments or divisions are all marketing to the same pool of contacts, without coordinating their efforts to keep it to a reasonable level. A company-wide comms strategy should help solve that.

8. Get your facts wrong

It can make for a really compelling message to merge customer-specific details into your marketing emails, for example: "Your recent high/low usage of product X suggests you're really loving/hating it!!" But of course that's only impressive if the key facts are correct (and it makes a bad impression if they're not). Be sure of the quality and accuracy of your underlying data before trying this type of campaign.

9. Send marketing to the deceased

At its worst this mistake can be very upsetting for relatives of the deceased. There are services like Mortascreen out there to help remove deceased contacts up-front. But even without that level of checking in place, the most important thing is to make absolutely sure that any notice that a contact has died (often sent via email to customer services by a relative) is acted on promptly to ensure no further marketing is sent ever again.

10. Assume everybody has one unique email address

It's easier for databases to assume that one email address equals one person, but in reality many of us will have multiple emails (for home, work, etc.) and some share a single email address ('family_robinson...' etc.) It can be annoying for customers to receive the same message more than once, so it's good practice to get to grips with multiple emails and organize your comms accordingly.
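A minimal sketch of what "getting to grips with multiple emails" can mean in practice, assuming hypothetical contact records keyed by a customer ID: plan one send per person and one per shared inbox, rather than one per address row.

```python
def plan_sends(contacts):
    """contacts: list of (customer_id, email) rows, possibly many per person.
    Returns a minimal send list: each person and each inbox contacted once."""
    seen_people, seen_emails, sends = set(), set(), []
    for customer_id, email in contacts:
        email = email.strip().lower()
        if customer_id in seen_people or email in seen_emails:
            continue  # already reaching this person, or this shared inbox
        seen_people.add(customer_id)
        seen_emails.add(email)
        sends.append((customer_id, email))
    return sends

contacts = [
    ("c1", "jo@work.example"),
    ("c1", "jo@home.example"),     # same person, second address
    ("c2", "family@example.org"),
    ("c3", "family@example.org"),  # shared household address
]
print(plan_sends(contacts))
# → [('c1', 'jo@work.example'), ('c2', 'family@example.org')]
```

In a real database this would be driven by whatever person-level key the single customer view provides, but the principle is the same: deduplicate on the person and on the address, not just on the row.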


But let's not feel too disheartened - it's true that database marketing can go wrong, but getting it right isn't rocket science. It's just a question of giving proper attention to data quality, establishing some form of single customer view, and ensuring you have a clear company-wide comms strategy. With those pieces in place, database marketing can be hugely effective.

Now, you'll have to excuse me. I have some cake to eat. Happy birthday DataSalon!"

Nick Andrews is MD of DataSalon who celebrate their 10th birthday this summer. Watch this video to find out more about them.

Will Russell asks where could new ideas come from?

Will Russell, Business Relationship Manager for Technology at the Royal Society of Chemistry, writes:

"From problem solving to planning business transformation, the human capability of creativity will become even more valuable in a world of exponential change – but how can we maximise our own creativity?

Have you ever been in a brainstorm and seen the same ideas coming up? 

What if things could be different, and using simple techniques you could unlock truly novel ideas with fewer people in less time? And not just unlock new ideas, but inspire individuals to take ownership and carry them forward through validation to development.

I believe anyone can be creative and innovative, and there are tools and frameworks to increase your chances for success.  Successful creativity is more than just a great idea. It’s making a great idea successful.

There are several factors that can help you shape your creative thinking and planning. Ideation can ensure you are solving the real underlying challenge and cut through the clutter of ready-made solutions already in your mind. Validation can ensure that what you are producing is actually a fit for the market. Iteration will enable you to revise your products based on user feedback; this is even more important in a world where we need to be developing solutions to tomorrow's problems. On top of all of these, there are lessons to be applied from industries that have been disrupted, and from those that have disrupted.

There are many techniques that David Smith and I will talk about on our upcoming ALPSP course. We are keen that delegates feel enabled, with a toolkit to empower future opportunities – one of which is the five day sprint – enabling them to make business decisions in a short timescale.

A challenge we face today is that, with shorter product lifetimes, we need to predict what challenges our customers will face in the future that our products will need to solve.

I first met David Smith co-tutoring on the ALPSP web 2.0 course (taking over from Leigh Dodds). That course, although relevant in the early days of the social web, ran its course until the social web became standard.  As recently highlighted by Emma Watkins in her excellent ALPSP blog on leveraging social media, it's 10 years since the social web really started to change the digital landscape, and it's hard now to imagine a time without it. So what might the next real disruption on that scale be?  Futurist Gerd Leonhard has produced an excellent video on Digital Transformation.

I've had several different roles whilst working at the Royal Society of Chemistry, working in Technology, Publishing and Innovation, and I have recently returned to Technology.  The change in roles has enabled me to build up a varied experience that I am excited to share with David on the course, from ideation through to validation and moving to development."


Will Russell is co-tutor on the new Disruption, Innovation and Creativity training course alongside David Smith from The IET. Further details and booking on the ALPSP website.

Read David's post on Successful organizations and the creative process.

Friday, 1 July 2016

In a turbulent world, this is why I love the #alpspawards

Winners of the 2015 ALPSP Awards for Innovation in Publishing

It's that time of year again. We gather together a panel of experts in a dark room in the bowels of a building, and don't let them out until they have considered, debated, and scored some of the best innovations in the scholarly publishing world.

I love this moment; the point at which we announce the shortlist. While there's disappointment for those who didn't make it (and trust me, it was a close run thing, the standard was high) the excitement and anticipation of who might win ratchets up a level.

For those on the shortlist, the work has only just begun. A face-to-face presentation with the judges awaits. With 15 minutes each to wow, amaze and convince, they'll be preparing and perfecting their pitches. And then there are the lightning sessions at the Conference. (What do you mean you haven't booked yet? Never mind, here's the link.)

Perhaps the best part is the public debate the shortlist creates. Go on, admit it, you've got your favourite. That's OK. Some whooping and cheering from the sidelines is what the shortlisters need. And there really is something for everyone. The range, scale and quality is quite breathtaking. The full shortlist is below. Take a look. Pick your favourite. Set up an office sweepstake.

The world is a challenging place right now. I personally take great comfort in the dedication and hard work of colleagues in scholarly communications. They are striving to improve tools for - and access to - research for a global community of researchers and beyond.

And have a care for our poor judges, locked away, deliberating. They won't have an easy decision. It'll be one hell of a ride. We hope you'll join us for it.

Follow #alpspawards and #alpsp16 for updates. The shortlisted entries for the 2016 ALPSP Awards for Innovation in Publishing are:

An Adventure in Statistics: The Reality Enigma from SAGE Publishing

Traditional methods of teaching and learning are in flux, partly because attention in the digital age is a scarce resource and engagement is ever harder to create. With the scholarly community demanding more, the nature of the transaction between material and student has changed. Coupled with a drive in academia to bridge the UK's quantitative skills gap, this has prompted a shake-up both in teaching and in the focus on research methods. From this, the concept of the latest Andy Field textbook was born: teaching students statistics through a science fiction love story with graphic illustrations. The project rethinks the way knowledge can be disseminated, embedding theoretical approaches into a narrative to engage the mind of the reader. In a medium not explored within teaching before, love-story science fiction, SAGE and Andy have taken a creative approach to better understand the needs of students and engage them in teaching and learning.

Cartoon Abstracts from Taylor & Francis

Cartoon Abstracts are a fun new way of visualising academic research. They act as a marketing tool, and are making a big impact on social media as well as having other applications. Each individual cartoon abstract summarises the original authors' work through illustration, harnessing the overwhelming power of images over text. Illustrations can aid the understanding of difficult concepts, or broaden the appeal of niche topics. They can also help transcend language barriers, where that is an issue. Authors enjoy being included as characters, and this encourages them to share their cartoon via their networks – increasing communications reach. The author characters also enhance engagement with the audience.

The Crossref Metadata API

The Crossref Metadata API lets anyone search, filter, facet and sample Crossref metadata for over 80 million content items with unique Digital Object Identifiers (DOIs). It's free to use, the code is publicly available and end-users can do whatever they want with the data. Exposing authoritative cross-publisher metadata to the community in this way makes it more accessible, functional and much simpler to integrate with third-party systems and services (on both the publisher and the end-user side). It provides smoother workflows and increased discoverability using existing publisher processes.
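As a sketch of how light that integration can be, here is a minimal Python client for the public `api.crossref.org/works` endpoint. The endpoint and the `query`/`rows` parameters are part of the real API; the helper names and the exact fields extracted are illustrative choices.

```python
import json
import urllib.parse
import urllib.request

API_BASE = "https://api.crossref.org/works"  # public Crossref REST endpoint

def build_works_url(query, rows=5):
    """Build a /works query URL; 'rows' caps the number of results returned."""
    return API_BASE + "?" + urllib.parse.urlencode({"query": query, "rows": rows})

def parse_items(payload):
    """Pull (DOI, first title) pairs out of a /works JSON response."""
    items = payload.get("message", {}).get("items", [])
    return [(item.get("DOI"), (item.get("title") or ["(untitled)"])[0])
            for item in items]

def fetch_works(query, rows=5):
    """Run a live query against the API (requires network access)."""
    with urllib.request.urlopen(build_works_url(query, rows)) as resp:
        return parse_items(json.load(resp))
```

Calling `fetch_works("open access monographs", rows=3)` would return a handful of (DOI, title) pairs; since the metadata is free to reuse, the results can be fed straight into a third-party system.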

Knowledge Unlatched

Knowledge Unlatched (KU) is a quiet innovation with revolutionary potential for not only changing the way the publishing costs of scholarly output are financed but also radically bringing down costs to those who fund it. The KU model is the only one that takes into account the global nature of scholarship and the globalisation of publishing. Because it mirrors these two worlds that are inextricably interwoven it avoids many issues associated with other programmes that serve national or institutional priorities. The service has found a way of making the publishing of specialist long-form content sustainable in a world where monographs, especially, are under severe pressure.

ORCID

ORCID's vision is a world in which all who contribute to research, scholarship, and innovation are uniquely identified and connected with their contributions and affiliations across disciplines, borders, and time. We maintain an open Registry where individuals may obtain a unique and persistent identifier (an iD) - a lifelong digital name they control - and services for the community to collect and connect these iDs in research workflows. Individuals may use their iD through their entire career, to ensure that they are reliably connected with their contributions and affiliations, even if they change their name, organization, discipline, or country.

Wiley ChemPlanner

The global pharmaceutical industry continually develops new drugs to cure or improve the treatment of disease. The drug creation process is extremely challenging; it takes an average of 12 years and billions of dollars of investment for one new drug to make it all the way from the lab bench to approval and into the clinic. Wiley ChemPlanner combines state-of-the-art cheminformatics technology with high-quality data to speed up the early stages of the drug creation process, saving pharmaceutical corporations millions of dollars and getting drugs to patients faster. ChemPlanner lowers the barrier for synthesizing new molecules, thus accelerating the discovery process and allowing the exploration of an expanded region of chemical space. ChemPlanner also enables chemists to optimize synthetic routes, eliminating potentially harmful contaminating side products and reducing manufacturing costs.


Suzanne Kavanagh is Director of Marketing & Membership Services at ALPSP.


Wednesday, 29 June 2016

Can you hear me now? Ongoing conversations with the “researcher of the future”

Lettie Conrad, Executive Program Manager for Discovery & Access at SAGE Publishing reflects on the recent early careers researcher seminar.

"A lively full-day ALPSP seminar in London last month featured a most productive knowledge exchange among early-career researchers, publishers, librarians, and other experts in scholarly communication. Our focus was to raise awareness among information providers about the experiences and needs of today’s researcher – and we gathered a packed roomful of engaged and eager participants to hear from a panel of doctoral researchers and students.

We heard about their frustrations with peer review, their thoughts about open access, and the ways in which faculty play a starring role in shaping their publication and career decisions. We then heard about how librarians and publishers are working to integrate an understanding of the researcher experience (RX) into their innovative solutions and programs.

But, Dear Reader, we managed to achieve something else that we hadn’t expected. The researchers came away with their own lessons and insights into the realities of today’s information provider! 

What a bright light to see such excitement from scholars at being asked for their input and realizing the ways in which we symbiotically need one another along the supply chain of academic publishing and research! What a refreshingly collaborative and solutions-oriented response to such a stimulating event!

These insights punctuate the importance of publishers and libraries being vocal, eloquent and proactive in communicating our value within the research workflow and the broader scholarly enterprise, in everything we do, great and small. Let this serve as a call for each of us to engage routinely with those academics who want to maintain an open dialogue about scholarly communications.

And this type of discussion and collaboration represents a growing trend within the scholarly communication community: joint research efforts, events geared toward education and open conversation, user-centered design projects, and longitudinal studies. In part, these efforts answer the call for greater cooperation across the academic supply chain and greater sensitivity to the user experience.

This ALPSP seminar gives me hope that a collaborative movement is well underway and includes a deeper understanding of the experiences of librarians and publishers too."


Lettie Conrad chaired the seminar Are you ready for the Researcher of the Future? Understanding the researcher experience in London last month. You can follow her on Twitter via @lyconrad.

Successful organizations and the creative process

David Smith, The IET's Head of Product Solutions, writes:

"I cut my teeth in this business, under the original scholarly start-up environment of the legendary Vitek Tracz and his various ‘crazy ideas’ (that he generally managed to sell to the traditional publishers and thus make his return). Late 90s and early 2Ks. It was a wild ride.

Looking back over 15+ years, it’s fascinating to see what has changed and what hasn’t. In our world, we’ve ridden the wave. We digitised our back catalogues, the subscription business model still works well, and the OA charging model is humming along nicely. We are not the Newspapers, or the Recording Artists, or the Bookstores and the Record Shops, the High Streets and the Main Streets.

Yet we have challenges: the ennui that accompanies the knowledge that our money makers are all very mature things indeed. The knowledge that, despite the above, the networked world has not been kind to other mature businesses. The people who pay for our services are not the people who use them day to day. We don’t have the luxury of a signal from the user that can be measured by credit card transactions. It’s very hard to connect a piece of new functionality to an increase in ROI. And a new product? Well, it’s probably fighting for existing money, from another product somewhere. New markets, proper new markets, are hard things to reach in our world, and they have fundamentally different environmental parameters.

And the way we are set up as organizations can also be challenging. Mature, successful, long-lasting organizations (many of which measure their existence in centuries!) have survived by optimising themselves to do what they do, day to day, very effectively.

The new new thing can be, and often is, an existential challenge. Will and I experienced that cognitive dissonance many times with attendees of our (ever evolving) Web 2.0 course.

Like Will, I also help my organization work out what things to focus on and how best to deliver them. And I ‘have people’ who then get to work on the engineering needed for the products to come to life. I’m increasingly fascinated by the processes that successful product shippers use: iteration; rigorous analytics; unity of purpose; cross-functional team building; horizon scanning; rapid delivery; and more.

Because one thing is true: the successful organizations, the ones that ‘disrupt’ the old guard, are the ones that have figured out an end-to-end creative process that enables them to outflank their competition.

We will be using the Twitter hashtag #alpspcreate to share interesting links in the run-up to the course and afterwards – please do join the conversation."

David Smith is co-tutor on the new Disruption, Innovation and Creativity training course alongside Will Russell from the Royal Society of Chemistry. Further details and booking on the ALPSP website.

Read Will Russell's post in which he asks where new ideas could come from.

Tuesday, 3 May 2016

Democratizing eBook Publishing: The rise and rise of e-publishing through the cloud

Copyright: RedKoala

We spoke to Sabine Guerry, founder of 123Library, about the rise of e-publishing through the cloud, and why publishers should consider this approach.

For those that are approaching this topic for the first time, can you explain what e-publishing through the cloud is about?

Cloud-based systems, or Software as a Service (SaaS) as they are also known, are a way of combining proprietary data with shared software and storage. For publishers looking for solutions to deliver their content to their customers, they provide access to hardware, software and maintenance on a licensed basis, without the need to invest in setting up and managing an in-house system.

As eBook sales have gradually replaced print sales, aggregators have proliferated, offering various distribution models. This has often left small and medium-sized specialist publishers overlooked – barely visible on aggregator platforms carrying half a million titles. Cloud publishing is changing that: a broader range of delivery options, together with control over sales, effectively democratises e-publishing. In its simplest terms, it harnesses the potential of off-site data management service providers to open up possibilities requiring minimal upfront capital expenditure.

What does this mean for a publisher’s output?

It provides another means for publishers to deliver their eBooks and can open new sales channels by allowing them to build their own delivery website without a huge upfront investment. Cloud publishing lets you plug into existing, tried-and-tested systems that offer the latest functionality for end users. By using a cloud-based service, you can more easily offer direct access to your content rather than being solely reliant on aggregators. It puts control of your content distribution back into your hands. For academic publishers, cloud publishing platforms can cater for eBook delivery to both individual users and institutions, including the most demanding academic institutions that require an array of technical tools along with the content.

What other features can it provide?

Some customisation is usually available in cloud-based systems, meaning you can change and adapt the platform for your list and your market in a timely, responsive way. Cloud-based systems also tend to offer cross-device capability and enhanced search and research tools that improve the user experience. Areas such as the provision of an online eReader, soft and hard DRM security, bibliographic reference integration, management tools, compatibility with mobile devices, cataloguing, COUNTER usage statistics, content management and collection creation, search tools, integration with library management software, and the creation of transactions and business models will all be handled by the system.

How does it usually work if you decide to work with a cloud based solution?

Cloud publishing starts with a set of tools for linking easy-to-use software applications to your website – an API (application programming interface). The API allows publishers to create a bespoke, standalone content delivery website, but it can also be used to power an existing one. The content can be eBooks, but also e-chapters, as long as they can be identified properly.
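As a rough sketch of how a publisher's site might drive such an API, here is a minimal Python example. The base URL, endpoint, parameter names and JSON response shape are all invented for illustration – they do not describe 123Library's or any other provider's actual API.

```python
import json
from urllib.parse import urlencode

def build_content_url(base_url, isbn, chapter_id=None, fmt="epub"):
    """Build a request URL for a single eBook or e-chapter.

    E-chapters are addressable only if they carry their own
    identifier, as noted above.
    """
    params = {"isbn": isbn, "format": fmt}
    if chapter_id is not None:
        params["chapter"] = chapter_id
    return f"{base_url}/content?{urlencode(params)}"

# A publisher's website would call the API and render the JSON it
# returns; here we simply parse a sample response of the shape such
# an API might plausibly use.
sample = json.loads(
    '{"isbn": "9780000000000", "title": "Example Monograph", "drm": "soft"}'
)

url = build_content_url(
    "https://api.example-cloud.org/v1", "9780000000000", chapter_id="ch03"
)
print(url)
print(sample["title"])
```

The point of the wrapper is that the publisher's own site stays a thin front end: the cloud platform handles delivery, DRM and usage statistics behind the API.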

Why would you recommend users consider this approach?

Cloud services work particularly well for smaller organizations. They don’t require a team of in-house developers working on bespoke software: they are an ‘off the shelf’ tool with a simple link to the publisher’s website and easier maintenance. The cloud company has already undertaken the expense and risk of developing the software, which is then ‘shared’ amongst its customers, together with technical maintenance. Crucially, it allows you to punch above your weight and provide, at minimum cost, direct eBook services equal to – if not better than – those of larger publishers, opening up crucial new sales channels and opportunities for the future.


Sabine is Director and Founder of 123Library, an eBook B2B delivery tool for publishers. She is an entrepreneur who specializes in developing IT services for the publishing industry. 123Library’s CloudPublish™ platform provides a range of business models and management tools for both end-users and librarians, and complies with academic institutions' technical requirements. www.123library.org.

Monday, 18 May 2015

High Value Content: Big Data Meets Mega Text

ALPSP recently updated the Text and Data Mining Member Briefing (member login required). As part of the update, Roy Kaufman, Managing Director of New Ventures at Copyright Clearance Center, provided an overview of the potential of TDM, outlined below.

"Big data may be making headlines, but numbers don’t always tell the whole story. Experts estimate that at least 80 percent of all data in any organization—not to mention in the World Wide Web at large—is what’s known as unstructured data. Examples include email, blogs, journals, Power Point presentations, and social media, all of which are primarily made up of text. It’s no surprise, then, that data mining, the computerized process of identifying relationships in huge sets of numbers to uncover new information, is rapidly morphing into text and data mining (TDM), which is creating novel uses for old-fashioned content and bringing new value to it. Why? Text-based resources like news feeds or scientific journals provide crucial information that can guide predictions about whether the stock market will rise or fall, can gauge consumers’ feelings about a particular product or company, or can uncover connections between various protein interactions that lead to the development of a new drug.

For example, a 2010 study at Indiana University in Bloomington found a correlation between the overall mood of the 500 million tweets released on a given day and the trending of the Dow Jones Industrial Average. Specifically, measurements of the collective public mood derived from millions of tweets predicted the rise and fall of the Dow Jones Industrial Average up to a week in advance with an accuracy approaching 90 percent, according to study author Johan Bollen, Ph.D., an associate professor in the School of Informatics and Computing. At the time, Dr. Bollen predicted with uncanny accuracy where TDM was going: from the imprecise, quirky world of Facebook and Twitter to high-value content. He said, "We are hopeful to find equal or better improvements for more sophisticated market models that may in fact include other information derived from news sources and a variety of relevant economic indicators."
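The core of a Bollen-style analysis is a lagged correlation: score each day's collective mood, then ask how well it tracks the market's movement a few days later. The toy sketch below shows only that mechanic – the mood and index numbers are invented, whereas the original study derived its mood series from millions of tweets and used far more sophisticated models.

```python
def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

mood = [0.2, 0.5, 0.1, 0.7, 0.4, 0.9, 0.3, 0.6]  # invented daily mood scores
dow  = [0.1, 0.3, 0.6, 0.2, 0.8, 0.5, 1.0, 0.4]  # invented daily index moves

# Shift the market series back by `lag` days and correlate: a value
# near 1 suggests mood "predicts" the market `lag` days ahead.
lag = 1
r = pearson(mood[:-lag], dow[lag:])
print(round(r, 3))
```

In this contrived data the next-day index move tracks today's mood exactly, so the lag-1 correlation comes out at 1.0; real mood and market series are, of course, far noisier.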

In other words, structured data alone is not enough, nor is text mined from the wilds of social media. Wall Street and marketers, eager to predict the right moment to hit buy or sell or to launch an ad campaign, have already moved from mining Facebook and Twitter to licensing high-value content, such as raw newsfeeds from Thomson Reuters and the Associated Press, as well as scientific journal articles reformatted in machine-readable XML. In fact, a 2014 study by Seth Grimes of Alta Plana concludes that the text mining market already exceeds 2 billion dollars per year, with a CAGR of at least 25%.

Far from being irrelevant in our digital age, high-value content is about to have its moment, and not just to improve the odds in the financial world or help marketers sell soap. It represents a new revenue stream for publishers and their thousands of scientific journals as well. For example, in 2003, immunologist Marc Weeber and his associates used text mining tools to search for scientific papers on thalidomide and then targeted those papers that contained concepts related to immunology. They ultimately discovered three possible new uses for the banned drug. “Type in thalidomide and you get between 2,000 and 3,000 hits. Type in disease and you get 40,000 hits,” writes Weeber in his report in the Journal of the American Medical Informatics Association. “With automated text mining tools, we only had to read 100-200 abstracts and 20 or 30 full papers to create viable hypotheses that others could follow up on, saving countless steps and years of research.”
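The filtering step Weeber describes can be sketched very simply: of all abstracts mentioning the drug, keep only those that also contain immunology-related concepts. The snippet below is a deliberately naive keyword version with invented abstracts; real biomedical text mining maps text to controlled vocabularies (such as MeSH concepts) rather than matching raw tokens.

```python
# Hypothetical concept list and abstracts, for illustration only.
IMMUNOLOGY_TERMS = {"immune", "cytokine", "tnf-alpha", "t-cell", "interleukin"}

abstracts = [
    "Thalidomide inhibits TNF-alpha production in immune cells.",
    "Thalidomide was withdrawn after teratogenic effects were observed.",
    "Thalidomide modulates interleukin signalling in T-cell populations.",
]

def mentions_concept(text, terms):
    """True if any concept term appears as a token in the text."""
    tokens = {t.strip(".,;()").lower() for t in text.split()}
    return any(term in tokens for term in terms)

# Narrow thousands of drug hits down to the immunology-relevant few.
hits = [a for a in abstracts
        if "thalidomide" in a.lower()
        and mentions_concept(a, IMMUNOLOGY_TERMS)]
print(len(hits))
```

Here two of the three abstracts survive the concept filter – the same winnowing that let Weeber's team read a few hundred abstracts instead of thousands.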

The potential of computer-generated, text-driven insight is only increasing. In his 2014 TedX Talk, Charles Stryker, CEO of the Venture Development Center, points out that the average oncologist, after scouring journals the usual way, reading them one by one, might be able to keep track of six or eight similar cancer cases at a time, recalling details that might help him or her go back, re-read one or two of those articles, and determine the best course of care for a patient with an intractable cancer. The data banks of the two major cancer institutes, on the other hand, hold searchable records of cancer cases that can be reviewed in conjunction with the 3 billion DNA base pairs and 20,000 genes contained within each. Using that data would mean a vast improvement in the odds of finding clues to help treat a tricky case or target the best clinical trial for someone with a rare disease. This information might otherwise be difficult, if not impossible, for even the most plugged-in oncologist to find, let alone read, spot patterns in, or retain.

Think, then, of the possibilities of improving healthcare outcomes if the best biomedical research were aggregated in just a few, easily accessible repositories. That’s about to happen. My employer, Copyright Clearance Center (CCC), is coming to market with a new service designed to make it easier to mine high-value journal content. Scientific, technical and medical publishers are opting into the program, and CCC will aggregate and license content to users in XML for text mining. Although the service has not yet fully launched, CCC already has publishers representing thousands of journals and millions of articles participating.

Consider the difficulties of researchers, doctors, or pharmaceutical companies wishing to use text mining to see if cancer patients on a certain diabetes drug might have a better outcome than patients not on the drug. They must go to each publisher, negotiate a price for the rights, get a feed of the journals, and convert that feed into a single useable format. If the top 20 companies did this with the top 20 publishers, it would take 400 agreements, 400 feeds, and 400 XML conversions. The effort would be overwhelming.

Instead, envision a world where users can avail themselves of an aggregate of all relevant journals in their field of interest. Instead of 400 agreements and feeds to navigate and instead of 400 documents to convert to XML, there would be maybe 40 agreements: 20 between the publishers and CCC and 20 with users. There would be no need for customers to convert the text. In other words, researchers could get their hands on the high-value information they need to move research and healthcare forward, in less time, with less effort. And that’s only the beginning. As Stryker said about the promise of TDM, “We are in the first inning of a nine-inning game. It’s all coming together at this moment in time.”

ALPSP Members can login to the website to view the Briefing here.

Roy Kaufman is Managing Director of New Ventures at the Copyright Clearance Center. He is responsible for expanding service capabilities as CCC moves into new markets and services. Prior to CCC, Kaufman served as Legal Director, Wiley-Blackwell, John Wiley and Sons, Inc. He is a member of the Bar of the State of New York and a member of, among other things, the Copyright Committee of the International Association of Scientific Technical and Medical Publishers and the UK's Gold Open Access Infrastructure Program. He formerly chaired the legal working group of CrossRef, which he helped to form, and also worked on the launch of ORCID. He has lectured extensively on the subjects of copyright, licensing, new media, artists' rights, and art law. Roy is Editor-in-Chief of ‘Art Law Handbook: From Antiquities to the Internet’ and author of two books on publishing contract law. He is a graduate of Brandeis University and Columbia Law School.