Monday, 16 December 2013
Colin Meddings: Why data quality matters
Colin Meddings is the Client Director at DataSalon. Colin will be one of the speakers at the forthcoming ALPSP seminar Data, the universe and everything, taking place in January.
Here, in a guest post, he reflects on why good quality customer and internal data is important for scholarly publishers.
'Only four types of organisations need to worry about data quality: Those that care about their customers; Those that care about profit and loss; Those that care about their employees; and Those that care about their futures.' – Thomas C. Redman (2006)
Over recent years publishers have had to overcome many hurdles in the digital world, such as making content available online, managing complex consortia deals, creating new packages of content and tracking usage statistics. The result of all this digital activity is vast amounts of data. However, the pace of change can often distract from the careful governance of this data, leading to gaps, inconsistencies and inaccuracies.
But why does the quality of all this data matter so much? Good data is your most valuable asset, and bad data can seriously harm your business and credibility…
What have you missed?
At a management level, poor data quality equates directly to poor visibility of key trends in the growth or decline of certain products or markets. At the contact level, you may miss out on valuable sales opportunities if email address fields aren’t filled out correctly or customer names are wrong. Having good data will help deliver better customer service and enhance your reputation, and it means you can make better selections for targeted prospecting, cross-selling and up-selling.
When things go wrong.
Bad data can lead to ‘accidents’ and wrong decisions or actions which can affect customer confidence. You’ve spent time building up a valuable customer list – so it’s important not to waste this by sending campaigns to the wrong people, or with messages which don’t match their interests, or to out-of-date or deceased contacts. Data quality issues can also cost you money directly – for example if invoices or renewal notices are sent to the wrong recipient, or at the wrong time.
Making confident decisions.
Data quality matters most of all because it enables your staff and management team to really trust the accuracy of the reports and analysis they’re given. Without that confidence, apparent trends or new opportunities will always leave you wondering whether they really present a true picture. But with a complete and accurate view of your customers and prospects, comes the confidence to make well informed business decisions and commit fully to your strategic planning.
So, data quality is a very important foundation for a publisher’s entire business planning process and customer contact strategy. Good data quality will allow your business and its reputation to grow and flourish.
Data quality is just one of the topics in the forthcoming ALPSP seminar Data, the universe and everything. Other areas covered will include the use of institutional and personal identifiers in the scholarly publishing supply chain, publisher metadata, data relating to open access publishing and some case studies from publishers who have tackled data issues.
This post originally appeared on DataSalon’s own blog From the Armchair.
Labels:
#alpspdata,
#data,
alpsp,
Colin Meddings,
data,
DataSalon
Wednesday, 4 December 2013
Frank Stein on Watson and the Journey to Cognitive Computing
Frank Stein on cognitive computing
The Watson and Jeopardy! example shows how they have developed a programme that can match deeper evidence and use temporal reasoning, statistical paraphrasing and geospatial reasoning. The evidence is still not 100% certain, but it is about likelihood and confidence.
What they learned in Jeopardy
The DeepQA approach can accurately answer single sentence queries with confidence and speed. It is highly dependent on content, content quality, and content formats. They need a combination of technologies to get satisfactory performance (semantic technology, machine learning, information retrieval/search technology, databases and high performance computing techniques). Both structured and unstructured content need to be combined for best results. They now need to extend Watson to handle richer interactions and continuous training/learning.
Here's the IBM video about Watson and the game show Jeopardy!
Watson Decision Advisor in medicine
Medicine is a data-rich, societally important field, and Watson is helping to change how it is:
- Taught and researched (Major Cancer Center, students learn from and 'teach' Watson at the Cleveland Clinic)
- Practiced (Memorial Sloan-Kettering, Community Cancer Care Centers)
- Paid (WellPoint & Utilization Management)
IBM used to produce typewriters
When Stein started, IBM produced typewriters. Now they have 10,000+ products. Their sales agents need help. IBM is building out a portfolio of Watson Solutions including Watson Engagement Advisor for use in situations in which you need stronger ties with constituents and better automated or agent-facilitated conversations. Examples include: bank outreach to customers for cross-sell, cable operator services and support, tax agency advice, etc.
What's next - Cognitive Computing
Watson is ushering in a new era of computing. We have transitioned from the tabulating systems era to the programmable systems era, and are now moving into the cognitive systems era. This is a key technology for a new era of computing that takes into account:
- Content and learning
- Visual analytics and interaction
- Data centric systems
- Cognitive architecture
- Atomic and nano-scale.
Labels:
#ukinno,
alpsp,
data,
Frank Stein,
IBM,
Jeopardy!,
STM Innovations,
Watson
Sayeed Choudhury reflects on the research data revolution
Sayeed Choudhury
There is a new economy of sources of data. The challenge for publishers is to develop services.
Data Conservancy is a community that develops solutions for data preservation and sharing to promote cross-disciplinary re-use. It is about preservation - collect and take care of research data; sharing - reveal data's potential and possibilities; and discovery - promote re-use and new combinations.
Is data different?
Data is the new oil (a claim made everywhere from Qatar to the European Commission). McKinsey claimed that data is the fourth factor of production and estimates a potential $3 trillion of economic value across seven sectors within the US alone. Todd Park estimates location-sensitive apps generate $90 billion of value annually. Policy movements reflect its importance: the White House Office of Science & Technology Policy Executive memorandum and the White House Open Government Initiative are two key initiatives.
Collections
Data are a new form of collections though they are fundamentally different in nature. They are created or converted to digital format for processing by machines. Entirely new methods are required to deal with them. They are, in effect, a new form of special collections.
Do librarians and indeed publishers have the skill set to really grapple with data? data architects/scientists/modellers/visualisers #ukinno
— David Smith (@drs1969) December 4, 2013
What is 'Big Data'?
There are definitions based on the V's of Big Data (e.g. volume, velocity, variety). What is clear is that it's different from 'spreadsheet science' (or long-tail science). For Choudhury, if a community's ability to deal with data is overwhelmed, it is 'Big Data' - and it's more about 'M's' (methods or lack thereof) than 'V's'.
Services
There's a core of services that span across data from different disciplines and contexts. Archiving is a good example. However, if data collections are basically open, libraries may need to differentiate themselves by the services they offer. They should provide a combination of machine and human mediated services. There will be a set of services that only 'experts' will be able to offer.
Data management layers: curation, preservation, archiving, storage
Understanding infrastructure
Data will require fundamentally new systems and infrastructure. Institutional repositories can be useful gateways, but are not long-term solutions (particularly for 'Big Data'). Libraries will need to operate at scale through an integrated, ecosystem approach to infrastructure. Customised 'human mediated' services are most effective as an interpretative layer on machine based services.
What about publishers?
No one can claim a specific role or act with a sense of entitlement when it comes to data (whether publishers or librarians). The future of data curation is a competition between information graphs. 'Publishing is about content, not format.' - Wendy Queen, Associate Director of Project Muse, Johns Hopkins University Press
Labels:
#ukinno,
alpsp,
data,
Johns Hopkins University,
Sayeed Choudhury,
STM Innovations
Monday, 2 December 2013
International Publishers Association Call for Nominations: 2014 IPA Freedom to Publish Prize
The closing date for nominations for the 2014 IPA Freedom to Publish Prize is 6 January 2014.
The Prize will be awarded on 27 March 2014, during the IPA Congress in Bangkok, and the recipient will receive CHF20,000, thanks to the generous sponsorship of the following publishers: Albert Bonniers Förlag, Elsevier, HarperCollins, Kodansha, Macmillan, OUP, Penguin Random House, and Simon & Schuster.
Nominees can either be publishers who have recently published controversial works in the face of pressure, threats, intimidation or harassment from government or other authorities; or publishers with a long and distinguished history of upholding the values of freedom to publish and freedom of expression.
IPA member organisations, members of the IPA Freedom to Publish Committee, individual publishers, and international professional and non-government organisations working in the field of freedom of expression can nominate candidates for the IPA Freedom to Publish Prize.
Those nominating must explain the reasons behind their choice of candidate in writing (in English, French or Spanish) using the attached form as a template. Nominations should be submitted to the IPA’s Policy Director, José Borghino (borghino@internationalpublishers.org) no later than close-of-business (Geneva time) on 6 January 2014.
More about the 30th IPA Congress and the IPA Freedom to Publish Prize Ceremony:
The 30th IPA Congress will be held in Bangkok, Thailand, on 25-27 March 2014, and will be hosted by the Publishers and Booksellers Association of Thailand (PUBAT) under the auspices of HRH Princess Maha Chakri Sirindhorn. To see the Program, go to the Congress website.
On the eve of the Bangkok Book Fair (28 March to 8 April), hundreds of publishers from all over the world will participate in the Congress, together with authors, copyright specialists, librarians and officials from around 50 countries and international organisations.
The 2014 IPA Freedom to Publish Prize will be awarded during the Congress on 27 March 2014. Aung San Suu Kyi has been invited to give the keynote speech and award the Prize.
Earlybird online registration for the Congress is available from the Congress website.
Friday, 15 November 2013
'Scientifically sound': what does that mean in peer review? Cameron Neylon asks...
Cameron Neylon, Director of Advocacy at the Public Library of Science, challenged the audience at The Future of Peer Review seminar.
He suggested that if we're going to be serious about science, we should be serious about applying the tools of science to what we (the publishers) do.
What do we mean when we say 'scientifically sound'? Science works most of the time, so we tend not to question it. Should we review for soundness alone, as opposed to reviewing for both soundness and importance?
How can we construct a review process that is scientifically sound? The first thing you would do in a scientific process is to look at the evidence, but Neylon believes the evidence is almost totally lacking for peer review. There are very few good studies. Those that exist show frightening results.
We need to ask questions about the costs and the benefits. Rubriq calculated that 15 million hours were lost in 2012 reviewing papers that were ultimately rejected. (This video post illustrates the issues they raise.) This is equivalent to around $900m if you cost reviewers' time. How can we tell this is benefitting science? We need to decide whether we would be better spending that money and time on doing more research or on improving the process.
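As a rough back-of-the-envelope check on those figures (the hourly rate below is simply implied by the two numbers quoted, not a figure from the talk):

```typescript
// Rough, illustrative check of the figures quoted above.
const lostReviewHours = 15_000_000;   // Rubriq's estimate of hours spent reviewing papers rejected in 2012
const estimatedCostUsd = 900_000_000; // the ~$900m valuation quoted for that time

// Implied average value placed on an hour of reviewer time
const impliedHourlyRate = estimatedCostUsd / lostReviewHours;
console.log(`Implied reviewer rate: ~$${impliedHourlyRate}/hour`); // ~$60/hour
```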
Neylon asked what the science to assess the effectiveness of peer review would look like. There are some hard questions to ask. We'd need very large data sets and interventions, including randomised controlled trials. But there are other methods that can be applied if data is available. Getting data about the process that you are confident in is at the heart of the problem.
The obvious thing is to build a publishing system for the web. Disk space is pretty cheap and bandwidth can be managed. Measure what can be measured. Reviewers are pretty good at checking technical validity of papers. Importance is more nebulous. Taking this approach, Neylon believes that you end up with something that looks like PLOS One.
The growth curve for PLOS One is steep as it tackles these issues. In addition to this growth trajectory, 85% of papers are cited after 2 years: well above average for STM literature. There still remains a challenge of delivering that content to the right people at the right time. Technical validity depends on technical checks. PLOS One has six pages of questions to be answered before it goes to an editor. How much could we validate computationally? Where are computers better than people?
What has changed since similar talks 5 years ago? New approaches that were being discussed then are happening now (e.g. Rubriq). The outside world is much more sceptical about what happens with public funding. According to Neylon, one thing is for sure when it comes to peer review. The sciences need the science.
These notes were compiled from a talk given by Cameron Neylon at ALPSP's The Future of Peer Review seminar (London, November 2013) under CC-BY attribution. Previous presentations by Cameron can be found on his SlideShare page.
Labels:
#alpsppeer,
alpsp,
cameron neylon,
Future of Peer Review,
peer review,
PLoS,
Public Library of Science
Wednesday, 13 November 2013
Ulrich Pöschl on advancing post-publication and public peer review
Ulrich Pöschl
Ulrich Pöschl is based at the Max Planck Institute for Chemistry and is professor at the Johannes Gutenberg University in Mainz, Germany.
He initiated interactive open access publishing with public peer review and interactive discussion through the journal Atmospheric Chemistry and Physics and the European Geosciences Union.
In his talk at The Future of Peer Review seminar, he presented a vision of promoting scientific and societal progress through open access and collaborative review in a global information commons; access to high quality scientific publications (more and better information for scientists and society); documentation of scientific discussion (evidence of controversial opinions and open questions); and demonstration of transparency and rationalism (a role model for the political decision process).
Pöschl believes the most important motivation of open access is to improve scientific quality assurance. Why is it not a threat to peer review? Traditional peer review is fully compatible with open access. Information for reviewers is strongly enhanced by open access. Collaborative and post-publication peer review can be fully enabled by open access. Predatory open access publishers and hoaxes are a side-issue: a transition problem and red herring (partly caused by the vacuum created by the slow move of traditional publishers).
Pöschl went on to outline a range of problems that affect peer review. Quality assurance can be an issue, with manuscripts and publications often carelessly prepared and faulty; fraud is only the tip of the iceberg. Common practice can lead to carelessness, and the consequences are waste and misallocation of resources.
Editors and referees may have limited capacity and/or competence. Traditional pre-publication review can lead to retardation and loss of information. Traditional discussion can be sparse and subject to late commentaries. Pöschl himself doesn't have time for pure post-publication review (open peer commentary), as he has enough to do with his scientific work.
The dilemma at the heart of peer review is speed versus quality. There are conflicting needs of scientific publishing: rapid publication versus thorough review and discussion. Rapid publication is widely pursued. The answer? A two stage process. Stage 1 involves rapid publication of a discussion paper, public peer review and interactive discussion. Stage 2 comprises review completion and publication of the Final Paper.
The advantage of interactive open access publishing is that it provides an all-win situation for the community of authors, referees, editors and readers. The discussion paper is an expression of free speech. Public peer review and interactive discussion provide many benefits; in particular, they foster and document scientific discourse (and save reviewer capacity).
There are four stages to interactive open access publishing:
- Pre-publication review and selection
- Public peer review and interactive discussion
- Peer review completion
- Post-publication review and evaluation
This needs to be in combination or integration with repositories, living reviews concept, assessment house concept, ranking system/tiers and article level metrics.
At Atmospheric Chemistry and Physics, the rejection rate is as low as 10%. Submission to publication time ranges from a minimum of 10 days up to 1 month. The publication charge is 1,000 euros, and comment pages can add up to 50% of additional content. The achievements of combining these approaches include top speed, impact and visibility, large volume, and low rejection rates and costs. The journal is fully self-financed and sustainable.
Pöschl passionately believes that these stages can be adjusted to other priorities and can therefore work for other disciplines and research communities. Future perspectives to take into account include an efficient and flexible combination of new and traditional forms of review and publication, as well as multiple stages and levels of interactive publishing and commenting.
Labels:
#alpsppeer,
alpsp,
Atmospheric Chemistry and Physics,
Future of Peer Review,
peer review,
Ulrich Pöschl
Tuesday, 12 November 2013
Why is peer review such an enduring factor in the research process? Mark Ware provides an overview
Mark Ware: why is peer review an enduring factor?
In Ware's view, peer review is not broken. It is overwhelmingly supported by researchers, but that doesn't mean it can't be improved. Publishers need to take new developments into account.
Peer review is one part of the publishing cycle as well as the broader research cycle. It is important. Benefits include improving the quality of published articles (and this comes from the research community). It provides filters, a seal of approval and a reward system that works for researchers.
However, no human activity is perfect so what are the limitations? They include validity, effectiveness, efficiency, the burden on researchers and fairness. In a world where there's a drive for transparency, we have to take these criticisms seriously. Peer review is ripe for improvement in many areas.
So who's who in peer review?
- Authors (co-authors, contributors) - the definition of authorship has become more formalised in recent years.
- Editors and editorial boards - the editor role is crucial; it is a misconception that the reviewers make the decision. This ignores the fact that peer review is a process. Editors use their judgment. The best examples are a constructive discussion with the aim of making the paper the best it can be.
- Reviewers
- Publishers and editorial staff - these roles are often overlooked by those who claim reviewers do all the work. Editorial independence is a reason why we might want this. The danger with the peer review process diagram below is that we think this is the whole system, when actually it is one small part of a publishing process.
- Readers (post-publication review).
Peer review flow chart: just one part of the process
Retractions are booming:
In 1977 there were around 2 retractions per 100,000 publications.
By 2010 this had risen to 50 per 100,000 publications.
What's driving new approaches?
What are the problems we're trying to solve?
Which problems are we trying to mediate?
What are the opportunities to improve? Fairness and bias, delays, inefficiency, reproducibility, research data and information overload all figure.
Pre-publication innovations include:
- 'Soundness not significance'
- Cascade review
- Portable review
- Open and interactive
- Registered reports.
Post-publication innovation includes:
- Comments and ratings
- Metrics and altmetrics
- Article evaluation and tiering systems (e.g. Frontiers)
- Overlay journals.
Labels:
@mrkwr,
#alpsppeer,
alpsp,
Future of Peer Review,
mark ware,
peer review
The peer review landscape – what do researchers think? Adrian Mulligan reflects on Elsevier's own research.
Adrian Mulligan ponders: what do researchers think?
What do researchers think? Peer review is slow. Generally in STM journals, peer review averages 5 months, measured from when you submit an article through to the final submission; in the social sciences it can take up to 9 months. Peer review can be prone to bias and can hold back true innovation (here he cites the Nature article from October 2003, Coping with peer rejection: Accounts of rejected Nobel-winning discoveries highlight conservatism in science).
There is a small group of people who decide what happens with a manuscript and they tend to be quite conservative. It's time-consuming and redundant and does not improve quality. This view was reflected in The Guardian article by Dr Sylvia McLain, 'Not breaking news: many scientific studies are ultimately proved wrong!' (17 September 2013). Jeffrey Brainard reported in The Chronicle in August 2008 ('Incompetence tops list of complaints about peer reviewers') that qualified reviewers are too few and overworked. Recently, The Scientist article 'Fake paper exposes failed peer review' by Kerry Grens (October 6, 2013) highlighted how peer review may not be good at preventing fraud or plagiarism.
Elsevier undertook a research project to find out what researchers think, from both the author and the reviewer perspective. They surveyed individuals randomly selected from published researchers and received 3,008 responses. Most researchers are satisfied with the current peer review system: 70% in 2013 (1% higher than in 2009 and 5% higher than in 2007). The satisfaction level is higher among chemists and mathematicians, and lower among computer scientists and social scientists (inc. arts, humanities, psychologists and economists). Chinese researchers are the most satisfied and there is no difference by age.
Most believe that peer review improves scientific communication. Almost three quarters agreed that the peer review process on unsuccessful submissions improved the article. The proportion of researchers who had gone through multiple submissions is relatively low (29% submitted to another journal; articles were submitted an average of 1.6 times before being accepted). Few believe peer review is holding back science, but the proportion is growing: 19% agreed in 2007, 21% in 2009, and 27% in 2013.
Pressure on reviewers is increasing, including pressure on their time and a lack of incentives. Some reviewers lack the specialist knowledge required, and it could be argued that too many poor quality papers are sent for review.
Mulligan observed that, at a national level, a country's contribution of reviews should match its contribution of submissions. China publishes far fewer papers than it reviews, and the reverse is true for the US. He noted that Chinese researchers are more likely to accept an invitation to review.
Over a third of those who responded believe peer review could be improved. Reviewers need more guidance, researchers are less willing to volunteer their time to conduct reviews, and concerns that the system is biased or needs to be completely anonymous should be addressed. Another challenge for the industry is whether or not peer review can genuinely detect fraud and plagiarism.
More people are shifting to open peer review; however, the preference in North America is for more traditional peer review (single blind and double blind). So what is Elsevier doing? They are reducing the number of reviews by transferring them from one journal to the next. They are recognising reviewers' contributions, rewarding reviewers with certificates and awards. And they are getting reviewers together to improve speed and quality.
Labels:
#alpsppeer,
Adrian Mulligan,
alpsp,
elsevier,
Future of Peer Review,
peer review,
publishing
Friday, 8 November 2013
Copyright – business or moral right?
Pippa Smart: is copyright a business or moral right?
"Many years ago I wrote a short “how to get published” guide. Now, I’m not going to pretend it was the best guide ever; I’m sure there are plenty of others (in fact I know there are) that are more succinct, more instructive and more useful to authors. But it was my own work, and I was (quietly) pleased with it. It was downloaded from the company website and – I hope – useful to at least one author, somewhere in the world.
Then I discovered that someone had taken it and reproduced it in a journal. I can’t pretend that I wasn’t flattered, but I was a bit annoyed that my name (and my employer’s) was removed. We wrote to ask for a correction – no reply. So, after a sigh and a shrug of the shoulders we moved on and forgot it – after all, nobody was killed, there was just a little bit of injured pride.
Would we have reacted differently, I wonder, if the article had been for sale? Would we have been more concerned if we thought the author benefitted financially rather than just reputationally? Perhaps.
This came to mind recently when a friend of mine had an article she had published in an open access journal posted on a reprints site, being sold for $5. She was furious. She streamed her angst on the airwaves. She named names and pointed fingers. After a few postings reminding her that the CC-BY licence allowed this reprints company to do exactly what they were doing, she calmed a little – then asked her publisher to demand a take-down. The publisher obliged and the reprints site capitulated.
These examples raise several important points. Copyright protection is there to protect authors, not just to make money for big business. And publishers have a duty to help authors protect their rights. Authors care about their content – and may not understand how copyright can protect them, and when it cannot. Add into this mix different legal obligations and cultural expectations, and we live in a complex IPR world.
I forecast more examples like these (copyright and plagiarism) in the next few years. There will be a greater need for publishers to help (and to educate) authors, and a need for them to understand the wider debates about access and the intersection with legal and moral issues. Interesting times."
Pippa is author of ALPSP's eLearning course International Copyright. Take the new online demo for the course and receive up to an hour of free training.
Pippa Smart is a research communication and publishing consultant with over 20 years' experience, having worked for CAB International, Blackwell and Cambridge University Press, among others. She now researches and writes on publishing issues, delivers training courses and runs PSP Consulting. She is the editor of the ALPSP Alert and can be contacted at pippa.smart@gmail.com.
Labels:
#alpsp,
alpsp,
copyright,
eLearning,
International Copyright,
Pippa Smart,
PSP Consulting
Thursday, 7 November 2013
Keeping pace with changes in eJournal technology
Tracy Gardner: keeping pace with eJournal technology
SK: What is the main challenge that publishers face in the field?
TG: The pace of change within eJournals technology is fast. This technology has removed the barriers between production, editorial, marketing, sales, customer services and most importantly – the customers. Renew Training started running business technology courses specifically for publishers around 7 years ago and during all that time the same course has never been delivered twice!
SK: What is driving the pace of change?
TG: Changes in how libraries authenticate their patrons, how they manage reader navigation, and the implementation of new search and discovery tools have changed the eJournal landscape dramatically.
SK: Who does this affect?
TG: For those in sales, marketing and customer service it can be hard to understand the business ramifications of how eJournal technology affects the way librarians and researchers find and access content. How does the fact that a library uses a proxy, has only one IP address for its entire institution, or even treats its IP addresses as a state secret affect how researchers read your content? Do Shibboleth or Athens solve these issues, or do they create new ones? What about OpenURLs and working with link resolvers – and what are resource discovery services and tools, and why should you worry about them?
For those in operational or technology roles, the business technology side of eJournals can seem daunting, especially for those new to the industry, and the way the information community works can seem counter to the way many other business sectors operate.
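For readers new to some of the plumbing mentioned above, an OpenURL is essentially a way of packing an article's citation metadata into a URL so that a library's link resolver can route the reader to an appropriate copy. A minimal sketch (the resolver address and article details below are invented for illustration):

```typescript
// Minimal sketch of building an OpenURL (key/encoded-value format) for a journal article.
// The resolver base URL and all article metadata here are invented for illustration.
const resolverBase = "https://resolver.example-library.org/openurl";

const params = new URLSearchParams({
  url_ver: "Z39.88-2004",                       // OpenURL 1.0 version identifier
  rft_val_fmt: "info:ofi/fmt:kev:mtx:journal",  // metadata format: journal article
  "rft.atitle": "An example article title",
  "rft.jtitle": "Journal of Illustrative Examples",
  "rft.volume": "12",
  "rft.issue": "3",
  "rft.spage": "45",
  "rft.issn": "1234-5678",
});

// The library's link resolver parses these fields and decides where to send the reader
// (publisher site, aggregator, inter-library loan form, etc.).
const openUrl = `${resolverBase}?${params.toString()}`;
console.log(openUrl);
```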
SK: How can you keep pace with these changes?
TG: Educate those in sales, marketing, customer services, product development, editorial, project management and IT in the technologies. These roles are all vital to the delivery of eJournals. You need to clearly position these technologies in the context of the industry issues they aim to solve so your teams understand how they are used throughout the supply chain internally and by librarians through to end users. Understand a) your customers' technical and business requirements, and b) how technology plays a role in discoverability and deploying eJournals.
Tracy Gardner has over 17 years’ experience in marketing and communications and has worked for CatchWord, Ingenta, CABI Publishing and Scholarly Information Strategies. Her career has focussed on improving communication channels between publishers, intermediaries and librarians and she understands the business of scholarly publishing from many different perspectives.
Tracy is co-tutor with Simon Inger on the Understanding eJournal Technology course run by ALPSP in association with Renew Training. If you are flummoxed by any of the above terminology, or if you would like to understand more about how your customers are using business technology to serve their patrons, then come along to the next course on 13 November 2013 in Oxford.
Labels:
alpsp,
ALPSP training,
e-journals,
eJournal,
journal,
Renew Training,
simon inger,
technology,
tracy gardner,
Understanding eJournal Technology
Thursday, 31 October 2013
Roy Kaufman: Open Access Doesn't Necessarily Mean Free
"There is a popular opinion in the publishing market - that open access means free. However, the truth is far more complex and dependent on the licensing options a publishers offers. We recently produced a white paper to explain where exactly the costs of open access occur and their impact on scholarly and scientific publishing.
Publishers incur costs
Regardless of the open access model, there are still costs of publication. These include recruiting authors, maintaining the peer review system, print and/or digital production, sales, marketing, tagging and linking articles, as well as archiving and making the "version of record" available.
Some users may still need to pay to reuse content
Publishers use a variety of licenses for open access content. Often, they choose those designed by Creative Commons, which provides a range of nuanced licenses. For example, a CC BY-NC license allows derivative works to be made for non-commercial use without an additional license or fee. However, a permissions fee is required to use an article designated as CC BY-NC for commercial purposes. Ultimately, the terms of commercial reuse may be set by the publisher, the author, the author's institution or the funding agency.
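A minimal sketch of that decision logic, assuming a simplified two-way split between commercial and non-commercial reuse (real rights decisions involve more factors and more licence variants than this):

```typescript
// Simplified illustration of the CC BY-NC reuse logic described above.
type License = "CC BY" | "CC BY-NC" | "All rights reserved";

function reuseRequiresPermissionFee(license: License, commercialUse: boolean): boolean {
  switch (license) {
    case "CC BY":
      return false;         // reuse, including commercial reuse, is allowed with attribution
    case "CC BY-NC":
      return commercialUse; // non-commercial reuse is free; commercial reuse needs a licence/fee
    case "All rights reserved":
      return true;          // any reuse needs permission from the rights holder
  }
}

// Example: a company wanting to reprint a CC BY-NC article in a marketing pack
console.log(reuseRequiresPermissionFee("CC BY-NC", true));  // true - a permissions fee applies
console.log(reuseRequiresPermissionFee("CC BY-NC", false)); // false - no fee for non-commercial reuse
```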
Creative Commons licenses and the impact of funding agencies
Many publishers are currently using – or considering – a range of standard licenses provided by Creative Commons. There are several factors to consider when choosing the license type, including the author's goals, the policy of the publishing society or company, the academic institution and the funding organization. This can influence the terms of the license under which an article is published.
In any case, publishers need to recoup the costs of publication to maintain a going concern. In a traditional publishing paradigm, revenue often originates from subscriptions, single article sales, secondary licensing and more. For many Open Access articles, publishers offset costs by securing article processing charges from funding organizations, authors, and / or academic institutions.
What does this mean for publishers?
Clearly, open access presents new opportunities for publishers to serve their customers and authors, and creates an outlet for the publication of ever-increasing submissions. However, it also generates administrative and business burdens that challenge publishers' typical subscription-based business models. There are opportunities to generate revenue with open access such as license fees for commercial use when an article is covered by a CC BY-NC license, and opportunities to provide other services to authors as well.
When exploring licensing options, publishers should consider the following questions:
- Who typically requests reuse permissions and for what purpose?
- Under which open access model was the article published?
- Are there special funder requirements for the author?
- What are the various revenue components of a journal? How will each model of open access affect them?
- What is the competitive landscape for the journal as compared with open access options offered by competing journals?
Open access policies present challenges and opportunities for publishers to better serve authors as well as consumers of open access content. A strong licensing framework, communicated clearly to authors and the community, is essential to ensure both quality and sustainability.
These thoughts are drawn from a white paper CCC developed earlier this year to help our publisher clients when considering open access. The full version can be downloaded here."
CCC offers ALPSP members a special discount off the RightsLink suite of licensing tools. Review information about their open access solution on their website at www.copyright.com/alpsp.
Roy Kaufman is Managing Director, New Ventures at Copyright Clearance Center (CCC) where, since 2012, he has been responsible for expanding capabilities as the business develops new services for authors, publishers and other rights holders. Prior to CCC, Kaufman served as lead counsel for the Scientific, Technical, Medical and Scholarly publishing business of John Wiley & Sons, Inc., working in all areas of licensing, contracts, strategic alliances and online publishing.
Labels:
alpsp,
CCC,
copyright clearance center,
open access,
roy kaufman
Tuesday, 29 October 2013
How do you make your publication stand out from the crowd? Julia Lampam reflects...
Julia Lampam: make your publication stand out
Questions like these are raised by publicists, marketers, editors and authors alike as we strive to improve the discoverability of the rich academic content we publish. And why? Because we're all aiming to increase the discoverability and visibility of an article, as measured through a number of metrics and citations, with the impact factor being the most widely used.
It's all about discoverability and visibility on an article level
During our forthcoming course, Getting the Most from Journal Publicity, Alexa Dugan and I will share our experience of maximising the promotional and marketing resources available to extend the reach and impact of the research in your publications.
Publicity by its very nature is unpredictable.
But by asking yourselves a pertinent set of questions and armed with a handful of tools, you can develop and instigate a proactive media campaign to draw attention to the research. By taking advantage of social media and online networks, as well as traditional PR resources, article level publicity efforts can help you reach wider audiences, increasing web traffic to the content and thereby potentially improving the number of citations.
What works, and what doesn't?
The evaluation of such initiatives plays just as important a role as planning – knowing what works and what doesn't will enable you to focus your limited means on the articles worth targeting. Publicity campaigns take time and resources, with some being more effective than others. In addition, there is an on-going debate about how to measure the return on investment of such activity. You can focus on the number of citations, journal impact factors, web traffic to an article, or the number of downloads, but you should also think of other benefits such as attracting key authors, as well as developing and strengthening the relationships between publishers and their communities.
Be part of the conversation
The most successful campaigns enable research to embed itself into extended communities of interest and become part of the conversation. I recall working on a journal article publicity campaign about research into the best way to cook your vegetables … two weeks later I read about the research in the weekly musings from my veg box provider … the campaign had gone full circle! Of course this doesn't happen every time, but when it does, you certainly get a buzz from instigating a successful media campaign.
Julia Lampam is responsible for Wiley's corporate communications and book fairs within Europe, the Middle East and Asia, and co-chairs the company's global social media group. Previously she managed their publicity activities, instigating proactive PR strategies for academic journals and the For Dummies and Frommer's brands; as well as launching Wiley’s online newsroom.
Julia is co-tutor on Getting the Most from Journal Publicity with Alexa Dugan. Book your place now.
Labels:
alexa dugan,
alpsp,
ALPSP training,
article,
getting the most from journal publicity,
journal,
Julia Lampam,
publicity,
Wiley
Tuesday, 22 October 2013
Reaching readers, and dealing with data: preparing for phase two of publishing’s digital transformation
Alastair Horne is Social Media and Communities Manager for ELT at Cambridge University Press. Here, in a guest post, he reflects on the changing skill set of the wider publishing sector, as Creative Skillset calls for final contributions to their industry panels.
"Whisper it, but the publishing industry has – mostly – coped tolerably well with the first phase of its transition to digital. Ebook sales have risen to account for about a quarter of all trade revenues without destroying the industry’s financial model, while some non-trade publishers like Wiley are now reporting that more than half their income is coming from digital products.
In many respects, publishing was better prepared for digital than the hype would have had you believe, since industry workflows had already been predominantly digital for some time. For most employees, the change has primarily been a question of thinking digitally: being aware of the possibilities offered by digital products and adjusting their thinking accordingly. Where radically new skills were required – in, for instance, the building of digital products such as apps or learning platforms – publishers have mostly addressed the issue by buying in the skills they lacked: typically by hiring external developers, often offshore.
New roles have also grown up in the hinterland between publishers and developers.
A need has arisen for people who can translate the sometimes vague aspirations of a commissioning editor into a document sufficiently precise that nothing is left for a developer with very different cultural assumptions to misinterpret – and then to translate the developers’ responses into terms non-technical editors will understand. People in such roles don't need to be able to code – if they could, they'd be working as coders and getting paid much more – but they do need to be able to understand what it's like to code, and what information a developer requires to do his or her job successfully.
Now, as we enter the second phase of publishing’s digital transformation – focusing on readers and data – we in the industry have an opportunity to ensure that our staff possess the skills that will be required. Creative Skillset – the licensed Sector Skills Council for publishing and other creative industries – is seeking vital feedback from employers and professionals in the publishing industry to identify skills needs and develop solutions to support those needs.
This second phase is likely to follow a similar pattern to the first. As publishers adjust their bifocals to switch from a near-sighted focus on booksellers to the longer-distance vision required to engage with readers, there will most likely be some changes to the everyday skills required by publishing employees, some of the more technical skills will be bought in from outside, and some new roles created in the space between the two.
Many of the new roles are already in place at more forward-thinking publishers.
Market research analysts, for instance, and social media and community managers (such as myself), responsible for nurturing relationships across third party owned platforms such as Facebook, LinkedIn and Twitter, and across publishers' own purpose-built platforms. The more technical roles – manipulating massive amounts of customer data – may once again be contracted out to companies better able to provide these specific skills.
The most wide-ranging change will be amongst those employees whose job titles won’t change.
Though their skill set will need to. From commissioning editor to marketer, the ability to put the reader at the heart of every aspect of the business, and to make better decisions informed by the data others are gathering and interpreting, will be vital."
Alastair Horne tweets as @pressfuturist, and blogs at www.pressfuturist.com.
Creative Skillset is the Sector Skills Council for the Creative Industries.
Labels:
@pressfuturist,
Alastair Horne,
Creative Skillset,
digital,
industry panels,
publishing,
research,
skills
Thursday, 17 October 2013
Valuing Intellectual Property. Priorities for New Governments: Meeting Consumer and Business Expectations in the UK and Europe
Vince Cable addresses the audience
Rick Nye, Director of research and strategy consultancy Populus, kicked off with a few key statistics. They interviewed 2,052 British adults online in September 2013 to explore public attitudes towards intellectual property rights. Some of the findings are striking, while some will come as no surprise. People born after 1980 are twice as likely to commit IP infringement as those born before. IP infringement is a massive issue for the creative industries. Interestingly, respondents were far more comfortable with the concept of infringement of goods and services IP, but not so with personal data.
Vince Cable, Secretary of State for Business, Innovation and Skills, stated his long-standing interest in IP and copyright. There is wide recognition within the UK government of how important the creative industries are to the UK economy. What creates economic growth? Research suggests 40% comes from innovation of various kinds, which is underpinned by IP and copyright. He believes that Britain is particularly good at the fusion of the creative with science, technology and engineering, but there is a challenge around building skills and balancing across science, technology, arts and engineering to ensure we have the best workforce to adapt and innovate. The work of the Creative Industries Council and Creative Skillset is central to this aim.
There have been positive steps forward in the last few years, with the launch of the Copyright Hub and a lot of progress in developing common data standards. It is important for the UK to be the European centre for content licensing. Other initiatives include IP attachés around the world, a small claims track in the county court, and - reflecting the balance to strike in each area between protection and access - small changes in copyright to allow acceptable consumer behaviour (e.g. transferring content across devices).
The audience had the opportunity to take part in regular live polls. When asked 'Has the government been supportive of those who rely on IP?' 16.7% chose 'not supportive', 69% 'quite supportive', and 14.3% 'very supportive'. Panellists Mike Weatherley MP, the Prime Minister's Advisor on IP, Martin Spence, Assistant General Secretary at BECTU and Bill Bush, Director of Policy at the Premier League were in general agreement with this spread of sentiment.
Arlene McCarthy in conversation with Lord Clement-Jones
Opposition from those who don't like copyright can be strong. Often votes are won only by narrow margins. When people really want something they are highly motivated (e.g. UKIP, the Pirate Party, etc.), so we have to be ahead of and stealthier than them. Industry has to be fit to adapt and tackle the challenge of the internet, but that doesn't mean changing copyright legislation if it's fit for purpose. McCarthy's sentiments echoed those of Eric Merkel-Sobotta, who urged publishers at the recent ALPSP conference to find out who their MEP is and to write, write, write to them to ensure our voice is heard.
She believes that UK industry always has an open door. The move away from Hargreaves has improved our standing in the EU. The ecommerce directive does actually work quite well - it is fast. A separate notice-and-takedown directive would potentially add more barriers. France and Italy have a big stake in this and are likely to support us. Germany is more ambivalent: its manufacturing is so strong, the creative industries are not as crucial, and they believe that content should be more freely available. For the UK, the creative industries are the only ones holding their own in the financial crisis. We need to support them.
There is a huge opportunity: the internet is there to use in different ways. In the end, the internet needs content - it is its lifeblood. The creative industries should be more robust about their position. In return for that content, the internet companies should work with us to support us. McCarthy believes there is room for both free and professional content on the internet. And it is not unreasonable that professional content should be paid for. After all, she wouldn't expect a plumber to come and fix her boiler and then not pay him.
The conference closed with a keynote speech from Michel Barnier, EU Commissioner for the Internal Market. IP is the backbone of the market, with €4.7 trillion generated by IP each year, and more than a third of all jobs coming directly or indirectly from IP-intensive roles.
There is a balance between this and ensuring IP legislation is fit for purpose in the 21st century. IP is not just about jobs and economic value, it's part of life for everyone: not just about how we consume content, but also about rights, access to information and diversity. Copyright must not be a block to content creation. The EU published a road map on developing copyright last December. Reform is long overdue and the aim is to catch up now. The full transcription of Barnier's talk is available online.
Labels:
#alpsp,
#valuingip,
A4IP,
Alliance for Intellectual Property,
alpsp,
Intellectual property,
IP,
Valuing IP
Tuesday, 8 October 2013
Pixel imperfect: Serving an online audience with responsive content
Michael Cairns, COO at Publishing Technology
In 2010 Ethan Marcotte coined the term 'responsive web design' in a landmark article on A List Apart. It is not a new idea, but it has been made possible by recent technologies. Responsive web design is about designing systems, not websites. It forces us to think bigger and put users, and how they use content, at the centre of the design process.
The Boston Globe site is a good example of responsive design - resize your browser to see how the content reflows. It's worth bearing in mind that Google has a preference for accessible websites with one set of content and one URL.
Gartner reported that enterprise tablet adoption is growing by 50% per year. Mobile is increasingly important. Now is the time to think about what your responsive web design strategy is going to be. Don't forget that libraries subscribe to a huge amount of content.
But it is a confusing landscape: not just Apple, but Android, and for now, Blackberry. When you see stats such as the IBM/Tealeaf report that 85% of users expect that a mobile website should be at least as good as the desktop, you have to move forward with responsive design.
Some considerations:
- Do you want or need to be in the App store?
- Do you rely on or make use of device-specific functionality like the camera?
- Do you have a specific functional focus?
Do you have a content-focused approach which requires broad device support? Are there frequent content changes, and do you need better discoverability via a third party such as Google? Plan with several things in mind: the audience, content and functionality (Cairns stresses the importance of content strategy), capability, cost and process. Context is very important: what device is typically used? Where is it used? At what time or in what circumstance (e.g. physicians on the ward)?
It's complicated. Apple iOS has 6 different size/resolution combinations. HTC has 12. Even within these platforms there is significant deviation. And it is getting more complicated with the introduction of Microsoft and Asus tablets.
Cairns' advice on how to do RWD right starts with understanding your users and how they access and use your content. Prioritise your content based on the above, then build a site architecture that answers to these priorities. Design a site that provides content for users across device types and contexts, with grids, typography and images that adapt.
What is responsive web design? It is where you maintain one website that serves all devices and screen sizes. It provides complete support for all web pages and features, regardless of the device or screen size. And it enables you to implement changes across all devices.
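In practice the layout work is done with CSS media queries, but the same breakpoint logic is available to scripts as well. A minimal browser-side sketch (the 600px breakpoint and class name are arbitrary choices for illustration):

```typescript
// Minimal sketch: reacting to a viewport breakpoint in the browser.
// CSS media queries handle most layout; matchMedia exposes the same test to scripts.
const smallScreen = window.matchMedia("(max-width: 600px)");

function applyLayout(isSmall: boolean): void {
  // e.g. collapse a multi-column article listing into a single column
  document.body.classList.toggle("single-column", isSmall);
}

applyLayout(smallScreen.matches);                                      // set the initial layout
smallScreen.addEventListener("change", (e) => applyLayout(e.matches)); // reflow when the viewport crosses the breakpoint
```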
Michael Kowalski from Contentment, with a cloud
Crucially, there's no standardisation with apps. Kowalski took the last ABC audit figures on the PPA website and crunched the data. He found that in magazines and business media, print is seeing a 10% year on year decline while there is 108% growth on digital.
There's a lot of room for growth in magazines and business media. As a sector, they initially tried a number of techniques to get their content onto devices. First of all they did nothing, taking content, putting it into PDF and then on to the app store. But replicas on phones are rubbish. Then they stuck with the familiar and replicated magazine layout. Now they are going with CSS (and similar) media queries honed on the web that can be used to do responsive content. There are a number of tools that can be used for hybrid apps (native apps with HTML5 inside) including PhoneGap, Trigger.IO and pugpig.
Kowalski believes web publishing killed content design. He asked: what happened to creative freedom? What happened to designing around our content? Did we struggle in vain? Can't we have those nice things? Developers think template first, squirting content through it later and separating content from presentation. Designers think of a template as only a starting point.
Kowalski believes that you can turn one big problem into many small problems. How do you deal with images of a fixed aspect ratio? You could crop, but do you have the rights to do that? You can tag an image as portrait, landscape or squarish, using captions. You can use templates with manual override to adapt to different images, or use disclosure (a '+' sign that opens up the text). If you use tables, you can convert each row into a mini table on small screens, and add paging or disclosure to avoid a long scrolling experience. Fonts can be painfully expensive, so open source fonts are well worth investigating.
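As a small illustration of the disclosure pattern mentioned above, here is a minimal sketch, assuming a browser environment, of a '+' control that reveals a collapsed block of text only when the reader asks for it. The element IDs are hypothetical.

```typescript
// Minimal sketch of the disclosure pattern: a '+' control that expands
// a collapsed block of text on small screens. Element IDs are hypothetical.
const toggle = document.getElementById("read-more-toggle");
const body = document.getElementById("article-body");

if (toggle && body) {
  body.hidden = true; // start collapsed
  toggle.textContent = "+";
  toggle.addEventListener("click", () => {
    body.hidden = !body.hidden;
    toggle.textContent = body.hidden ? "+" : "\u2212"; // minus sign when open
  });
}
```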
Solve each of these issues one by one, but what about bigger problems? A little bit of print content can go a long way in digital. There are various ways you can pack more content into the same real estate without it becoming too noisy. But content fitting is a hard habit to kick. Responsive design is a big, hard change for print designers. Web designers are your friends. Kowalski closed by urging the room to think more like digital product designers.
What does the user want to do? How can we make it easy for them?
Labels:
#contec13,
#fbm13,
alpsp,
Contentment,
Michael Cairns,
Michael Kowalski,
Pixel Imperfect,
Publishing Technology
Big Data / Little Data: The practical capture, analysis and integration of data for publishers
Laura Dawson, from Bowker, leans in. |
She cautioned that data doesn't stop with getting something on Amazon. Bowker has tracked the explosion in the number of books: in the United States there were 900,000 books in print in 1999, growing to 28 million in 2013. Information now exists on a massive scale. We are swimming in it.
There is both a problem and an opportunity in this abundance. The problem is fluidity: all this information is out of the container. Abundance, persistence and fluidity lead to issues with discovery.
There are four different types of metadata (sketched as simple record types after this list):
- Bibliographic: basic book information, the classic understanding of metadata.
- Commercial: tax codes, proprietary fields.
- Transactional: inventory, locations, order and billings, royalties, etc.
- Merchandising: descriptive content, marketing copy, consumer oriented content.
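One way to picture these four strands is as separate record types that all hang off the same identifier. The sketch below is illustrative only; the field names are examples rather than Bowker's schema.

```typescript
// Illustrative sketch of the four metadata strands for one title,
// keyed by a shared identifier. Field names are examples only.
interface BibliographicData { title: string; contributors: string[]; pubDate: string; }
interface CommercialData    { taxCode: string; proprietaryFields: Record<string, string>; }
interface TransactionalData { inventory: number; locations: string[]; royaltiesDue: number; }
interface MerchandisingData { description: string; marketingCopy: string; }

interface TitleMetadata {
  isbn: string; // the shared key that ties the strands together
  bibliographic: BibliographicData;
  commercial: CommercialData;
  transactional: TransactionalData;
  merchandising: MerchandisingData;
}
```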
Part of the challenge of managing metadata is the number of different sources. There are publisher-prepared files, publisher requests (typically email), data aggregators (e.g. Bowker), social reading sites, online and offline retailers, and libraries (remember them?).
Other complicating factors for digital metadata include differential timing (physical books require metadata six months prior to publication; digital, upon publication). There are different attributes and more frequent price changes. Conversions are often outsourced and, in relative terms, this is a whole new process.
Current metadata practices tend to include creation in four primary departments (editorial/managing editorial, marketing, production and creative services). Management responsibility varies by sender. Most publishers treat publication as the end date for updates (although this is changing). Complete does not mean accurate, and inspection is limited. Prepping metadata is somewhat ad hoc. But it's not all bad news: many publishing houses are now looking at metadata as a functional map, examining the process and putting all data into a metadata repository.
Best practice in organising metadata is emerging. You need a hub: a single source of truth for your data that can deal with multiple contributors and multiple recipients. Define roles and provide a single source. Identifiers are much more efficient for search engines than thesauri; text matching doesn't work across character sets, or even across languages that use the same characters.
There are a number of codified representations of concepts that should be used; they act as shortcuts for search engines (a small validation sketch follows the list):
- ISBN - numerical representation of a book
- ISNI - for a name
- GTIN - for a tradable product
- ISO Numeric Country Codes - for a country.
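These identifiers are designed to be validated by machine. An ISBN-13, for example, carries a check digit computed from the first twelve digits with alternating weights of 1 and 3, so a small routine like the sketch below can catch mistyped numbers before they enter a metadata feed.

```typescript
// Validate an ISBN-13 check digit (weights alternate 1, 3 across the
// first twelve digits; the check digit makes the weighted sum a
// multiple of 10). Hyphens and spaces are stripped first.
function isValidIsbn13(raw: string): boolean {
  const digits = raw.replace(/[-\s]/g, "");
  if (!/^\d{13}$/.test(digits)) return false;

  let sum = 0;
  for (let i = 0; i < 12; i++) {
    const weight = i % 2 === 0 ? 1 : 3;
    sum += Number(digits[i]) * weight;
  }
  const check = (10 - (sum % 10)) % 10;
  return check === Number(digits[12]);
}

console.log(isValidIsbn13("978-0-306-40615-7")); // true
console.log(isValidIsbn13("978-0-306-40615-6")); // false (wrong check digit)
```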
Machine-readable language is key. Codes are easier to process than text: faster and less complex. Codes are unambiguous, whereas natural language evolves and is more unstable. You can link data sets using ISNI. Content's new vocabulary (illustrated after the list below) is based upon:
- structured content
- linked data/linked open data
- the semantic web
- ontology
- Good Relations - an ontology devised specifically for describing products for sale
- RDF - Resource Description Framework
- and data visualisation.
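As a hedged illustration of what this vocabulary looks like in practice, the sketch below describes a book offered for sale as JSON-LD using schema.org terms (used here as a stand-in for the ontologies named above; every identifier and price value is an invented placeholder).

```typescript
// Illustrative JSON-LD: a structured, linked description of a book for
// sale using schema.org vocabulary. All identifier values are invented.
const bookRecord = {
  "@context": "https://schema.org",
  "@type": "Book",
  "isbn": "9780000000000",               // placeholder ISBN
  "name": "An Example Monograph",
  "author": {
    "@type": "Person",
    "name": "A. N. Author",
    "identifier": "0000 0000 0000 0000"  // placeholder ISNI
  },
  "offers": {
    "@type": "Offer",
    "price": "25.00",
    "priceCurrency": "GBP"
  }
};

console.log(JSON.stringify(bookRecord, null, 2));
```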
Steve Smith, President and CEO at Wiley: Persevering in the Middle
Steve Smith, President & CEO of Wiley |
Wiley traditionally focused its offering on pedagogical support and active teaching evaluation. Through its acquisition of Deltak, it now offers a total turn-key solution for the provision of higher education.
Transformation in the business has to go beyond digital. Go deep: it is no longer enough to be a provider of information. You must build a relationship with your community. Solutions are found through deep knowledge of customer workflows, finding ways to solve their pain points and go beyond their needs. You must focus on outcomes: in research, for example, recognise that the key driver to publish articles is reputation, and develop proven outcomes that support it.
He reflected that it has to be digital. Wiley's scholarly journals business is now 85% digital. They produce highly discoverable content using enrichment and semantic tagging. However, they still depend on library budgets, and these budgets are not growing to keep pace with spending on research and development.
There are some major challenges in the digital marketplace. In some segments of the business, substitution is an issue. Consumers find they can get access that is good enough to solve their needs, free to use and paid for by advertising. There is a change in the balance of power between device manufacturers and content distributors on one hand and content creators on the other. You have empowered and demanding consumers. As we have seen with ebook pricing, digital business models are often weaker than traditional, legacy models.
Wiley have responded by looking at pain points for customers and developing solutions through their value chain. They looked at the research cycle to see where they could provide business solutions to help the community. Smith broke this down into a cycle with four stages:
- Ideation: they provide competitive intelligence, insight and decision support, literature interaction and data review.
- Planning: opportunities to help with grant-writing, compliance and research planning.
- Experimentation: solutions around protocols, data management, data analysis and resource management.
- Dissemination: assistance with data sharing, IP protection, publication and networking.
You must leverage strengths and assets. How do you cope with the challenge of developing new business solutions while at the same time enhancing and protecting your core, legacy business? Focus on your content strengths and build on your deep knowledge of the communities you serve. Expand along the value chain of the customer and build, partner or acquire to deliver this (as Wiley did with Deltak).
Innovation that isn't customer led is not going to be successful.
Labels:
#contec13,
#fbm13,
alpsp,
Contec,
content,
publishing,
Stephen Smith,
Steve Smith,
technology
Saturday, 28 September 2013
The Future for Smaller Publishers: Louise Russell's Practical Guide to Online Hosting
Louise Russell: online hosting for smaller publishers |
Aggregation services (e.g. Ingenta Connect, Metapress, HighWire) have a number of advantages:
- visibility - power of the collective
- one-stop shop for librarians (access, COUNTER compliant reports, easier for library technology integration)
- consistent interface for readers
- search engine optimisation
- economies of scale - large sites providing industry standards, up-to-date services.
There are a number of considerations that may be an opportunity or a threat, depending on your circumstances. These include:
- file specifications - that can be quite prescriptive
- level of branding
- ability to customise functionality
- product roadmap - is it a good fit for your strategy?
- level of service and support.
Platform providers offer online systems that can be customised to your requirements. Companies include Atypon, pub2web, HighWire, Silverchair and Semantico. Advantages include increased control and the ability to brand. You can expect a better user interface, a wider range of functionality and tool kits, integration with back office systems, and customisation. Considerations include the price point, which can be higher for more customised options than for off-the-shelf solutions. You need to bear in mind the staff and resource impact internally, a product roadmap is required, and you should think about who owns the code.
Other alternatives include open source solutions such as Open Journal Systems, or self-hosting. These can provide building blocks and a roadmap driven solely by you, and can give you a competitive edge, but they require in-house expertise and lack the power of a collective.
When you think about online hosting you need to define the core functionality that is required. Elements include content delivery, access control and e-commerce, adherence to industry standards, integration with library technology, SEO, distribution, usage statistics and reporting. Optional additional services can include branding and end-user support; if the vendor does not offer the latter you will need to provide it yourself, but be warned, this can be time consuming.
Increasingly, hosting is about more than journals (books as well as journal articles). There are emerging business models, and you need to consider mobile delivery, semantic enrichment and international visibility. A tailored user experience could include localised services supporting specific workflows. Online hosting can provide more sophisticated reporting and interoperable services.
When choosing a supplier your selection criteria should include:
- Price point and functionality
- Compliance with industry standards
- Digital strategy / roadmap goals
- Age of product
- Service level agreements
- Culture/compatibility.
During the selection process you should aim to crystallise immediate and medium-term objectives, and define requirements through market research and stakeholder feedback. You need to consider the design and user interface, and put in place due diligence, such as a request for proposal (RFP) and demonstrations, for a robust decision-making process.
Managing a transition is critical. Think about the timing of the project and when it falls in the business year so you don't clash with key activity dates. Think about URL redirects, library technology, SEO and customisations. Undertake a content audit and inventory. Think about internal and external communication. You will need to create a project team to manage the process, include stakeholder testing and manage risk through contingency planning.
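On the URL redirect point, the usual approach is a mapping from old article URLs to their new locations, served as permanent (301) redirects so that links, bookmarks and search rankings survive the move. The sketch below is illustrative only; the paths and lookup structure are invented rather than any particular vendor's tooling.

```typescript
// Illustrative redirect map for a platform migration: old URL paths
// point to their new homes and are served as permanent (301) redirects.
// The paths below are invented examples.
const redirects = new Map<string, string>([
  ["/journals/abc/article/123", "/doi/10.0000/example.123"],
  ["/journals/abc/issue/45",    "/toc/abc/45"],
]);

function resolveRedirect(oldPath: string): { status: number; location: string } | null {
  const target = redirects.get(oldPath);
  return target ? { status: 301, location: target } : null;
}

console.log(resolveRedirect("/journals/abc/article/123"));
// { status: 301, location: '/doi/10.0000/example.123' }
```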
As a smaller publisher, you don't necessarily have to lag behind the technology adoption curve. Geoffrey A. Moore, in his book Crossing the Chasm (1991, revised 1999), outlined the different adopter groups. Size does not preclude you from being an early adopter or innovator.
- Innovators 2.5%
- Early adopters 13.5%
- Early majority 34%
- Late majority 34%
- Laggards 16%
When creating a digital strategy consider the wider business goals, define your success metrics, do market research and put in place building blocks for agile, flexible solutions. Make sure you have analytics (to measure, monitor and adapt) and a release strategy for launch. Work to industry standards (e.g. COUNTER 4, FundRef, CrossMark, ORCID, NLM DTD - JATS, KBART 1.0). Look at wider industry initiatives such as CHORUS, Kudos, Projects and sipx.
Russell closed with her blueprint for undertaking online hosting: define your digital strategy, identify success metrics, measure, monitor and refine and look at operational structure. The selection of host should reflect these core principles.
Labels:
#alpsp,
#alpspsme,
alpsp,
Future for Smaller Publishers,
Louise Russell,
online hosting,
Tutton Russell Consulting
Thursday, 26 September 2013
The Future for Smaller Publishers: Strategic marketing
Camilla Braithwaite: strategic marketing is key |
Strategic marketing is how a company positions itself within the competitive environment. It is the business end of marketing (rather than promotion), focusing on who customers are, what their needs and problems are, and what products and services you can develop to meet those requirements.
Smaller publishers face a number of challenges including (a lack of) market share and limited resources. They have to tackle the needs of a younger generation and find a way to negotiate open access mandates and new technology. In addition, competition for resources and building brand awareness can also be tricky.
Understanding your environment is the first step
Consider competitors' activities: who are they and what are they doing? Think in terms of your customers' workflow and who competes for their attention (it's not just other publishers; non-traditional competitors need to be considered). Author awareness is key: be aware of their rights. With many new entrants, the volume of content has a major impact on businesses large or small, so discoverability and awareness are as important as ever. And as both librarians and researchers continue to express dissatisfaction with big publishers' profits, this presents an opportunity for smaller organisations.
SMEs have a number of key strengths
These include proximity to the community base, niche knowledge and understanding, being viewed as the 'good' in publishing, and flexibility. A small publisher's greatest asset is its connection to its community. Use your grass-roots approach for advocacy and use your customer connections. Many smaller publishers will have Facebook, Twitter, a blog or forums they can use, and you can encourage, reward and debate with customers through these channels. A lot of publishers are taking an advocacy approach using social media (e.g. the British Ecological Society and SAGE Social Science), where you can position yourself at the heart of the community.
What else does your brand represent for your customers? Is it seen as relevant and useful? Can you improve its profile among a new generation of researchers? Use the external perspective of key stakeholder groups to build up your brand values - then identify gaps to work on building something stronger and more relevant for them.
Be agile and innovative
Because you are smaller you can respond to changes around you and harness creative ideas within your organisation. Innovation doesn't have to mean expensive, or be about bells and whistles. Jason Hoyt, co-founder of ALPSP Publishing Innovation Award winner PeerJ, recently said that rather than being content driven, publishers need to be more technology driven. What about being customer driven? Innovation workshops are a way to help you leapfrog barriers and meet a need that customers don't yet know they have.
Good sales and marketing strategies focus on strong territories and partnerships
This will enable you to build on the areas with the most potential and is a good way to get global representation. Set targets and use a dashboard to monitor, control and understand your sales pipeline. Assess where market penetration can be increased with some work, and use targeted sales support (e.g. telephone) to help achieve that.
Think about introducing clever pricing
Model it around the size of the institution and adopt tiered pricing models. Analyse current sales and model different subscribers to identify four different tiers. This works for customers, is manageable for subscription agents and can improve your sales. A key part of keeping attrition low with tiered pricing is communication with institutions when you introduce it: keep it as transparent as possible.
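As a rough illustration of how such tiering might be modelled, the sketch below maps an institution's size (by FTE count) to one of four tiers. All band boundaries and prices are invented examples, not figures from the session.

```typescript
// Illustrative tiered pricing: map an institution's size (FTE count)
// to one of four tiers. All band boundaries and prices are invented.
interface Tier { name: string; maxFte: number; annualPrice: number; }

const tiers: Tier[] = [
  { name: "Tier 1 (small)",      maxFte: 1_000,    annualPrice: 500 },
  { name: "Tier 2 (medium)",     maxFte: 5_000,    annualPrice: 900 },
  { name: "Tier 3 (large)",      maxFte: 20_000,   annualPrice: 1_400 },
  { name: "Tier 4 (very large)", maxFte: Infinity, annualPrice: 2_000 },
];

function priceFor(fte: number): Tier {
  // Return the first band whose upper bound covers the institution's size.
  return tiers.find((t) => fte <= t.maxFte) ?? tiers[tiers.length - 1];
}

console.log(priceFor(3_200).name); // "Tier 2 (medium)"
```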
Labels:
#alpsp,
#alpspsme,
alpsp,
Camilla Braithwaite,
Future for Smaller Publishers,
Suzanne Kavanagh,
TBI Communications