Tuesday 30 September 2014

Does innovation help smaller publishers?


These are exciting times in scholarly publishing, as ALPSP 2014 so amply demonstrated. But is technical innovation leaving the smaller publisher trailing behind better-resourced and larger competitors and the big technology companies? It ain’t necessarily so…

The early ‘electronic’ decades were characterized by high entry costs, modest technical skill levels, uncertainties and risk aversion. Only the larger publishers would venture to take the first cautious steps. Now the pace of technical change is almost dizzying, innovation is rampant, entry costs have fallen, skill levels are higher and, despite some apprehension, even small publishers are more comfortable with taking risks.

The fundamental elements of scholarly publishing haven’t changed that much. Authors still want to publish in the ‘best’ journals representing their community, with fast, reliable and responsive peer review. Libraries and readers still want affordability, timeliness, responsiveness to their needs. Price is important but quality and impact weigh heavily. None of these criteria are barriers to the smaller scholarly publisher.

Despite the rapidity of change, technical innovation has lowered the entry barriers, making it easier and less risky for smaller publishers and start-ups to provide new products and services: think of Peerage of Science for peer review, figshare for managing and sharing research output, DeepDyve for low-cost full-text preview, PACKT using Google to monitor technology trends and turning them into publishing opportunities, and many more initiatives reported at ALPSP 2014. Vast collaborative effort is going into standardization, data structuring and interoperability to level the playing field and make it easier for players to concentrate on their own USPs. A whole industry of service providers and outsourcers ensures that small scale is not an obstacle: intermediaries, hosts, discovery services, technology partners, CCC, CrossRef, COUNTER and more.

With all this help, small operators have the additional advantage that they can be more nimble and exploit opportunities more quickly than larger organizations. The proviso is that they understand their role, that they stay in close touch and work with their chosen academic communities, respond to their needs and constantly reinvent themselves (Coco Chanel: “In order to be irreplaceable one must always be different”). Even the big beasts in the field can be helpful: Google does offer well-used, free discovery services and is responsive to complaints about the negative impact of search algorithm changes. There were calls for more lobbying of services like Google, but the question of their market dominance is perhaps best left to the competition authorities.

‘Bells and whistles’ was the somewhat pejorative term used for service characteristics that are nice to have but miss what is really important to authors. For example, publishers and librarians care deeply about version control, but researchers don’t. There are a fair number of ‘predatory’ journals that offer little or no peer review even though the research community still values it highly. Impact factors are important for publishers but researchers, particularly younger ones, care more about the societal impact of their work, also reflected in the changing criteria of the Research Excellence Framework. Small publishers need to have these conversations with the research community and understand what matters to them.

It is to the credit of ALPSP that it has always been aware of the special concerns of niche publishers. It has done its best to foster skill building, provide networking opportunities, encourage standards and service provision and bring the different parts of the scholarly publishing universe together. No wonder its international conferences have been so successful!

Kurt Paulus
kurtpaulus@hotmail.co.uk

Friday 26 September 2014

The data deluge is upon us… are you ready?

Today sees the online publication (under an open access model) of our special data issue of Learned Publishing.

Produced with the support of Wiley, this collection of papers represents a snapshot of current thinking about research data from a variety of perspectives. It is guest edited by Alice Meadows, Director of Communications at Wiley, and Fiona Murphy, their STM Publisher.

This special issue is launched at the end of a week where Wiley Exchanges published some fascinating posts on different aspects of data to coincide with the Research Data Alliance annual conference in Amsterdam.

Liz Ferguson, Publishing Solutions Director at Wiley, hit the nail on the head with her observation “Acknowledging the significance of data in scholarly communication is one thing, but knowing what to do about it is another” in her piece Everybody Loves Data.

Jennifer Beal, Events & Ambassador Manager at Wiley, observed “Ah Big Data, how things have changed!” in her write-up of the Who’s Afraid of Big Data session from the ALPSP conference.

“Do you want to use my environmental data, or I yours? The question pulls in many conflicting directions.” Mike Kirkby, Emeritus at Leeds University, reflected on the many questions with complex answers that the use and storage of data presents in More data, more questions?

Fiona Murphy interviewed Mark Hahnel, Founder of figshare who believes that “Opening up research data has the potential to both save lives (say with medical advances) and to enhance them with socio-economic progress. It’s a space where humans and computers can work symbiotically, and where industry can also benefit.” He goes on to share his thoughts on the practicalities of opening data, blockages in the system and the potential for open science.

As more and more colleagues across the scholarly publishing community engage with open data, we hope this special issue will help them along the way.

Thursday 25 September 2014

What societies need to know about Creative Commons (CC) Licensing - free webinar

Want to know what a CC license is and what all the different types mean?
Which CC license is best for your journal’s authors and the future of your journal(s)?

What societies need to know about Creative Commons (CC) licensing
Tuesday, September 30th, 8-9am PDT/11am-12pm EDT/4-5pm BST

With new mandates being announced by funders globally for Open Access archiving of their funded research, societies need to understand what the different CC article licensing options mean, both for their journals’ and their members’ needs. This webinar will provide society executives with an overview of what they need to know about CC licences. Do you know your CC BY from your CC BY-NC-ND? What are the pros and cons of different CC licences for society journals? What options should you give your authors?

With speakers from Creative Commons, Copyright Clearance Center and Wiley.

To register your place, visit http://goto.copyright.com/LP=981

This webinar is organised by Wiley in conjunction with the Copyright Clearance Center

Tuesday 23 September 2014

Big data: mining or minefield? Kurt Paulus reflects...

Who's Afraid of Big Data? Not this panel...
"Data are the stuff of research: collection, manipulation, analysis, more data… They are also the stuff of contemporary life: surveillance data, customer data, medical data… Some are defined and manageable: a researcher’s base collection from which the paper draws conclusions. Some are big: police data, NHS data, GCHQ data, accumulated published data in particular fields. Two Plenaries and several other papers at ALPSP 2014 addressed the issues, options, opportunities and some threats around them.

There have long been calls for authors’ underlying research data to be made accessible, so as to substantiate research conclusions, suggest further work and so on. The main Plenaries concerned themselves with Big Data, usually unstructured sets of elements of unprecedented scale and scope, such as the whole of Wikipedia, accumulated Google searches, the biomedical literature, the visible galaxies in the universe. The challenge of ‘mining’ these datasets is to bring structure to them so that new insights emerge beyond those arising from limited or sampled data. This requires automation, big computing resources and appropriate but speeded-up human intervention and sometimes crowd sourcing.

Gemma Hersh from Elsevier on TDM
Text and data mining has some kinship with information discovery where usually structured datasets are queried, but goes well beyond it by seeking to add structure to very large sets of data where there is no or little structure, so that information can be clustered, trends identified and concepts linked together to lead to new hypotheses. The intelligence services provide a prime, albeit hidden example. So does the functional characterization of proteins, the mining of the UK Biobank for trends and new insights or the crowd-sourced classification of galaxy types to test cosmological theories.

Inevitably there are barriers and issues. The data themselves are often inadequate; for example not all drug trials are published and negative or non-results are frequently excluded from papers. Research data are not always structured and standardized and authors are often untutored in databases and ontologies. The default policy, it was recommended, should be openness in the availability of authors’ published and underlying data, standardized with full metadata and unique identifiers, to make data usable and mitigate the need for sophisticated mining.

CrossRef's Rachael Lammey
Because of copyright and licensing, not all data are easily accessed for retrieval and mining. Increasingly licensing for ‘non-commercial purposes’ is permitted but exactly what is non-commercial is ill-defined, particularly in pharmaceuticals. Organizations like CrossRef, CCC, PLS and others are beginning to offer services that support the textual, licensing and royalty gathering processes for both research and commercial data mining.

Rejecting the name tag Cassandra, Paul Uhlir of the National Academies urged a note of caution. Big Data is changing the public and academic landscape, harbouring threats of disintermediation, complexity, luddism and inequality and exposing weaknesses in reproducibility, scientific method, data policy, metrics and human resources, amongst others.


Paul F. Uhlir urges caution

Judging by the remainder of these sessions and the audience reaction, excitement was more noticeable than apprehension.

ALPSP of course is on the ball and has just issued a Member Briefing on Text and Data Mining (member login required) and will publish a special online-only issue of Learned Publishing before the end of this month."





Kurt Paulus
kurtpaulus@hotmail.co.uk

Friday 19 September 2014

Open Access rules OK – almost? Kurt Paulus reflects one week on from the ALPSP International Conference

Toby Green (centre) asks 'Why are we still not there?'
“Twenty years since Harnad, ten years since Budapest, but why are we still not there?” asked Toby Green in the second Plenary at ALPSP 2014. Well, the venerable Royal Society has launched its first OA journal, Open Biology, and has survived the experience. They are only one of many scholarly publishers: Phil Hurst of the RS claimed some 50% of learned societies are planning OA journals. Jackie Jones from Wiley gave structured advice on how and when to ‘flip’ the revenue model from subscription to Gold OA. So where are we?

It seems that much of the hesitancy surrounding this topic is fading away and we are now looking at how rather than whether to do it. Practical questions come to the fore: how long do we give authors APC waivers before they become fully liable (1-2 years), and what groups are entitled to more permanent waivers? The key customers continue to be the authors, but so too are funding bodies who underwrite APCs, and institutions who are targets for membership schemes. More internal customer service silos must disappear as the whole workflow is rethought.

If everything is rosy in the garden, why did Open Access pop up in almost every session? Well, we are still in the transitional stage between Hybrid, Green and Gold, and progress towards a more common approach is still patchy. The mandating issues are very much on the table, with only the UK and USA relatively self-assured. In the EU the new Commission will need time to settle in, and mandating policy may not be its first priority. Thinking about OA in Australia and New Zealand is positive, as it is in China – where scientific research output is blossoming – while developments in South America, perceived as a significant future market, are less coherent.

Hybrid, Green or Gold? You decide.
Behind the front line of even small publishers taking the plunge, there are other developments that shine a light on the changing landscape. Ottawa University Press is in partnership with the university library, which financially supports some OA book titles. OECD, amongst others, uses the 'freemium' model, where read-only is free but download, print and other services are paid for, and it claims it works. OA repositories are adding to the exposure of published works, and bring PhD theses and research reports to readers’ attention. Scholarly publishing in libraries is growing the younger author pool, a trend rather more prominent in the USA than the UK.

Simon Ross (left) presents Fred with his Award
Many details remain to be chewed over: Should APCs apply only to published papers when rejected papers also incur a high cost? ‘Value for money’ becomes more of an issue now that the cost of publishing is out in the open. It is in the joint interests of libraries and publishers to support each other; what’s the best way to maximize exposure and discovery of the work? Almost the last comment of the last Plenary was that most if not all new journal launches will be Open Access; another example of the glass half full.

There was genuine delight when Fred Dylla, CEO of AIP and a driver of clear thinking about OA policies, was announced as the winner of this year’s ALPSP Award for Contribution to Scholarly Publishing.


Kurt Paulus
kurtpaulus@hotmail.co.uk

Monday 15 September 2014

A call to all ambitious society publishers from Susan Hezlet

ALPSP Committees and Council are at the heart of what we do. They are a group of dedicated industry volunteers who advise, steer and guide the secretariat, providing strategic direction and practical support so we can connect, inform, develop and represent the scholarly publishing community.

Susan Hezlet, Publisher at the London Mathematical Society and outgoing member of the ALPSP Council, reflects on the pains and pleasures of volunteering in this guest post.

"About ten years ago I was asked to give a talk at an ALPSP seminar, it wasn't particularly good, but the next thing was a request 'would I join the Professional Development Committee?' This involved co-organising a seminar - I think it was with Felicity Davie and Karen Hillmansen. That was fun, I learnt a lot in the process and we were guided and kept to the task with the help of Lesley Ogg.

The next request was 'would I be Treasurer of ALPSP?' Five years followed: a lot of direct involvement in authorising payments, helping with the interviewing and appointment of a Chief Executive, being asked to attend meetings on open access policy development, and trotting over to the Department for Business, Innovation and Skills to tell them how important scholarly publishing is - not that they still do regular meetings since Vince Cable moved in.

Then it was 'would I do a second term?' This was followed by me saying no, but it would be nice to do one more year on Council as an ordinary member. Three years later... and I'm finally finished.

Perhaps this doesn't sound like fun? However, it is perhaps the best free education you can get in publishing! I have had good times with my fellow publishers on the PDC and ALPSP Council, all of whom are way more senior and knowledgeable than me. All of the people who work for ALPSP are highly professional and generous in the extra time they devote to supporting the rest of us.

It was through ALPSP connections that I managed to find and persuade some excellent publishers and consultants to come and join our Publications Committee at the London Mathematical Society. I have learnt a great deal from them and over the years it has transformed the committee, from one where the Editors spent time on the use of the Oxford comma, to a business committee with a strategic plan (thanks Mark!)

As a small society publisher who spends most of her time being a pain in the arse for my larger publishing distributors, I would not have had the confidence to ask and occasionally insist on a decent service from them without the conversations and support of people involved with ALPSP.

On the second day of the 2014 conference, while I should have been listening to even more good advice from the great speakers we had at the event, I was tucked away with the first round of business proposals from no less than nine publishers. Without the networking and experience of dealing with senior publishers in ALPSP, I would not have known where to begin. It has been a fascinating read and one of these days I will write my memoirs...

Finally, I have made some good friends. In the long run, there is nothing more important.

So. This is a call to all ambitious Society Publishers who want to learn something beyond their own field of publishing: volunteer!"

Huge thanks go to Susan for her time on PDC and Council. She can be contacted via the London Mathematical Society website.

Friday 12 September 2014

Open access: the daily challenge (new customers, processes and relationships)

Phil Hurst, Jackie Jones, Wim van der Stelt
Springer's Wim van der Stelt chaired the final panel at the 2014 ALPSP International Conference. The panellists reflected on the daily challenges of starting an open access product and how the new business model fundamentally alters the publication chain.

Phil Hurst from the Royal Society talked about how they approached launching OA journals. They had a gap in their portfolio and thought the best way to fill it was to launch an OA journal. Open Biology was their first online-only journal.

They learnt that many things are the same. Getting the right people on the board is key: you need top people. Content is still king and they need to get the journal in the right places. Open access is a big benefit to everyone; they didn't really realise this until they launched the journal. The benefits to stakeholders include speeding up science. There is greater visibility for universities and funders. For researchers there is increased visibility for their results. It was good for them to get involved in the OA community.

Much of the marketing is the same, but they supplemented this with an OA membership. Authors from member institutions receive a discount on pure or hybrid open access. They incorporated a wider range of metrics, including DORA and altmetrics, as measures of research impact. It has also provided them with a springboard. They launched the journal to learn and prepare for the future. OA is consistent with the mission of learned societies. Sustainable? The jury is still out; we will only learn by putting OA journals to the test.

Alex Christoforou from BioMed Central asked who is the actual customer for an OA publisher? They have hundreds of journals across BMC and Springer with hundreds of members, thousands of authors and transactions. What they all need is access to the publisher, support and excellent customer service.

Alex Christoforou
They continue to provide some good old fashioned and reassuring tools to support authors such as a fax number, photos of the team on the web and lots of different ways to contact them. They deal with 4,000 customer service queries each quarter, and the number is increasing. They have to provide some kind of service 24 hours a day so they can turn around queries in one day. They even provide online banking for those that spend large amounts of money.

Customer service is not just complaints, payments and author services. It's a way of thinking across the organisation so that all stakeholder groups can build a constructive relationship with them and business can grow over time.

Jackie Jones from Wiley talked about subscription versus open access and 'flipping' journals. They have flipped eight journals so far and it is early days. Some of the key flip criteria they assess include whether a journal has modest subscription revenue; typically these are young journals that haven't achieved predicted revenues. They also look at areas where there is evidence of good OA funding and existing hybrid activity, where there is longer term growth potential or attractiveness to authors, and they consider the ratio of current revenue to articles.

For publishers, flipping can lead to potential for faster growth, but there is higher volatility of revenues. From a society perspective, it provides an opportunity to explore OA, although there is risk commercially and editorially. From the funder point of view, some prefer fully OA journals where Gold OA is the only option. From an institutional point of view there is no subscription fee, but you have to track APCs and costs. For tools and modelling they have flow charts and decision trees to help monitor and track submissions and revenues.

EMBO Molecular Medicine launched in 2009. It had modest subscription sales and the society had concerns about visibility. There was an 85% rejection rate. The per-page publication charge was 125 euros and, pre-flip, the average author cost was 1,600 euros. They set the APC at 3,000 euros in line with other journals in the society's portfolio. Submissions doubled in the launch period. On other flipped journals there was an initial dip in submissions, but they do recover.

They have learnt that you need to plan ahead and time communication really carefully. Make sure papers in hand are under the new model so you don't have to waive fees. Don't flip mid-year, to avoid the complications of subscription reimbursements. Undertake submission and publication surveys.

Welcoming the robots

Mark Bide, Chairman at the Publishers Licensing Society chaired the penultimate panel at the ALPSP International Conference on text and data mining (TDM).

Gemma Hersh, Policy Director at Elsevier, talked through the Elsevier TDM policy. It has been controversial, with calls to change it. Central to their policy is the use of the ScienceDirect API, designed to help preserve the performance of the website for everyone else.

One controversy is that a license is Elsevier's way of exerting control. However, they have a global license (which complies with the UK copyright exception and balances different copyright frameworks). Another complaint is around the click-through agreement: critics believe it controls what researchers are doing and takes control away from libraries to place liability on researchers. However, it is an automatic process, there is no additional liability, it is aligned with the institutional e-amendment, provides guidelines on reuse and can offer one-to-one support.
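
To make the mechanics concrete, here is a minimal sketch of what polite, rate-limited bulk retrieval through a publisher's TDM interface might look like. The endpoint, API key and parameter names below are placeholders for illustration only, not Elsevier's actual ScienceDirect API.

```python
import time
import requests

# Hypothetical endpoint and key, for illustration only; a real integration
# would follow the publisher's documented TDM API and licence terms.
API_ROOT = "https://api.example-publisher.org/fulltext"
API_KEY = "YOUR-TDM-API-KEY"

def fetch_fulltext(doi, delay_seconds=1.0):
    """Fetch one article's full text for mining, pausing between calls so
    that bulk retrieval does not degrade the platform for other users."""
    response = requests.get(
        f"{API_ROOT}/{doi}",
        params={"apiKey": API_KEY, "httpAccept": "text/xml"},
        timeout=30,
    )
    response.raise_for_status()
    time.sleep(delay_seconds)  # simple rate limiting
    return response.text

corpus = [fetch_fulltext(doi) for doi in ["10.1000/example.1", "10.1000/example.2"]]
```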

Another complaint is that they didn't allow text mining of images. The reason was that they did not hold copyright in all the images, so they would do it on request. However, they now do it automatically and include terms of use that flag when users need to contact the copyright owner where copyright doesn't lie with Elsevier.

There were criticisms that they were trying to claim copyright over TDM output. This was inadvertent and they have adjusted the policy to be a little more flexible and take this into account. A final misconception was that the policy was rigid.

In Europe, they have signed a commitment to facilitate TDM for researchers, but their policy is global. They are also a signatory of CrossRef and think the new service is good.

Mark Bide introduces the panel
Lars Juhl Jensen, based at the Novo Nordisk Foundation Center for Protein Research at the University of Copenhagen, provided an academic perspective on TDM. He considers himself a pragmatic text and data miner. The volume of biomedical research that he has to read is huge. Making sense of structured and unstructured data is key. All he wants to do is data mine. It enables him to do things such as associate diseases and identify conditions. Once you've got the data from text mining, you can then bring it together with experimental data, and from other sources.

As a researcher doing text mining, he needs the text. He doesn't want much else. The format doesn't matter too much. If he can get it in a convenient format, great. The licence has to be reasonable.
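
As a rough illustration of the kind of mining Jensen describes (and not his actual pipeline), the sketch below counts co-occurrences of disease and gene terms across abstracts; the term lists and abstracts are toy examples, and a real system would use curated ontologies and proper named-entity recognition.

```python
from collections import Counter
from itertools import product

# Toy term dictionaries; a real pipeline would use curated ontologies.
DISEASES = {"diabetes", "asthma", "melanoma"}
GENES = {"TP53", "BRAF", "INS"}

def cooccurrences(abstracts):
    """Count how often each (disease, gene) pair appears in the same abstract.
    High counts suggest associations worth combining with experimental data."""
    counts = Counter()
    for text in abstracts:
        tokens = {tok.strip(".,;()").upper() for tok in text.split()}
        found_diseases = {d for d in DISEASES if d.upper() in tokens}
        found_genes = {g for g in GENES if g in tokens}
        counts.update(product(found_diseases, found_genes))
    return counts

abstracts = [
    "BRAF mutations are frequent in melanoma and drive proliferation.",
    "INS variants alter insulin secretion in type 2 diabetes.",
]
print(cooccurrences(abstracts).most_common())
```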

Andrew Clark, Associate Director Global Information and Competitive Intelligence Services at UCB, articulated what TDM means and the part it plays in the scientific industry. He recounted the work of the Pharma Documentation Ring (P-D-R). Their aims are to:

  • Promote exchange of experience/networking among members
  • Encourage commercial development of new information services and systems
  • Jointly assess new and existing products and services
  • Provide a forum for the information industry

Gemma Hersh, Lars Juhl Jensen and Andrew Clark
Literature and patent analysis, sentiment analysis and drug safety are just a few of the benefits of TDM. One of the challenges is around the unstructured format that the data comes in. They need several aggregators to make the data mineable. It's not always easy to get the datasets - from small publishers to large ones. It's quite expensive and labour intensive.

There are high costs for setting up your data mining, and there is often a lack of technical skills in the organisation.

There are benefits to TDM that include a managed and, in some cases, auditable process for protecting IP. It provides added value and potential new revenue streams. Clark closed with a call for industry collaborations and asked everyone to watch this space.

Thursday 11 September 2014

ALPSP Awards spotlight on... Frontiers, a community-run open-access publisher and research network

Kamila Markram is co-founder and CEO of Frontiers
The ALPSP Awards for Innovation in Publishing will be announced at the conference this week. In the final post in our series about the finalists, Kamila Markram, co-Founder and CEO of Frontiers, answers questions about the Frontiers Open-Science platform.

ALPSP: Tell us a bit about your company

KM: We founded Frontiers in 2007 to enable researchers to drive open-access publishing. To achieve this, we built an Open-Science platform with innovative web tools that support researchers in every step of the publishing process. These include collaborative peer review, detailed article and author impact metrics, democratic research evaluation and social networking.

From our beginnings as a group of just a few scientists, Frontiers has evolved to be the fourth leading open-access publisher worldwide. We have published almost 24,000 articles and are on track to publish our 30,000th article before the end of 2014. Our portfolio of open-access titles is also growing rapidly: in just 7 years, we have launched 48 open-access journals across all STM fields.

ALPSP: What is the project that you submitted for the Awards? 

KM: The Frontiers Open-Science platform, which embodies our community-driven philosophy and hosts our innovative online tools to improve all aspects of reviewing, publishing, evaluating and disseminating articles.

Frontiers is a community run, open access academic publisher and research network

ALPSP: Tell us more about how it works and the team behind it. 

KM: Our growing community consists of almost 50,000 leading researchers on the editorial board and more than 100,000 authors. In Frontiers, researchers run the journals and take all editorial decisions. Behind the scenes, we have a team of 140 employees in our headquarters in Lausanne and in offices in Madrid and London. These include mainly journal managers and editorial assistants, who support our editors and authors in the publishing process, as well as software engineers who continuously develop our publishing and networking platforms. We are a highly motivated and dynamic team, many of whom hold PhDs in diverse disciplines and come from many nationalities. Crucially, we all believe that science forms the basis of modern society and that we need to improve the publishing process, so that we all, researchers and society, can benefit from a discovery as quickly as possible.

ALPSP: Why do you think it demonstrates publishing innovation? 

KM: Our approach is unique: we work with leading researchers across all academic communities and empower them with our latest custom-built web technologies to radically improve publishing.

We introduced the novel concept of “Field Journals” – such as Frontiers in Neuroscience – which are structured around academic communities and into specialty sections, such as Neural Circuits, with their own editorial board and which can be cross-listed across journals. This modularity gives synergy between related disciplines, strengthens niche communities, and makes it easy for authors and readers to find the content that interests them.

Also central to our publishing model is the Collaborative Peer Review we introduced. It safeguards authors' rights and gives editors the mandate to accept all articles that are scientifically valid and without objective errors. The review occurs in our online Interactive Review Forum, where authors engage in discussions directly with reviewers to improve the article. It is constructive and transparent, because we publish the names of reviewers on accepted articles. This ensures high quality of reviews and articles. It works – as confirmed by our high impact factors, views and downloads. On top of that, our online platform makes the process fast – with an average review time of 84 days.

We were also the first publisher to develop, in 2008, detailed online article metrics to measure views, downloads, and shares with a breakdown of readership demographics, and we were an early adopter of a commenting system for post-publication evaluation. And we are also the only publisher that uses these article-level metrics, not only to highlight the most impactful articles as selected by thousands of expert readers, but also to translate these discoveries into “Focused Reviews” that make them more accessible to a broader readership. Post-publication evaluation at Frontiers is democratic and objective, using the collective wisdom of numerous experts.

Lastly, we are the first and only publisher to completely merge our own custom-built networking technology with an open-access platform, to raise the visibility and impact of authors and disseminate their articles more efficiently — readers are provided with articles that are the most relevant to them.


Frontiers - Open-Access Publication and Research Networking from Frontiers on Vimeo.

ALPSP: What are your plans for the future? 

KM: To keep growing, innovating and providing the best tools and service. We will continue to bring our publishing model to all STM fields, and also across the humanities and social sciences in the near future. At the same time, we are improving our networking platform, to enable even better dissemination of articles and to increase visibility and impact. Another growing initiative is Frontiers for Young Minds, a new science journal for kids. Young people – aged 8 to 15 – act as reviewers of neuroscience papers by leading scientists. It is a fun, important and engaging way to get children curious about science and let scientists reach out to a young audience. Launched less than a year ago, Frontiers for Young Minds has already been listed as one of the “Great Websites for Kids” by the American Library Association. And by popular demand, we are now about to expand the project across other fields, including astronomy, space sciences and physics.

The ALPSP Awards for Innovation in Publishing are sponsored by Publishing Technology. The winners will be announced tonight at the ALPSP International Conference Wednesday 10 - Friday 12 September, Park Inn Heathrow, London.

Follow the conversation via #alpsp14 and #alpspawards on Twitter.

Who's afraid of big data?

Who's afraid of big data? panel
Fiona Murphy from Wiley chaired the final panel on day two of the 2014 ALPSP International Conference. She posed the question: how do we skill up on data, take advantage of opportunities and avoid the pitfalls?

Eric T. Meyer, Senior Research Fellow and Associate Professor at the University of Oxford, was first up to try to answer it. He observed how a few years ago you would struggle to gain an audience for a big data seminar. Today, it's usually standing room only.

Big data has been around for years. People were quite surprised when Edward Snowden leaked the NSA documents, but surveillance on that scale had been going on for a long time. Big data in scholarly research has also been around a long time in certain disciplines such as physics or astronomy. There was always money to be made in big data, but there's even more now, and everyone is starting to realise it. So much so that you need a big data strategy.

Meyer defines big data as data unprecedented in scale and scope in relation to a given phenomenon. It is about looking at the whole datastore rather than one dataset. Big data for understanding society is often transactional. We're talking really big. If you can use it on your laptop, it won't be big data.

Meyer drew on some entertaining examples of how big data can be used. If you key in the same sentence in different country versions of Google you'll see how the responses vary. There are limits to big data approaches: they can come up with misleading results. What happens when bots are involved? Does it skew the results? The challenge will be how you can make it meaningful and more useful.

David Kavanagh from Scrazzl reflected on the challenge researchers face when making decisions about how to structure and plan their experiments. If you want to leverage collective scientific knowledge and identify which products to use for your work, there wasn't a structured way of doing this. Kavanagh urged publishers to throw computational power at data and content as a way to solve problems, improve how you work and help make sense of unstructured content.

That's what they have tackled with Scrazzl, which is a practical application of the structured and unstructured data that Eric Meyer mentioned. You need to have a product database. Then you have to cut out as much human intervention as you can. Automation is key. Where they couldn't find a unique identifier or a catalogue match, they had to make it as fast as possible for a human to make an association. Speed is key.

Finally, they built a centralised cloud system in which vendors can update their own records. It's a crowd-sourced system for those who have a vested interest in keeping it up to date. The opportunity for them going forward will be releasing this structured information through APIs to drive new insights. It also allows semantic enablement of content and offers the opportunity to think about categorisation in new ways.
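
A minimal sketch of the automation-first matching step described above might look like the following; the catalogue, identifiers and cutoff are hypothetical, and Scrazzl's actual system will differ.

```python
import difflib

# Hypothetical catalogue of research products keyed by a unique identifier.
CATALOGUE = {
    "AB-0001": "Anti-GFP rabbit polyclonal antibody",
    "KT-0420": "RNA extraction mini kit",
}

def match_product(mention, cutoff=0.85):
    """Try to match a product mention in an article to a catalogue record.
    Confident matches are resolved automatically; anything below the cutoff
    is queued for a fast human decision, as described above."""
    scored = [
        (difflib.SequenceMatcher(None, mention.lower(), name.lower()).ratio(), pid)
        for pid, name in CATALOGUE.items()
    ]
    score, pid = max(scored)
    if score >= cutoff:
        return {"status": "auto", "product_id": pid, "score": score}
    return {"status": "needs_review", "best_guess": pid, "score": score}

print(match_product("anti-GFP rabbit polyclonal antibody"))
print(match_product("custom CRISPR guide library"))
```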

For publishers running an ad-supported model, the collection of products identified in the content can be used to work out which advert is best suited to the reader.

Paul F. Uhlir
Paul F. Uhlir from the Board on Research Data & Information at The National Academies observed that even after 20 years, we have yet to deconstruct the print paradigm and reconstruct it on the Net very well. In the 1980s a gigabyte was considered a lot of data; in the 1990s, a terabyte was a lot of data. In this decade, the exabyte era is not far ahead of us, with a whole lot more beyond it.

There are huge databases in business, mining marketing information and other data, plus the Internet of Things and the semantic web. Everything now can be captured, represented and manipulated in a database. It's an issue of quantity, but there is also an issue of quality, and there needs to be a social response. There are a series of threats around big data.



Disintermediation

The rise of big data promises a lot more disruption. Think about 3D printing. The consequence could be millions of product designers specifying items. Manufacturing will be affected. Jobs will be lost. What will happen to the workers in a repair and body shop when cars are driverless? What will happen to the insurance industry? Workers will be disintermediated. What is certain is that there will be massive labour shifts and disruptions.

Playing God

Custom organs for body parts. The ability to insert genes into another organism. All these applications are data intensive and will become even more so. They have profound social and ethical issues and the potential to do great harm.

End of Privacy

Meyer touched on the NSA files. What about spying satellites? The ubiquity of CCTV? These images are kept in huge databases for future use. Product information is held and used to identify preferences by private companies. There is no such thing as privacy any more.

Inequality

Big data are an increasingly powerful means of increasing a hold on global power.

Complexity

The more we learn, the less we know. Any scientist will tell you that greater understanding leads to more questions than answers.

Luddite reactions

Some people react to the encroachment of the strange and frightening techniques of the technology age with passive resistance, trying to lead a simple life.

There are also a number of weaknesses that centre around:

  • Improving the policies of the research community
  • New or better incentive mechanisms versus mandates
  • Explicit links of big data to innovation and tech transfer
  • Changing legal landscape: lag in law, bad law, IP law
  • Data for policy: communicating with decision makers.




Industry updates: Publons

Andrew Preston
Andrew Preston from Publons outlined their focus on peer review. As a crucial part of the publishing process, peer review is a leading indicator; it's what the experts thought. It is also valuable content.

Publons is about recognition for good review and a measurable research output for reviewers and editors. It is proof of a quality review process for journals.

They believe openness breeds quality. They provide tools for editors. They measure impact, help them engage with reviewers, and assist with finding, vetting and connecting with reviewers. Finally they build communities to help generate engagement, combining pre- and post-pub reviews with searchable, indexed content.

Publons in numbers

The journal adds a review. Publons emails a unique token to the reviewer. The reviewer signs in and selects privacy settings. Publons combines and respects the views of the journal editor as well as the reviewer on how much content to show. There are two versions of the review: public and private. They have found that regular reviewers review up to 50 articles a year. It can be a leading indicator of expertise.
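
The claim workflow described above can be illustrated with a toy sketch like the one below; the data model and function names are invented for illustration and are not Publons' actual implementation.

```python
import secrets

# In-memory stand-in for persistent storage; purely illustrative.
reviews = {}  # token -> review record

def add_review(journal, manuscript_id, review_text):
    """Journal deposits a review; a unique token is generated to email to the reviewer."""
    token = secrets.token_urlsafe(16)
    reviews[token] = {
        "journal": journal,
        "manuscript": manuscript_id,
        "text": review_text,
        "claimed_by": None,
        "visibility": "private",  # default until reviewer and editor agree otherwise
    }
    return token

def claim_review(token, reviewer_name, show_content=False):
    """Reviewer signs in with the emailed token and chooses privacy settings."""
    record = reviews[token]
    record["claimed_by"] = reviewer_name
    record["visibility"] = "public" if show_content else "private"
    return record

token = add_review("Journal of Examples", "MS-123", "Sound methods; clarify figure 2.")
print(claim_review(token, "Dr A. Reviewer", show_content=True))
```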

The service highlights the quality of the review process by showing reviews and reviewers. It helps build a community. It helps with altmetrics as they always link to your content. Every review on Publons is eligible to appear in Altmetric, so you will never see a zero again. This helps to build out the long tail of articles that don't get picked up in the press! They always link back to content. The article is not their focus, so they generate clicks back to the publisher's website. There is a suite of editor tools to help them find reviewers. For publishers, it helps to get more submissions as well as better and faster reviews. It boosts article-level metrics and generates post-publication discussion.

Further information available on the Publons website. If you are a publisher and would like to see some metrics about reviews on Publons, complete this online form.

Metrics and more

Melinda Kenneway on metrics
Publication metrics are part of a much bigger picture. Where resources are restricted and there is a lot of competition, metrics become more essential to help prioritise and direct funding. The 'Metrics and more' 2014 conference session was chaired by Melinda Kenneway, Director at TBI Communications and Co-founder of Kudos. The panel comprised Mike Taylor, Research Specialist at Elsevier Labs, Euan Adie, Founder of Altmetric and Graham Woodward, Associate Marketing Director at Wiley. Kenneway opened by observing that as an industry we need to consider new types of metrics.

Publication performance metrics include:
  • Anti-impact factor: DORA
  • Rise of article level metrics
  • Introduction of altmetrics
  • New units of publishing: data/images/blogs
  • Pre-publication scoring (Peerage of Science etc)
  • Post-publication scoring (assessment, ranking etc)
  • Tools for institutional assessment

Researcher performance metrics include:

  • Publication output
  • Publication impact
  • Reputation/influence scoring systems
  • Funding
  • Other income (e.g. patents)
  • Affiliations (institutional reputation)
  • Esteem factors
  • Membership of societies/editorial boards etc
  • Conference activity
  • Awards and prizes

Institutional performance metrics include:

  • University ranking systems
  • Publication impact metrics
  • STAR/Snowball metrics
  • Research leaders and career progression
  • Patents, technologies, products, devices
  • Uptake of research

Graham Woodward, Associate Marketing Director at Wiley, provided an overview of a trial of altmetrics on a selection of six titles. On one article, after a few days of having altmetrics on the site, they saw the following results: c. 10,000 click throughs; average time on page over three minutes; over 3,500 tweets; an estimated 5,000 news stories; 200 blog posts; and 32 F1000 recommendations.

Graham Woodward
They asked for user feedback on the trial and the 50 responses provided a small but select snapshot that enabled them to assess the effectiveness of the trial.

Were the article metrics supplied on the paper useful? 91% said yes. What were the top three most useful metrics? Traditional news outlets, number of readers and blog posts. 77% of respondents felt the experience enhanced the journal.

Half of respondents said they were more likely to submit a paper to the journal. 87% used the metrics to gauge the overall popularity of the article, 77% to discover and network with researchers who are interested in the same area of work and 66% to understand the significance of the paper within its scientific discipline.

What happened next? The completion of the six-journal trial was followed by an extension to all OA journals. They have now rolled out metrics across the entire journal portfolio.

Euan Adie from Altmetric reflected on the pressures and motivations on researchers. While there is a lot of pressure within labs for young researchers, funders and institutions are increasingly looking for or considering other types of impact, research output and contribution. There is an evaluation gap between funder requirements and measuring impact. That's where altmetrics come in. They take a broader view of impact to help give credit where it is due. HEFCE are doing a review of metrics within institutions at the moment.
Euan Adie

Seven things they've learnt in the past year or so.

  1. Altmetrics means so many things to so many people. But the name doesn't necessarily work: it is complementary rather than alternative, and it is about the data, not just the measure.
  2. It has worked really well for finding where a paper is being talked about in places they wouldn't have known about before, and also the demographics behind it.
  3. Altmetrics is only a weak indicator of citations, but the whole point is to look beyond them. Different types of sources correlate to different extents.
  4. Don't take all altmetrics indicators as one lump; there are many different flavours of research impact (a per-source breakdown, sketched after this list, is more informative than a single number).
  5. When you have an indicator and you tie it to incentives, it immediately corrupts the indicator. While he doesn't believe there is massive gaming of altmetrics, there is an element of this with some people. It's human nature.
  6. The top 5% of altmetric scores are not what you expect. The most popular paper is a psychological analysis of the characters in Winnie the Pooh.
  7. Peer review is a scary place. Scientists and researchers can be pretty nasty! Comments can be used in a different (more negative) way than expected. But that is not necessarily a bad thing.
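
As a toy illustration of that fourth point, the sketch below keeps a per-source breakdown of attention alongside any composite number; the source weights and data are invented and are not Altmetric's actual scoring.

```python
# Illustrative source weights and data; real altmetrics services weight
# and source their data differently.
SOURCE_WEIGHTS = {"news": 8, "blog": 5, "twitter": 1, "policy": 3}

def attention_breakdown(mentions):
    """Summarise attention per source rather than collapsing everything into
    one lump, since each source reflects a different flavour of impact."""
    breakdown = {src: mentions.count(src) for src in SOURCE_WEIGHTS}
    composite = sum(SOURCE_WEIGHTS[src] * n for src, n in breakdown.items())
    return breakdown, composite

mentions = ["twitter"] * 40 + ["news"] * 2 + ["blog"] * 3
per_source, score = attention_breakdown(mentions)
print(per_source)  # {'news': 2, 'blog': 3, 'twitter': 40, 'policy': 0}
print(score)       # a composite number is only meaningful next to the breakdown
```
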
Mike Taylor believes we are approaching a revolution rather than an evolution. What we have at the moment is a collision of different worlds, because the value of and interest in metrics is increasing. What makes for great metrics, and how do we talk about them? Do we want the one-size-fits-all approach? We have data and metrics, and in between those two things there is theory, formulae, statistics and analysis. Within the gap between the two there are a lot of political issues.

Taylor reflected on the economies of attention (or not) and how you assess if people are engaged. With an audience, when hands go up, you know they are paying attention, but no hands doesn't mean they aren't. Metrics so far are specialist, complex, based on 50 years of research, mostly bibliometrics/citation based, and much is proprietary. The implications of the changing nature of metrics are: as metrics are taken more seriously by institutions, their value will increase. As the value increases, we need to be more aware of them. As a scholarly community we need to increase awareness about them. Awareness implies critical engagement, mathematics, language, relevance, openness, agreement, gold standards, and community leadership.

Mike Taylor
We are experiencing a collision of worlds. Terms like 'H-Index' are hard to understand, but are well defined. Terms like 'social impact' sound as if they're well defined, but aren't. There are particular problems about the 'community' being rather diverse. There are multiple stakeholders (funders, academics, publishers, start-ups, governments, quangos), international perspectives and varying cultures (from fifty years of research to a start-up). 

Taylor suggested an example metric - 'internationalism'. Measures could include: how well an academic's work is used internationally; how internationally that academic works; readership data; citation analysis (cited and citing); co-authorship; funding data (e.g. FundRef); conference invitations (e.g. via ORCID); guest professorships; and text analysis of content.
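
Purely as an illustration of how such a metric might be assembled (not a proposal Taylor made in this form), the sketch below computes an 'internationalism' indicator from citing-country data; the inputs and the definition are assumptions.

```python
from collections import Counter

def internationalism(home_country, citing_countries):
    """Toy 'internationalism' indicator: the share of citations (or readers)
    coming from outside the academic's home country, plus the number of
    distinct countries reached. A real metric would combine several sources."""
    counts = Counter(citing_countries)
    total = sum(counts.values())
    if total == 0:
        return {"international_share": 0.0, "countries_reached": 0}
    abroad = total - counts.get(home_country, 0)
    return {
        "international_share": round(abroad / total, 2),
        "countries_reached": len(counts),
    }

print(internationalism("GB", ["US", "US", "DE", "GB", "CN", "FR", "GB"]))
# {'international_share': 0.71, 'countries_reached': 5}
```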

Taylor doesn't think metrics is a place where publishers will have the same kind of impact that they might have had 30 years ago. He said to expect to see more mixed metrics with qualitative and quantitative work. Taylor concluded that metrics are being taken more seriously (being used in funding decisions). Many stakeholders and communities are converging. 

Big data + cloud computing + APIs + openness = explosive growth in metrics. 

It is a burgeoning research field in its early days. Publishers need to be part of the conversation. We need to enable community leadership and facilitate decision making.

Wednesday 10 September 2014

ALPSP Awards for Innovation in Publishing - the finalists

ReadCube Connect
The final lightning session from the first day of the ALPSP International Conference showcased the finalists for this year's Awards for Innovation in Publishing sponsored by Publishing Technology.

bioRxiv from Cold Spring Harbor Laboratory Press

The preprint server for biology, operated by Cold Spring Harbor Laboratory, a research and educational institution.

Edifix from Inera Inc.

Edifix is a web service that copyedits, corrects, and links bibliographic references in a number of formats.

Frontiers open science platform

Frontiers is a community-rooted open-access publisher and research network that empowers researchers to take charge of publishing and builds online tools to review, evaluate and disseminate science.

IOP ebooks™ from IOP Publishing

A brand new book programme that brings together innovative digital publishing with leading voices from across physics to create the essential collection of physics books for a digital world.

JournalGuide from Research Square

JournalGuide brings all sources of data together in one place to give authors a simple way to choose the best journal for their research.

ReadCube Connect from Labtiva Inc

ReadCube Connect is an HTML5-powered interactive PDF viewer that seamlessly integrates into your article pages, keeping readers onsite and connected.

Rightslink for Open Access

RightsLink for Open Access from the Copyright Clearance Center

RightsLink® for Open Access streamlines the entire author fee transaction by seamlessly integrating with editorial and production workflows.

The Awards will be announced tomorrow night at the conference dinner. Follow #alpspawards on Twitter for the results!

Customers as competitors

Customers as competitors? Anderson, Taylor-Roe and Horova reflect.
The first plenary at the ALPSP International Conference 2014 focused on increased competition from the least likely sources - our customers - with the advent of digital publishing lowering barriers to entry.

Rick Anderson, Associate Dean for Scholarly Resources & Collections at the Marriott Library, University of Utah (also known as a Scholarly Kitchen Chef) chaired a panel comprising Jill Taylor-Roe, Deputy Librarian at Newcastle University Library, Tony Horova, Associate University Librarian at the University of Ottawa and Graham Stone, Information Resources Manager at the University of Huddersfield.

The University of Ottawa is the world's largest bilingual university and the Press and library have a close relationship at management and editorial level. They generate $300,000 in sales each year with a simultaneous print and digital programme. Tony Horova shared the interim results of a research project they ran to track the results of a Gold OA partnership. The partnership between the University of Ottawa Press and the library was based around shared goals to improve the sustainable dissemination of scholarly research.

The Open Access Funding Partnership is a three year agreement to support gold OA with a CC licence for new monographs. The library subsidises a maximum of three titles per annum with a $10,000 subvention per title, to a maximum of $30k per year. They have targeted titles of core contemporary social relevance. It is a three year project with the goal of assessing sales and dissemination so they can understand what it will mean for their future programme. Horova shared the results to date, interestingly including actual sales figures.



Where do they go from here? They are one of four university presses in Canada to have embraced OA and intend to remain on the cutting edge. They are assessing the project and consultation process and determining how to further incorporate OA into the business model and strategic direction of the Press, while discussing financial implications with the library.

Jill Taylor-Roe reflected on the ups and downs of relationships between librarians and publishers. How do we respond to change? We are in the midst of the most disruptive period in scholarly communications. The only real certainty for all of us is that more change will come. To survive and thrive you need to change and adapt.

One major change that publishers have to engage with is the involvement of managers within an institution in research publishing decisions. When you come up against financial directors as agents of change, they become a significant influence. This is a different world; they were never involved before. They will ask lots of questions around why there are payments for fees, pages, illustrations and so on. He who pays the piper calls the tune.

It is time to change and recalibrate scholarly communication models. We need to put each of our skill sets together to face this new world. In some instances there will be competition from university presses. It is good that the dialogue has been opened, but she is keen not to polarise the discussion.

Graham Stone spoke about the potential impact of open access repositories and library scholarly publishing on 'traditional' publishing models. He asked if we are not missing the point a little bit. Ultimately, what is more important: the usage on your journal platform, or the actual impact of the research? He would argue that the latter sells content. Repositories that multiply access points help increase readership and impact. Repositories are not all that bad. They may well be helping.
Graham Stone steps up for the debate

Stealing your lunch? No. Gold OA in the repository? We paid for it. Green? We have an agreement to access it. If it is a hybrid version, we're not only giving you lunch, but also letting you have your cake and eat it, as we link to the publisher platform.

Don't waste your money on fancy sites that don't work on mobile. Researchers just require stuff. The PIRUS project from a few years ago produced evidence that repositories drove usage. A more recent initiative is the IRUS-UK and Repositories project. They are adding value by promoting where citations are and building awareness internationally.

There is a lot going on in North America, which is ahead with scholarly publishing in the library. Amherst, the University of Huddersfield and Ubiquity Press are just some examples. They have an eye for growing the author pool, particularly with young researchers who may struggle to get published elsewhere. Academic publishing is a professional industry and it has to adapt to changes in scholarly practice.

Follow the ALPSP International Conference conversation on Twitter via #alpsp14.

Innovation and its place in the changing scholarly publishing landscape

Amy Brand from Digital Science takes questions
Amy Brand opened the ALPSP International Conference 2014 with a keynote reflecting on innovation, how its main function is to advance research, and how vital it is for publishers to participate in the linked information landscape.

There are crucial changes in academic publishing that are directed towards the challenges of new efficiencies in a data publishing environment. It is simply no longer good enough to just read the text. Researchers need to get behind the curtain or under the hood. Who owns what? Institution, funder, publishers? The shifts in landscape are leading to land grabs. There are institutional resources for management alongside publishers' ones. But the good news is that there are many new opportunities for publishers to develop new services.

The act of creating something entirely new is an act of innovation. Brand was inspired by her experience at MIT Press. In the late 90s they experimented with open access monographs and were very successful at it. Another MIT project, CogNet, was one of the first online tools for researchers. It was very exciting to work on and proved to be one of the projects that drew her away from books and on to CrossRef.

One of the principal aims of Digital Science is to work smart in order to discover more. They work with researchers, librarians, academic administrators, funders and publishers, who all want to enhance their own platforms. So they spend a lot of time looking at workflows between all these stakeholders. They have built a portfolio of nine different companies and provide supporting tools for every part of the research life cycle.

Innovation sticks when it addresses a specific need. Pain, not necessity, is the mother of invention. Brand's checklist for publishers comprises six 'pain points'.

Pain point 1: 'I want a smarter way to manage my own record of scholarship.'

Researchers want a one-stop interface that manages all aspects, from activity reports, personal website, CV and grant applications to the institutional repository and lab website. Persistent identifiers are one way to help, and that is what ORCID is trying to achieve (although it has not moved as quickly as they would've liked). ORCID identifiers are not all that sexy, but if everyone used them it would dramatically improve the academic world.
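
One small, concrete example of why persistent identifiers help downstream systems: an ORCID iD carries its own check character (ISO 7064 MOD 11-2), so a workflow can validate an iD before linking records. The sketch below follows ORCID's published checksum algorithm; the example iD is one of ORCID's own documentation examples.

```python
def orcid_checksum(base_digits):
    """Compute the final check character of an ORCID iD (ISO 7064 MOD 11-2)
    from its first 15 digits."""
    total = 0
    for digit in base_digits:
        total = (total + int(digit)) * 2
    result = (12 - total % 11) % 11
    return "X" if result == 10 else str(result)

def is_valid_orcid(orcid):
    """Check a 0000-0000-0000-0000 style identifier's structure and checksum."""
    digits = orcid.replace("-", "")
    if len(digits) != 16:
        return False
    return orcid_checksum(digits[:15]) == digits[15]

# 0000-0002-1825-0097 is one of the example iDs ORCID publishes.
print(is_valid_orcid("0000-0002-1825-0097"))  # True
```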

Pain point 2: 'I need better ways to manage and share my research data and other outputs'

Figshare allows publishers to host large amounts of data and articles with no impact on your infrastructure. Brand believes working in partnership with figshare is a no-brainer for publishers.

Pain point 3: 'I'm finding it impossible to keep up with the relevant literature in my field.'

Getting information from the internet is like trying to drink from a fire hose. Sophisticated filtering and recommendation services should be more closely integrated into publishers' platforms, and this is what ReadCube tries to do: personalised recommendations based on a researcher's library.
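As a rough illustration of the idea, not a description of ReadCube's actual method, library-based recommendation can be sketched as scoring unseen papers by how much their vocabulary overlaps with the papers a researcher already holds; the data and scoring below are entirely hypothetical.

    from collections import Counter

    def tokens(text: str) -> Counter:
        """Very crude bag-of-words representation of a title or abstract."""
        return Counter(text.lower().split())

    def overlap_score(candidate: str, library: list) -> int:
        """Count how many of the candidate's words also appear in the library."""
        lib_words = Counter()
        for doc in library:
            lib_words.update(tokens(doc))
        cand = tokens(candidate)
        return sum(min(cand[w], lib_words[w]) for w in cand)

    library = ["open access monograph usage statistics",
               "repository usage and citation metrics"]
    candidates = ["citation metrics for open access journals",
                  "protein folding in yeast"]

    # Rank unseen papers by similarity to what the researcher already reads.
    print(sorted(candidates, key=lambda c: overlap_score(c, library), reverse=True))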

Pain point 4: 'I want to become more efficient at finding collaborators and funding opportunities.'

There are various services and systems that can help publishers identify candidate peer reviewers. One example, Uber Research, brings in a wider pool of reviewers from a database; candidates can be filtered by expertise and by conflicts of interest. This is a real example of linked data, and it presents an opportunity for publishers to improve how they engage with the peer review process and to minimise reviewer fatigue.
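A minimal sketch of what such filtering could look like follows; the data model and rules are hypothetical and are not a description of Uber Research's product. Candidates are kept only if their expertise matches the manuscript and they have no obvious conflict, such as sharing an affiliation or a recent co-authorship with one of the authors.

    from dataclasses import dataclass, field

    @dataclass
    class Reviewer:
        name: str
        expertise: set
        affiliation: str
        coauthors: set = field(default_factory=set)

    def eligible(reviewer, topics, author_names, author_affiliations):
        """Keep reviewers with relevant expertise and no obvious conflict of interest."""
        if not reviewer.expertise & topics:
            return False                      # no relevant expertise
        if reviewer.affiliation in author_affiliations:
            return False                      # same institution as an author
        if reviewer.coauthors & author_names:
            return False                      # recent co-author of an author
        return True

    pool = [Reviewer("A. Jones", {"bibliometrics", "open access"}, "Univ. X"),
            Reviewer("B. Lee", {"genomics"}, "Univ. Y")]
    shortlist = [r.name for r in pool
                 if eligible(r, {"open access"}, {"C. Author"}, {"Univ. Z"})]
    print(shortlist)  # ['A. Jones']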

Pain point 5: 'Paywalls keep me from accessing needed resources and from disseminating my work as widely as possible.'

How can publishers be new partners for innovative access models? What about differentiated access? As the conference is at Heathrow, think of how airlines break down costs for carry-on baggage, early check-in and so on. With ReadCube, instead of facing paywalls, an institution can offer patron-driven but paid access to articles through the library. Brand urged publishers to sell more granular pieces of information on the book side as well.

Pain point 6: 'Academic incentives and evaluation norms exert too much control over my research and publication choices.'

The traditional paradigms of where you publish (high Impact Factor journals, prestige monograph publishers) can constrain the direction of your research. The Journal of Statistical Software is a great example: it provides reproducible code and software tools for readers. Altmetric helps researchers read around a subject, and authors can check engagement downstream. It can be a tremendous force for change as readers start their own small revolutions in what to read.

A more radical approach is to turn the idea of authorship on its head. The relationship between authorship, invention and credit is broken. The Harvard-Wellcome draft taxonomy is helping to drive credit for discovery, which in turn can have a huge impact on a person's life and career. (Brand flagged the open invitation to join the CASRAI-NISO contributor role taxonomy review circle.)

Brand finished with a series of questions she feels publishers should be asking themselves to innovate effectively.
  1. Are you using and contributing linked data?
  2. Have you implemented ORCID IDs in your workflow?
  3. How are you increasing engagement with your content?
  4. Do your journals support data sharing?
  5. Are you displaying article-level metrics?
  6. Do you offer differentiated access options for your users?
  7. How do you currently capture contribution, and how can we collectively improve the tracking of research credit?

Amy Brand is Vice President of Academic & Research Relations and North America for Digital Science.  

Monday 8 September 2014

Amy Brand talks innovation: not just introducing something new, but also a well-articulated sense of purpose

We are delighted that Amy Brand, Vice President of Academic & Research Relations and North America for Digital Science, is our keynote speaker at the ALPSP International Conference on Wednesday. Amy took some time out of her schedule to tell us a bit more about her work and what she thinks innovation is really about.

Tell us about yourself and Digital Science.

"I feel extremely fortunate to be in the midst of a long and varied career immersed in many different facets of scientific and scholarly communications: as an MIT-trained researcher in linguistics and cognitive science, executive editor at The MIT Press, Director of Business and Product Development at CrossRef, manager of the Office for Scholarly Communication and then Assistant Provost for Faculty Appointments and Information at Harvard, founding member of ORCID’s board of directors, and now as VP at Digital Science, where I manage U.S. operations and cultivate institutional partnerships.

For those of you who don’t know us yet, Digital Science invests in and incubates academic start-ups that provide research information software – software that accelerates scientific and scholarly research, both by facilitating aspects of the research cycle directly and by facilitating the management of the research process.

My ALPSP keynote next week will focus on where the innovation and change in scholarly communications is coming from, where I see it going, and how smaller scholarly and professional publishers can participate and benefit."

What does innovation mean to you?

"Innovation simply means making or introducing something new. In our world that tends to translate as creating new technologies to stay competitive. But innovation has also come to mean a way of working - towards a well-articulated sense of purpose, within a work environment that embraces experimentation and risk taking. I believe that innovation in publishing can make research better, and more productive researchers means more new knowledge with which to address the big problems in our world. We are inventing the future of scholarly communication to meet the evolving needs of scholars - in particular, to make research more efficient with tools that facilitate discovery, accessibility, attribution and reproducibility."

What do you think are the key drivers of innovation in scholarly communications at the moment?

"At a high level, I see our community innovating to create new efficiencies within today's complex linked information environment. But when you drill down, you can identify a number of specific researcher pain points that are driving the invention of new tools and models to address frustrating inefficiencies."

How is that impacting on the traditional industry?

"What it means to be a scholarly publisher and stay competitive has forever changed. Content may still rule, but what we mean by content and the scholarly conversation has expanded significantly. As a consumer of scholarly information, it is no longer enough to simply read the text. I expect to be able to look behind the curtain at data, code, other media, and - downstream - how other people are reacting to the work in real-time. There are tremendous opportunities for publishers that can grow accordingly, and extend their own services into other aspects of the scholarly communication ecosystem."

How  does Digital Science ‘do’ innovation?

"We have a clear vision and a well-defined approach to innovation. We aim to provide innovative tools that support every stage of the research life cycle, and we do so by investing in best-in-class solutions. Most of the start-up companies in our portfolio were conceived and founded by academics innovating to address a major challenge in their own workflows, whether during the funding process, in the lab, managing data, or in the writing and publication process itself."

And finally, what do you hope the delegates will get out of your talk at the conference?

"I hope the audience goes away with a renewed sense of understanding that when we innovate in publishing, we do so to advance research itself, and that the way to stay on the cutting edge today is to participate fully in the linked information landscape. Ultimately, whether you're a publisher, a librarian, a researcher or a funder, we’re all in the scholarly communication enterprise together, working towards the creation of new knowledge."

The ALPSP International Conference is on Wednesday 10 - Friday 12 September at the Park Inn Heathrow, London. Follow the conversation on Twitter via #alpsp14 or read highlights from the sessions here on the ALPSP blog.

Wednesday 3 September 2014

ALPSP Awards spotlight on... JournalGuide, a free online tool that helps researchers find the best journal for their paper

Keith Collier, VP of Business Development
With only a few days left before the ALPSP Awards for Innovation in Publishing are announced at the conference, this is the penultimate post in our series profiling the finalists.

Keith Collier, Vice President of Business Development at Research Square, answers a few questions about JournalGuide.

1. Tell us a bit about your company

KC: Research Square is focused on helping researchers to share their discoveries. In addition to JournalGuide, our family of brands also includes American Journal Experts (AJE) and Rubriq. The company started with AJE, which has provided editing, translation, figure and formatting services since 2004. We are based in Durham, NC, and our team includes a combination of PhD researchers, software developers, and publishing industry veterans.

2. What is the project that you submitted for the Awards?

KC: JournalGuide, a free online tool that helps researchers find the best journal for their paper. As an independent third-party, JournalGuide combines data from publishers, industry indices, and the authors themselves to create a rich, unbiased source of information. By bringing all of these sources together in a standardized format, we can provide a trusted place for authors and journals to connect.

3. Tell us more about how it works and the team behind it.

KC: At the core of JournalGuide is data: a complex combination of information from multiple existing databases, information sent directly from publishers and journals, and information entered by individuals (journal editors and the authors themselves). On top of our data is an advanced search algorithm that takes the title and abstract of a paper and finds the most relevant journal matches.  In addition to the search algorithm, we also provide a number of features that allow authors to sort and filter those results. Researchers can easily compare the journal data critical to their decision, including speed to publication, Open Access policies, and citation metrics.
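The workflow Collier describes, a relevance ranking over the title and abstract followed by author-controlled filters, can be sketched roughly as below. This is purely illustrative: JournalGuide's real algorithm and data are not described in detail here, so the field names, journal records and scoring are hypothetical.

    import math
    from collections import Counter

    def bag(text: str) -> Counter:
        return Counter(text.lower().split())

    def cosine(a: Counter, b: Counter) -> float:
        """Cosine similarity between two bag-of-words vectors."""
        num = sum(a[w] * b[w] for w in set(a) & set(b))
        den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
        return num / den if den else 0.0

    # Hypothetical journal records: scope text plus the kind of fields authors filter on.
    journals = [
        {"name": "Journal of Usage Studies",
         "scope": "usage statistics repositories citation metrics",
         "open_access": True, "days_to_decision": 40},
        {"name": "Annals of Yeast Biology",
         "scope": "protein folding yeast genomics",
         "open_access": False, "days_to_decision": 90},
    ]

    def rank(title_and_abstract: str, journals: list, open_access_only: bool = False) -> list:
        """Rank journals by text relevance, optionally filtering on author criteria."""
        query = bag(title_and_abstract)
        candidates = [j for j in journals if j["open_access"] or not open_access_only]
        return sorted(candidates, key=lambda j: cosine(query, bag(j["scope"])), reverse=True)

    best = rank("Citation metrics and repository usage statistics", journals, open_access_only=True)
    print([j["name"] for j in best])  # ['Journal of Usage Studies']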

The concept was originally developed within our independent peer review service, Rubriq. As part of the service, we provide customized journal recommendations for each manuscript. We created an internal tool to help our team become more efficient. As the tool evolved, we saw an opportunity to share it with other researchers. Most of the development of JournalGuide came from a partnership of our software developer team and the published PhD researchers who run our Rubriq service.


Sample search results

4. Why do you think it demonstrates publishing innovation?

KC: JournalGuide is not the first or only web tool for finding journals. However, the elements that make it significant and innovative come from its scope and independence.  By not limiting the scope to any particular area of study, publisher, or index, we can make a more valuable tool, standardize data sets for new uses, and support critical cross-discipline connections. By not limiting participation to any single player, we can provide an objective resource that can engage all parts of the scholarly publishing community.

The combination of these two elements puts us in a unique position to create and support new data standards that can be truly industry-wide. By providing a central hub to aggregate, standardize, and provide access to all available data about journals, JournalGuide can enable future innovations and new data insights. It also allows us to tackle broad-ranging issues, such as defining and verifying legitimate vs. predatory journals.  This new “Verified” designation will be going live around the same time as the ALPSP conference.


5. What are your plans for the future?

KC: We are continuing to build on our core concept of a platform for centralized, authoritative data. In addition to expanding our data sources, we plan to add metrics of our own, such as the “Verified” journal designation. Our approach to this new metric was the source of some internal debate (see our “Whitelist vs. Blacklist” blog post), but we feel confident in the new system launching in September. This new status designation in our database will help authors avoid predatory journals.

Another key project for us is the development of consistent and standardized open access terminology across areas of study. Accessibility mandates make Open Access options, Article Publication Charges, embargo periods, licensing, and restrictions on which version(s) of a paper can or must be archived increasingly important factors when selecting a journal. We are working with funders, universities, and libraries to understand this new environment and to help aggregate, standardize, and categorize data so that authors can use it most effectively to ensure compliance.

Keith Collier is Vice President of Business Development at Research Square, the home of JournalGuide, Rubriq and AJE.

The winners will be announced at the ALPSP International Conference Wednesday 10 - Friday 12 September, Park Inn Heathrow, London. Follow the conversation via #alpsp14 and #alpspawards on Twitter.