
Monday, 17 August 2015

ALPSP Awards Spotlight on… Bookmetrix from Altmetric and Springer

In the fifth of the ALPSP Awards for Innovation in Publishing finalists' posts, Martijn Roelandse, Springer's Manager for Publishing Innovation, and Euan Adie, Founder of Altmetric and fellow co-developer of Bookmetrix, talk about how they worked to launch the product.

Tell us a bit about both companies and how this collaboration came about.

As a large publisher of STM content, with a current catalogue of over 194,000 books, Springer is always keen to offer further services for our authors and editors, in particular services that give insight into the attention, use and impact of their titles.

Altmetric are a data science company based in London. Founded in 2011, Altmetric made it their mission to track and analyse the online activity around scholarly literature, and today supply data via their distinctive ‘donut’ badges and platforms to many of the world’s leading publishers, funders and institutions.

The idea for this project was originally conceived at Springer, who wanted to find new ways to offer added value and additional feedback to the authors and readers of their extensive book content.

Springer also wanted to offer their editorial and marketing teams an easier way of tracking the reach and impact of their publishing portfolio, and were keen to provide deeper insight than download and citation counts alone could offer.

Having first established a relationship with Altmetric in 2011 with the adoption of the Altmetric API for all of their journal articles, Springer were familiar with the Altmetric team and felt there would be a shared approach and understanding of what the project was trying to achieve.
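For context on that integration: Altmetric exposes a public v1 API that returns attention data for a single article by DOI. The sketch below shows the general shape of such a lookup in Python; the exact response fields used here are assumptions drawn from the public documentation and worth verifying against the current docs.

```python
# A minimal sketch of querying Altmetric's public v1 API for one article.
# Field names in the response are assumptions; check the current API docs.
import requests

def fetch_altmetric_summary(doi: str) -> dict | None:
    """Return a small attention summary for a DOI, or None if untracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:
        return None  # Altmetric has no recorded mentions for this DOI
    resp.raise_for_status()
    data = resp.json()
    return {
        "title": data.get("title"),
        "score": data.get("score"),                     # headline 'donut' score
        "tweets": data.get("cited_by_tweeters_count"),  # unique tweeting accounts
        "news": data.get("cited_by_msm_count"),         # mainstream media stories
    }

if __name__ == "__main__":
    print(fetch_altmetric_summary("10.1038/nature12373"))
```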


What is the project that you submitted for the Awards?

We submitted Bookmetrix - the first platform of its kind to help authors, editors, publishers and readers track the broader impacts of a book or chapter once it's published. The project encompassed two parts: public-facing details pages, which are now accessible via the metrics displayed on every SpringerLink book page, and the Bookmetrix search interface, a database which Springer staff can use to browse and filter metrics across their book portfolio.


Tell us more about how it works and the team behind it.

Bookmetrix was built as a partnership between Altmetric and Springer - regular meetings between the two groups (comprising project leaders, product managers and developers) ensured that we agreed on goals and concept early on. The aim was to offer authors and readers a totally new way to see and understand the impact of their work, and to help set a new standard for monitoring and reporting the activity surrounding a book post-publication. To achieve this, we worked to pull in mentions and other online activity relating to each book or chapter from a variety of sources, including downloads, citations, book reviews, public policy documents, mainstream media coverage and social media shares.

The data was then surfaced via the details pages - where users can see a summary of the mentions of the whole book, and dig down to view the mentions for each chapter and the original comments from each source. The details pages can be accessed via the SpringerLink platform and via the search interface (by Springer staff).
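To make that whole-book versus per-chapter drill-down concrete, here is a deliberately simplified sketch of the kind of rollup described above: chapter-level counts summed into a book-level summary. The data model and numbers are hypothetical, not Bookmetrix's actual schema.

```python
# Illustrative only: a toy rollup of per-chapter metrics into a book-level
# summary, mirroring the drill-down the details pages provide.
from collections import Counter

# Hypothetical chapter-level counts for one book.
chapters = {
    "Chapter 1": {"citations": 12, "downloads": 340, "tweets": 5},
    "Chapter 2": {"citations": 4,  "downloads": 120, "tweets": 18},
    "Chapter 3": {"citations": 9,  "downloads": 210, "tweets": 2},
}

def book_summary(chapters: dict[str, dict[str, int]]) -> Counter:
    """Sum each metric across chapters to get the whole-book view."""
    total = Counter()
    for metrics in chapters.values():
        total.update(metrics)
    return total

print(book_summary(chapters))
# Counter({'downloads': 670, 'citations': 25, 'tweets': 25})
```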

Why do you think it demonstrates publishing innovation?

Bookmetrix is the first platform of its kind to bring together such a valuable mixture of traditional and non-traditional indicators of broader impact and influence for books and individual chapters. Such measures are increasingly important for authors who are asked by funders or institutional management to demonstrate the influence of their work, and are particularly valuable for those who do not choose to publish journal articles (which often bring the most credit) as their main form of research output.

As well as offering this additional insight, Bookmetrix has demonstrated the value that can be found in publishers combining their objectives with the technical and domain expertise of an external partner.

What are your plans for the future?

The scope of Bookmetrix is wider than existing initiatives in the market: it covers substantially more books and goes beyond pure citation data. Bookmetrix fits with Springer's ambition to drive more industry-wide initiatives to support the work of authors and researchers.


The winner of the ALPSP Awards for Innovation in Publishing, sponsored by Publishing Technology, will be announced at the ALPSP Conference. Book now to secure your place.

Thursday, 11 September 2014

Metrics and more

Publication metrics are part of a much bigger picture. Where resources are restricted and competition is high, metrics become ever more essential in helping to prioritise and direct funding. The 'Metrics and more' 2014 conference session was chaired by Melinda Kenneway, Director at TBI Communications and Co-founder of Kudos. The panel comprised Mike Taylor, Research Specialist at Elsevier Labs; Euan Adie, Founder of Altmetric; and Graham Woodward, Associate Marketing Director at Wiley. Kenneway opened by observing that as an industry we need to consider new types of metrics.

Publication performance metrics include:
  • Anti-impact factor: DORA
  • Rise of article level metrics
  • Introduction of altmetrics
  • New units of publishing: data/images/blogs
  • Pre-publication scoring (Peerage of Science etc)
  • Post-publication scoring (assessment, ranking etc)
  • Tools for institutional assessment

Researcher performance metrics include:

  • Publication output
  • Publication impact
  • Reputation/influence scoring systems
  • Funding
  • Other income (e.g. patents)
  • Affiliations (institutional reputation)
  • Esteem factors
  • Membership of societies/editorial boards etc
  • Conference activity
  • Awards and prizes

Institutional performance metrics include:

  • University ranking systems
  • Publication impact metrics
  • STAR/Snowball metrics
  • Research leaders and career progression
  • Patents, technologies, products, devices
  • Uptake of research

Graham Woodward, Associate Marketing Director at Wiley, provided an overview of a trial of altmetrics on a selection of six titles. On one article, after a few days of altmetrics being on the site, they saw the following results: c. 10,000 click-throughs; an average time on page of over three minutes; over 3,500 tweets; an estimated 5,000 news stories; 200 blog posts; and 32 F1000 recommendations.

They asked for user feedback, and the 50 responses provided a small but useful snapshot with which to assess the trial's effectiveness.

Were the article metrics supplied on the paper useful? 91% said yes. What were the top three most useful metrics? Traditional news outlets, number of readers and blog posts. 77% of respondents felt the experience enhanced the journal.

Half of respondents said they were more likely to submit a paper to the journal. 87% used the metrics to gauge the overall popularity of the article, 77% to discover and network with researchers interested in the same area of work, and 66% to understand the significance of the paper within its scientific discipline.

What happened next? The completion of the six-journal trial was followed by an extension to all open access journals, and metrics have now been rolled out across the entire journal portfolio.

Euan Adie from Altmetric reflected on the pressures and motivations on researchers. While there is a lot of pressure within labs for young researchers, funders and institutions are increasingly looking for, or considering, other types of impact, research output and contribution. There is an evaluation gap between funder requirements and measuring impact. That's where altmetrics come in: they take a broader view of impact to help give credit where it is due. HEFCE are doing a review of metrics within institutions at the moment.

Seven things they've learnt in the past year or so.

  1. Altmetrics means so many things to so many people. But the name doesn't necessarily work: it is complementary rather than alternative, and it is about the data, not just the measure.
  2. It's worked really well for discovering where a paper is being talked about in places they wouldn't otherwise have known about, and also the demographics behind that attention.
  3. Altmetrics are only a weak indicator of citations, but the whole point is to look beyond them. Different types of sources correlate to different extents (a toy per-source check of this kind is sketched after this list).
  4. Don't take all altmetrics indicators as one lump, there are many different flavours of research impact.
  5. When you have an indicator and you tie it to incentives, it immediately corrupts the indicator. While he doesn't believe there is massive gaming of altmetrics, there is an element of this with some people. It's human nature.
  6. The top 5% of altmetric scores are not what you expect. The most popular paper is a psychological analysis of the characters in Winnie the Pooh.
  7. Peer review is a scary place. Scientists and researchers can be pretty nasty! Comments can be used in a different (more negative) way than expected. But that is not necessarily a bad thing.
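On point 3, the kind of per-source correlation check Adie alludes to can be illustrated in a few lines. The counts below are invented purely for illustration; the point is the method (a rank correlation against citations, computed separately for each source type), not the numbers.

```python
# Toy illustration (invented numbers): how strongly does each altmetric
# source correlate with citation counts? Spearman rank correlation is used
# because mention counts are heavily skewed.
from scipy.stats import spearmanr

citations = [2, 15, 7, 40, 3, 22, 11, 5]  # hypothetical citation counts

sources = {
    "tweets":       [30, 10, 55, 12, 8, 20, 90, 4],
    "news_stories": [0, 3, 1, 9, 0, 4, 2, 1],
    "blog_posts":   [1, 4, 2, 8, 0, 5, 3, 1],
}

for name, counts in sources.items():
    rho, p = spearmanr(counts, citations)
    print(f"{name:>12}: rho={rho:+.2f} (p={p:.2f})")
```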
Mike Taylor believes we are approaching a revolution rather than an evolution. What we have at the moment is a collision of different worlds, because both the value of and interest in metrics are increasing. What makes for great metrics, and how do we talk about them? Do we want the one-size-fits-all approach? We have data and metrics, and in between those two things there is theory, formulae, statistics and analysis. Within that gap there are a lot of political issues.

Taylor reflected on the economies of attention (or not) and how you assess whether people are engaged. With an audience, when hands go up, you know they are paying attention, but no hands doesn't mean they aren't. Metrics so far are specialist, complex, based on 50 years of research, mostly bibliometrics/citation based, and much is proprietary. The implications of the changing nature of metrics are: as metrics are taken more seriously by institutions, their value will increase; as the value increases, we need to be more aware of them; and as a scholarly community we need to increase awareness of them. Awareness implies critical engagement, mathematics, language, relevance, openness, agreement, gold standards, and community leadership.

We are experiencing a collision of worlds. Terms like 'H-index' are hard to understand, but are well defined. Terms like 'social impact' sound as if they're well defined, but aren't. A particular problem is that the 'community' is rather diverse: there are multiple stakeholders (funders, academics, publishers, start-ups, governments, quangos), international perspectives and varying cultures (from fifty years of research to a start-up).

Taylor suggested an example metric: 'internationalism'. Measures could include: how well an academic's work is used internationally; how internationally that academic works, through readership data; citation analysis (cited and citing); co-authorship; funding data (e.g. FundRef); conference invitations (e.g. via ORCID); guest professorships; and text analysis of content.
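As a thought experiment only, a few of the signals Taylor lists could be blended into a single indicator along these lines. Every input, weight and scaling choice below is invented for illustration; nothing here is a proposed standard.

```python
# Hypothetical sketch: blending international signals into one 0-100 score.
# All weights and caps are invented for illustration.
def internationalism_score(
    intl_readership_share: float,  # fraction of readers outside home country
    intl_citation_share: float,    # fraction of citing papers from abroad
    intl_coauthor_share: float,    # fraction of co-authors based abroad
    conference_invites: int,       # international conference invitations
) -> float:
    """Weighted blend of international signals, scaled to 0-100."""
    invites = min(conference_invites, 10) / 10  # cap so raw counts can't dominate
    score = (0.35 * intl_readership_share
             + 0.30 * intl_citation_share
             + 0.20 * intl_coauthor_share
             + 0.15 * invites)
    return round(100 * score, 1)

print(internationalism_score(0.6, 0.7, 0.4, 5))  # -> 57.5
```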

Taylor doesn't think metrics is an area where publishers will have the same kind of impact that they might have had 30 years ago. He said to expect to see more mixed metrics, combining qualitative and quantitative work. Taylor concluded that metrics are being taken more seriously (they are being used in funding decisions), and that many stakeholders and communities are converging.

Big data + cloud computing + APIs + openness = explosive growth in metrics. 

It is a burgeoning research field in its early days. Publishers need to be part of the conversation. We need to enable community leadership and facilitate decision making.