Thursday 27 September 2012

David Sommer: COUNTER - New Measures for Scholarly Publishing

David Sommer is a consultant working with a range of publishing clients to grow their products and businesses. He has also been a contributor to the COUNTER project, and he rounded off the morning session at the To Measure or Not to Measure - Driving Usage seminar.

He provided an overview of the latest COUNTER Release 4. The main objectives for the update were to provide a single, unified Code of Practice covering all e-resources, including journals, databases, books, reference works and multimedia content, and to improve the database reports and the reporting of archive usage. The update will enable mobile usage to be reported separately, expand the categories of 'Access Denied' covered, improve the application of XML and SUSHI in the design of the usage reports, and collect metadata that facilitates the linking of usage statistics to other datasets such as subscription information.

The main features of the update are:
  • a single, integrated Code of Practice covering all e-resources
  • an expanded list of definitions, including Gold Open Access, Multimedia Full Content Unit, Record View, etc.
  • an improved Database Report that covers Result Clicks and Record Views in addition to Searches (Sessions have been removed)
  • enhancement of the SUSHI (Standardized Usage Statistics Harvesting Initiative) protocol, designed to facilitate its implementation by vendors and its use by librarians (a sketch of a SUSHI request appears below)
  • a requirement that Institutional Identifiers, Journal DOI and Book DOI be included in the usage reports, to facilitate not only the management of usage data, but also the linking of usage data to other data relevant to collections of online content
  • a requirement that usage of Gold OA articles within journals be reported separately in a new report, Journal Report 1 GOA
  • a requirement that Journal Report 5 be provided (an archive report broken down by year of publication, so you can relate the usage of a journal's archive to what you are paying for it)
  • modified Database Reports, in which the previous requirement to report Session counts has been dropped and new requirements to report Record Views and Result Clicks have been added (Database Report 3 has also been renamed Platform Report 1)
  • a new, optional Multimedia Report 1, which covers the usage of non-textual multimedia resources (audio, video and images), reporting the number of successful requests for full multimedia content units
  • new optional reports covering usage on mobile devices
  • description of the relative advantages of logfiles and page tags as the basis for counting online usage
  • flexibility in the usage reporting period that allows customers to specify a date range for their usage reports
Sommer posed an interesting question: what is a mobile device? COUNTER has used the WURFL list to define this. The timetable for implementation includes a compliance deadline of 31st December.
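
As a rough illustration of how the enhanced SUSHI protocol is used in practice, the sketch below builds a minimal SOAP ReportRequest for a Release 4 JR1 report and posts it to a vendor's SUSHI endpoint. The endpoint URL, requestor and customer identifiers, SOAPAction value and date range are all placeholders, and the element names and namespaces should be checked against the current NISO SUSHI (Z39.93) schema and the vendor's WSDL rather than taken from this sketch.

```python
# Minimal sketch of a SUSHI ReportRequest for a COUNTER Release 4 JR1 report.
# Endpoint, requestor and customer values are placeholders; element names,
# namespaces and the SOAPAction should be verified against the NISO SUSHI
# schema and the vendor's WSDL.
import urllib.request

SUSHI_ENDPOINT = "https://sushi.example-vendor.com/SushiService"  # hypothetical

SOAP_BODY = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:sus="http://www.niso.org/schemas/sushi"
               xmlns:cou="http://www.niso.org/schemas/sushi/counter">
  <soap:Body>
    <cou:ReportRequest>
      <sus:Requestor>
        <sus:ID>requestor-id</sus:ID>
        <sus:Name>Example Library</sus:Name>
        <sus:Email>stats@example.ac.uk</sus:Email>
      </sus:Requestor>
      <sus:CustomerReference>
        <sus:ID>customer-id</sus:ID>
      </sus:CustomerReference>
      <sus:ReportDefinition Name="JR1" Release="4">
        <sus:Filters>
          <sus:UsageDateRange>
            <sus:Begin>2012-01-01</sus:Begin>
            <sus:End>2012-06-30</sus:End>
          </sus:UsageDateRange>
        </sus:Filters>
      </sus:ReportDefinition>
    </cou:ReportRequest>
  </soap:Body>
</soap:Envelope>
"""

request = urllib.request.Request(
    SUSHI_ENDPOINT,
    data=SOAP_BODY.encode("utf-8"),
    headers={
        "Content-Type": "text/xml; charset=utf-8",
        "SOAPAction": "SushiService:GetReportIn",  # check the vendor's WSDL
    },
)
with urllib.request.urlopen(request) as response:
    # The response is a SUSHI ReportResponse carrying the COUNTER report as
    # XML, which can then be parsed and loaded into a usage statistics system.
    print(response.read().decode("utf-8"))
```

The flexible date range in the UsageDateRange filter is what lets customers request usage for an arbitrary reporting period rather than a fixed set of monthly files.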

Sommer then provided a useful background to Usage Factor (UF). It is designed to be a complement to citation-based measures. While Impact Factors, based on citation data, have become generally accepted as a valid measure of the impact and status of scholarly journals, and are widely used by publishers, authors, funding agencies and librarians, there are misgivings about an over-reliance on them. The idea is not to try to kill them off, but to provide other measures to use alongside them, particularly as Impact Factors don't work so well for non-STM disciplines.

Usage Factor provides a new perspective: a complementary measure that will compensate for the weaknesses of Impact Factors in several important ways:
  • UFs will be available for a much larger number of journals
  • coverage of all fields of scholarship that have online journals
  • impact of practitioner-oriented journals better reflected in usage
  • authors welcoming it as a way to build their profiles.
Four major groups will benefit: authors (especially in practitioner-based fields) who lack reliable global measures; publishers; librarians; and research funding agencies seeking a wider range of credible, consistent quantitative measures of the value and impact of the research output they fund.

The aims and objectives of the project have been to assess whether UF will be statistically meaningful, will be accepted, and is robust and credible, and to identify what the organisational and economic model will be. The project started in 2007-2008 with market research, including 29 face-to-face interviews across the interest groups as well as web survey responses from 155 librarians and 1,400 authors.

Stage two focused on modelling and analysis and involved relevant bodies, publishers and journals. The recommendations included:
  • UF should be calculated using the median rather than the arithmetic mean (a rough sketch of such a calculation follows below)
  • a range of UFs should ideally be published for each journal: a comprehensive UF plus supplementary factors for selected items
  • UF should be published as integers - no decimal places
  • UF should be published with appropriate confidence levels around the average to guide their interpretation
  • UF should be calculated initially on the basis of a maximum usage time window of 24 months
  • UF is not directly comparable across subject groups and should therefore be published and interpreted only within appropriate subject groupings
  • UF should be calculated using a publication window of two years.
There seems to be no reason why ranked lists of journals by Usage Factor should not gain acceptance. However, small journals and titles with fewer than 100 downloads per item are unsuitable candidates for UF, as the resulting values are likely to be unreliable.
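
To make the median recommendation concrete, here is a minimal sketch of the kind of calculation envisaged: per-item usage counts, gathered over the usage time window for items published within the two-year publication window, are reduced to a median and reported as an integer. The download counts are invented purely for illustration; the actual methodology will be defined by the Code of Practice.

```python
# Minimal sketch of a median-based Usage Factor, following the project's
# recommendations: median rather than arithmetic mean, published as an integer.
# The download counts below are invented purely for illustration.
from statistics import median

# Downloads per item for articles published within the two-year publication
# window, counted over the (maximum 24-month) usage time window.
downloads_per_item = [310, 120, 95, 480, 150, 210, 75, 330, 160, 140]

usage_factor = round(median(downloads_per_item))  # no decimal places
print(f"Usage Factor (median-based): {usage_factor}")
```

Using the median rather than the mean keeps the figure from being dominated by a handful of very heavily downloaded items, which also makes it harder to inflate with a few extreme values.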

Stage three involves testing. The usage data will be used to investigate the following:
  • the effect on UF of using the online publication date versus the date of first successful request
  • calculation and testing of UF for subject fields not yet covered
  • test further gaming scenarios and assess how these can be detected
  • test the stability of UF for low-UF journals and confirm the level below which a UF should not be provided.
This stage will deliver a Code of Practice, which will include definitions, the methodology for calculation, and specifications for reporting and independent auditing, as well as a description of the role of the Central Registry for UF and the funding model.

David closed with a summary of PIRUS2, whose mission is to develop a global standard to enable the recording, reporting and consolidation of online usage statistics for individual journal articles hosted by Institutional Repositories. Further information is available online.
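
As a purely hypothetical illustration of the consolidation idea behind PIRUS2, the sketch below merges article-level download counts reported by different hosts (a publisher platform and an institutional repository), keyed on the article DOI. The sources, DOIs and counts are invented; the actual data formats and exchange mechanisms are defined by the PIRUS2 project itself.

```python
# Hypothetical sketch of consolidating article-level usage from several hosts,
# keyed on DOI, in the spirit of PIRUS2. Sources, DOIs and counts are invented.
from collections import Counter

# Successful full-text requests per article DOI, as reported by each host.
publisher_platform = {
    "10.1000/j.example.2012.01.001": 220,
    "10.1000/j.example.2012.01.002": 95,
}
institutional_repository = {
    "10.1000/j.example.2012.01.001": 40,
    "10.1000/j.example.2012.01.003": 12,
}

consolidated = Counter()
for source in (publisher_platform, institutional_repository):
    consolidated.update(source)  # sums counts for DOIs seen in both sources

for doi, total in sorted(consolidated.items()):
    print(f"{doi}: {total} total requests")
```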

3 comments:

  1. Thanks ALPSP for a great seminar. More details on COUNTER Release 4, Usage Factor and Pirus available at www.projectcounter.org
