Wednesday, 23 August 2017

Spotlight on Delta Think - shortlisted for the 2017 ALPSP Awards for Innovation in Publishing



In the second of our series of interviews with our ALPSP Awards finalists, we talk to Ann Michael, President of Delta Think.

Tell us a bit about your company

Delta Think is a business and technology consulting firm focused on innovation and growth for scholarly publishers and membership organizations. Founded at the height of the transition from print to digital, our core purpose is to help organizations manage change. We live at the intersection of content, technology and user experience, and we help our clients navigate the many changes affecting the industry.

What is the project that you submitted for the Awards?

The Delta Think Open Access Data & Analytics Tool (OA DAT) is a living compilation of industry data, anonymized private data and analysis, which provides a comprehensive view of the OA market.

The idea originated with questions arising during consulting engagements. We found that clients asked many of the same questions about OA, yet there was no consolidated, reliable data source to address them. We saw an opportunity to serve the industry more effectively by curating such a data set and supplementing it with analysis, commentary, and visualizations.


Tell us more about how it works and the team behind it

The OA DAT includes several levels of analysis and data access to accommodate any organization’s available bandwidth, comfort level with data analytics, and budget. We’ve worked hard to build a consistent view of the market and to make existing silos of data interoperate to deliver new insights, supplementing them with publisher confidential data (de-identified and aggregated).

The tool includes interactive visualizations so users can extract meaning for real-world decision-making, while providing a quick way to tailor the tool to areas of interest with a few mouse clicks. There is a sample interactive visualization and a brief video on our website, and a few static images below.

Additionally, we curate and aggregate OA industry news and combine it with a short analysis to publish Delta Think’s OA News & Views, which is available free of charge with registration.

We launched the Delta Think Open Access Data & Analytics Tool in beta to pre-launch subscribers in December 2016, with our full launch in January 2017. The tool is continually updated. To keep the product current and the user base growing, we work with two core data analysts (one with deep database experience), a research associate, a product manager, a project manager, a marketer, and a business development resource.

Why do you think it demonstrates publishing innovation?

We had two key motivations in developing the tool: generally increasing industry data proficiency, and specifically supporting organizations as they make data-driven strategic and ongoing decisions around Open Access. While some organizations have the bandwidth and budget to develop data analytics expertise, most find using data in decision-making aspirational and, at best, episodic. It is not a part of their normal workflow.

On one level, our innovation is pulling disparate information together and continually updating it to provide benchmarks and genuinely novel insights into the Open Access market.

But the innovation goes much deeper: in a single product, we have created the means for any organization to use data in OA decision-making on an ongoing basis.
· Organizations not equipped to interact with the underlying data can read the analysis and benefit from its regular updates. They don't have to wait a year or two for an analyst report to be refreshed.
· Organizations looking to interact with the data can manipulate visualizations, narrowing in on the elements important to them (e.g. geographic region, subspecialty, year).
· Organizations equipped to gather, normalize, and analyze data, and that know the questions they want answered, can export OA DAT data, combine it with proprietary data, and complete their own analyses. They can also use OA DAT to save time and effort in data collection, allowing them to focus more on their proprietary analysis.
· For any of the above, our consulting resources can support targeted research or produce custom reports.
The tool is valuable and accessible to the expert or the novice.

What are your plans for the future?

We currently focus on STM and social science, on OA, and on journals, across both not-for-profit and commercial publishers. We plan to extend product coverage to books, open data, and even overall scholarly output (overlaying analytics on top of the usual indexes). We also plan to extend to institutional markets (both librarians and research departments) and to other industry players interested in Open Access. Additionally, we are in confidential discussions with a number of potential data and technology partners.

Our bigger-picture goal is to create extensions to the Delta Think Open Access Data & Analytics Tool and to develop other products and services that help scholarly publishers, membership organizations, and the multitude of players in the scholarly communications ecosystem build data and business analytics into their cultures and workflows.

Ann Michael is President of Delta Think, a business and technology consulting and advisory firm focused on innovation and growth in membership organizations and scholarly publishers. Ann is a Past President of the Society for Scholarly Publishing (SSP), an NFAIS board member, a Board Director at Joule (a Canadian Medical Association company), a frequent organizer and speaker at industry conferences, and a contributor to the SSP's blog, The Scholarly Kitchen.

Twitter: @deltathink @annmichael
LinkedIn: https://www.linkedin.com/in/annmichael
LinkedIn: https://www.linkedin.com/company/delta-think
Facebook: facebook.com/deltathink


See the ALPSP Awards for Innovation in Publishing Finalists lightning sessions at our Annual Conference on 13-15 September, where the winners will be announced. 

The ALPSP Awards for Innovation in Publishing 2017 are sponsored by MPS Ltd.
 

Monday, 21 August 2017

Navigating the safe passage through the minefield of predatory publishing

Philip J Purnell and Mohamad Mostafa, Knowledge E

Like many of the world's scholars, young researchers at Al-Nahrain University in Iraq have been told they need to publish research articles in academic journals in order to progress in their careers. Aware of the low acceptance rates and lengthy publication delays of traditional journals, many turn to relatively recently launched open access journals that market their quick turnaround and low publishing charges. Dr. Haider Sabah Kadhim, head of Microbiology at the university, says these young researchers are being duped into paying USD 100-200 on the promise of fast-track publication within one week. He has seen countless academics proudly show their published papers, only to be told the papers won't count because the journal isn't on an approved list. Dr. Haider expressed his worry about the long-term harm this publishing practice could cause to their careers.

The same pressure is felt by the research community across the region. Egyptian Assistant Prof. Hossam El Sayed Donya teaches medical physics at the Faculty of Science, King Abdulaziz University in Saudi Arabia. He sees similar challenges and says that some publishers boast of being indexed in top databases such as Web of Science, Scopus, and MEDLINE, of having Journal Impact Factors, and of assigning digital object identifiers (DOIs) to all published articles. Often these promises turn into disappointment when the researcher realises that their article cannot be found in the databases, that the journal does not really have an Impact Factor, or that the DOI has not been deposited in the Crossref database. Prof. Hossam said there is a need for information and guidance, in both English and Arabic, that teaches early-career researchers the signs of a good journal or publisher and those to avoid.

The dilemma

Modern scholars are coming under increasing pressure to demonstrate their academic productivity: their output is measured by the number of research papers they have published, and their impact by the citations those papers receive. Indeed, universities and promotion committees often set targets and thresholds for academic progression based on publications and citations. Universities do this because they too operate in an increasingly competitive space and are themselves responding to pressure to perform well in international university rankings such as Shanghai, Times Higher Education and QS, which count publications and citations to the articles published by the entire university faculty.

One enviable mark of a high-quality journal is being indexed in a renowned database such as the Web of Science, Scopus or MEDLINE. An even more elite group of around 10,000 high-impact journals are given Journal Impact Factors, which are listed each year in the Journal Citation Reports. A journal's JIF for a given year is calculated by dividing the citations its content received that year, counting only citations to items published in the previous two years, by the number of citable items it published in those two years (a worked example follows the list of checks below). Millions of researchers are incentivised to publish in 'Impact Factor journals', and ambitious scholars are easily enticed into sending their manuscripts to journals that prominently display their Impact Factor. The problem here is that many questionable journals state that they are indexed in such databases when they are not, or announce an 'Impact Factor' even when it has not been provided by Clarivate Analytics (formerly ISI and Thomson Reuters), the owner of the Journal Citation Reports and provider of the Journal Impact Factor. Most young academics don't realise that many of these claims can easily be checked online:

Is the journal indexed in Web of Science?

Is the journal indexed in Scopus?

Is the journal indexed in Medline?
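
To make the Impact Factor calculation described above concrete, here is a minimal sketch in Python. The figures are entirely hypothetical and do not describe any real journal; they simply show how the two-year ratio works.

    # Hypothetical 2017 JIF for an imaginary journal.
    # Citations received during 2017 to items the journal published in 2015 and 2016:
    citations_2017_to_2015_items = 700
    citations_2017_to_2016_items = 500
    # Citable items (articles and reviews) the journal published in 2015 and 2016:
    citable_items_2015 = 210
    citable_items_2016 = 190

    jif_2017 = (citations_2017_to_2015_items + citations_2017_to_2016_items) / (
        citable_items_2015 + citable_items_2016
    )
    print(round(jif_2017, 1))  # 1200 citations / 400 citable items = 3.0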

Likewise, once an article has been published in a journal, it is easy to check that the publisher has deposited its digital object identifier (DOI) in the Crossref database. Once the DOI has been correctly deposited, the article officially exists: people all over the world can find it through search engines such as Google Scholar and Microsoft Academic and, even more importantly, they can cite it accurately by pointing fellow academics to the DOI link. Again, a quick check for DOIs is freely available here:

Is this DOI deposited in the Crossref database?
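
For those comfortable with a little scripting, the same check can be automated against Crossref's public REST API (api.crossref.org). The sketch below is one possible approach; it assumes Python with the 'requests' package installed, and the DOI shown is only a placeholder, not a real article.

    import requests

    def doi_in_crossref(doi):
        """Return True if the DOI is registered with Crossref, False otherwise."""
        response = requests.get("https://api.crossref.org/works/" + doi, timeout=10)
        return response.status_code == 200  # 200 = record found; 404 = not deposited

    # Placeholder DOI for illustration only:
    print(doi_in_crossref("10.1000/xyz123"))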

Open Access


The academic community's frustration with rising subscription prices, and the feeling of paying twice when research funded by public money ends up behind a subscription wall, led to the open access movement, under which the reader pays no charge to access the results. However, once an article is accepted, the author is usually asked to pay an article processing (or publishing) charge (APC), which often amounts to hundreds of dollars and can easily run into thousands; in many developing countries this fee is covered by the researchers themselves. Under pressure to publish and with little guidance on journal choice, some academics are falling prey to unscrupulous publishers who charge APCs but do not provide a professional publishing service; these have been termed 'predatory publishers'.

Predatory publishing


Such publishers may exaggerate or misrepresent their services: claiming to be based in a traditional publishing hub while hiding their real location, claiming to provide rigorous peer review while publishing far too quickly for that to be possible, or listing editorial boards of academics who did not know about, or agree to, their inclusion. Defining a publisher or publication as 'predatory', however, is no simple matter, and sometimes there is a fine line between acceptable and unacceptable behaviour: at what point, for example, do repeated calls for papers become 'spam'? Some publishers have found their journals on a predatory-publisher blacklist while at the same time being indexed in one of the prestigious databases assumed to be whitelists. One university librarian, Jeffrey Beall, maintained a list of 'probable, possible and potential predatory publishers' from 2008 (it is no longer available); it most recently listed more than 12,000 titles and publishers, each included for questionable publication practices judged against a list of over 50 criteria.

There is no universally agreed definition of a predatory journal or publisher; indeed, nor is there a standard for a 'high-quality journal'. Most people who offer advice on identifying predatory journals start by warning readers to watch out for spelling mistakes, typos and grammatical errors on a journal's website or in its submission instructions. But equating imperfect English with questionable publication ethics, in regions where millions of non-native English speakers are engaged in education and research, is itself an assumption that should be taken in context. Native-level English should not be a prerequisite for publishing quality research in quality journals; there must be other ways to ensure safe submission of manuscripts and sound evaluation of journals and publishers.

The Committee on Publication Ethics (COPE) celebrates its 20th anniversary this year and now boasts more than 10,000 members. It has produced a code of conduct and a range of guidelines for authors, editors and peer reviewers. Most serious publishers now adhere to the COPE Code of Conduct and guidelines and this is one of the first things authors should check for.

Think, Check, Submit

Researchers need to be routinely trained in how to conduct a rudimentary evaluation of a journal: what to look for, and which tell-tale signs should set alarm bells ringing. So what can the busy researcher do to distinguish good journals from bad?

Several international publishing associations have pooled their resources to launch the Think. Check. Submit. campaign, unveiled during the 2015 meeting of the Association of Learned and Professional Society Publishers (ALPSP). It leads the researcher through three main steps and includes a checklist to work through before submitting a manuscript to any journal. In the Arab region, we believe that following the Think. Check. Submit. campaign will help the regional research community avoid these pitfalls and publish safely. To view the initiative, use the links below:

Think. Check. Submit: http://thinkchecksubmit.org/

Think. Check. Submit (Arabic): http://knowledgee.com/thinkchecksubmit-ar/

This post was first published on the KnE Blog on 28 February 2017.


Spotlight on INASP - shortlisted for the 2017 ALPSP Awards for Innovation in Publishing

In the third of our series of interviews with ALPSP Awards 2017 finalists we talk to Andy Nobes, Programme Officer in the Research Development and Support team at INASP, the people behind AuthorAID.


Tell us a bit about the work of INASP


INASP is an international development organization with the vision of research and knowledge at the heart of development. We work with partners in Africa, Latin America and Asia to support individuals and institutions to produce, share and use research and knowledge, which can transform lives.

Most research is published in the global North but many of the world’s most urgent problems are found in the global South. Early-career researchers in low- and middle-income countries face many challenges in communicating their work. These include: lack of familiarity with the global scholarly publishing landscape; lack of experienced colleagues who can advise them about publishing their work; inexperience of scientific writing; and often inexperience of writing in English. Other common challenges for researchers are: understanding plagiarism; choosing a suitable journal; and knowing the basic structure of a paper. They face all these challenges in addition to inherent biases in the global scholarly system, as well as particular challenges for female researchers and those from minority backgrounds or in fragile or conflict states.

Our AuthorAID project was started a decade ago to address these challenges. We work with partner institutions in Africa and Asia to embed research-writing and proposal-writing training into their curricula. The AuthorAID website also provides a free service for all researchers: a database of free resources, an online discussion list, and a mentoring and collaboration platform. Over the last decade, our training has evolved from face-to-face training to online and blended courses, and most recently to Massive Open Online Courses (MOOCs).


What is the project that you submitted for the Awards?


In 2015 AuthorAID launched a series of free research-writing MOOCs. These have been run twice-yearly on the Moodle open-source platform and have so far attracted nearly 7,500 researchers from over 100 developing countries. We have also run the course in Spanish in partnership with Latindex, training 3,000 researchers in Latin America. These courses are particularly good at supporting harder-to-reach groups, including female academics and those in fragile and conflict states. Such support is essential for ensuring equity in global research.

The course is six weeks long and covers four topics: literature review, research ethics, writing a paper and publishing in a journal. Participants learn via text-based lessons, weekly quizzes, a facilitated discussion forum and peer-assessed writing activities, as well as optional short video content.

 

 

Tell us more about how it works and the team behind it


Course participant Caroline Koech, an environmental chemist in Kenya, carrying out laboratory analysis while taking the online course.
The AuthorAID team is small, and the MOOCs are run by just two core staff: I administer and moderate the course from our Oxford office, and my colleague Ravi Murugesan, based in India, is the lead facilitator and Moodle expert. To manage groups of more than 1,000 course participants we rely on our team of volunteer 'guest facilitators'. These are drawn from our global network of mentors and partner institutions, and we also promote 'star' participants to be facilitators on subsequent courses.

This facilitation model not only makes it possible to cater for large numbers of participants across different time zones, but also helps to pass on skills and reduce reliance on INASP. This is a key part of our sustainable development model.


Why do you think it demonstrates publishing innovation?


The innovation in our project is not so much in the technology – after all, MOOCs are becoming increasingly common in global education. Rather, it is the context, community and impact of the AuthorAID MOOCs that are uniquely innovative. Our courses are built on 10 years’ experience of training early-career researchers in developing countries, and our content is aimed at overcoming many of the practical problems that they face in publishing their research, such as writing skills and understanding the publishing process. The lessons are pitched at a basic, introductory level, presented in simple English.

The material is also designed for low-bandwidth environments, as many of our audience have problems with access to reliable internet. The lessons are interactive and text-based rather than high-bandwidth video lectures, which many participants find difficult to stream. The strength of the course lies in the interactive content and social interaction: the discussion forums, energised by our team of guest facilitators, are particularly active, and we hear that many researchers make friends and find research collaborators on the course.

The completion rate for the courses is around 50%, significantly higher than the average for MOOCs. Our courses include a high percentage (45%) of women researchers, and we also reach participants in fragile and conflict-affected states such as Afghanistan, Somalia, Iraq and Yemen.

Follow-up surveys have found that at least 34% of course participants have published papers after the course. Feedback has also shown increased confidence to publish and increased awareness of important ethical issues such as plagiarism and 'predatory' journals. We hear again and again from researchers about the difference that participating in an AuthorAID research-writing MOOC has made to them, their academic careers, their colleagues, and their ability to share their research findings clearly with a wider audience.


Global map showing the geographical spread of the course and the concentration in developing countries.

What are your plans for the future?


We want to improve both the scope and the delivery of our MOOCs. For example, we are translating the course into French and developing a social science version. We are also creating additional content that is important to researchers, for example training in communicating research to the public, practitioners and policymakers. It is becoming increasingly important to make online courses mobile-friendly, particularly in developing countries, and we are further developing content to be fully mobile-optimised, including downloadable lessons and exercises that can be completed offline.

In the longer term, INASP is looking at opportunities for further funding for this work, to ensure that more early-career researchers in the developing world can continue to receive the training and support they need so that they, and their research, can contribute to global research conversations.

Our ultimate goal is to make this training truly sustainable, not only by growing our network of online facilitators but also by working with some of our partner institutions around the world to run online courses at their own institutions. This training has already started with partners in Sri Lanka, Tanzania and Vietnam. We have also helped grant winners to trial their own mini-MOOC version of the course.

Andy Nobes
Andy Nobes is Programme Officer in the Research Development and Support team at INASP. His role involves managing the AuthorAID website and developing its online mentoring scheme and community forums.
Before joining INASP, he worked for an academic publisher in journal e-marketing and library marketing.



Twitter: @INASPinfo and @authoraid
Blog: blog.inasp.info
Facebook: inasp.info
YouTube: INASP

See the ALPSP Awards for Innovation in Publishing Finalists lightning sessions at our Annual Conference on 13-15 September, where the winners will be announced.