Elsevier journals — some facts

Author: Timothy Gowers
Blog post, April 24, 2014

Comment: This long blog post discusses the author’s attempts, successful in many cases, to obtain the costs of Elsevier journal subscriptions at the UK’s Russell Group universities. It includes some amusing detailed correspondence with Jisc and the universities, along with related discussion of APCs and their impact on subscription costs, and of Elsevier costs at some US universities and in Brazil. The post and its comments also contain some useful data sources and related analysis.

Introduction: A little over two years ago, the Cost of Knowledge boycott of Elsevier journals began. Initially, it seemed to be highly successful, with the number of signatories rapidly reaching 10,000 and including some very high-profile researchers, and Elsevier making a number of concessions, such as dropping support for the Research Works Act and making papers over four years old from several mathematics journals freely available online. It has also contributed to an increased awareness of the issues related to high journal prices and the locking up of articles behind paywalls….

I have come to the conclusion that if it is not possible to bring about a rapid change to the current system, then the next best thing to do, which has the advantage of being a lot easier, is to obtain as much information as possible about it. Part of the problem with trying to explain what is wrong with the system is that there are many highly relevant factual questions to which we do not yet have reliable answers.

Source: Elsevier journals — some facts

Can Microsoft Academic help to assess the citation impact of academic books?

Authors: Kousha K & Thelwall M

Comment: This article compares the coverage and citations of Microsoft Academic (MA) with those of the Book Citation Index (BKCI) and Google Books (GB). It showed that, while MA’s coverage of books is still not comprehensive, it is able to find more citations in some fields than the other two sources. In particular, it has greater coverage than BKCI for some Arts & Humanities fields (though in general it is still biased towards technical fields). MA also seems less sensitive to book editions. MA’s comparison with GB gave mixed results, with each outperforming the other in different fields, suggesting partly complementary coverage.

Abstract: Despite recent evidence that Microsoft Academic is an extensive source of citation counts for journal articles, it is not known if the same is true for academic books. This paper fills this gap by comparing citations to 16,463 books from 2013-2016 in the Book Citation Index (BKCI) against automatically extracted citations from Microsoft Academic and Google Books in 17 fields. About 60% of the BKCI books had records in Microsoft Academic, varying by year and field. Citation counts from Microsoft Academic were 1.5 to 3.6 times higher than from BKCI in nine subject areas across all years for books indexed by both. Microsoft Academic found more citations than BKCI because it indexes more scholarly publications and combines citations to different editions and chapters. In contrast, BKCI only found more citations than Microsoft Academic for books in three fields from 2013-2014. Microsoft Academic also found more citations than Google Books in six fields for all years. Thus, Microsoft Academic may be a useful source for the impact assessment of books when comprehensive coverage is not essential.

Kousha K, Thelwall M (2018) Can Microsoft Academic help to assess the citation impact of academic books? arXiv.org: arXiv:1808.01474v1.

Source: Can Microsoft Academic help to assess the citation impact of academic books?

Citation analysis with microsoft academic

Authors: Hug SE, Ochsner M & Brändle MP

Comment: This article compares citation analyses performed with Microsoft Academic (MA) and Scopus, using the publication output of three selected researchers. The results were uniform across MA and Scopus. Some limitations of MA were also pointed out.

Abstract: We explore if and how Microsoft Academic (MA) could be used for bibliometric analyses. First, we examine the Academic Knowledge API (AK API), an interface to access MA data, and compare it to Google Scholar (GS). Second, we perform a comparative citation analysis of researchers by normalizing data from MA and Scopus. We find that MA offers structured and rich metadata, which facilitates data retrieval, handling and processing. In addition, the AK API allows retrieving frequency distributions of citations. We consider these features to be a major advantage of MA over GS. However, we identify four main limitations regarding the available metadata. First, MA does not provide the document type of a publication. Second, the “fields of study” are dynamic, too specific and field hierarchies are incoherent. Third, some publications are assigned to incorrect years. Fourth, the metadata of some publications did not include all authors. Nevertheless, we show that an average-based indicator (i.e. the journal normalized citation score; JNCS) as well as a distribution-based indicator (i.e. percentile rank classes; PR classes) can be calculated with relative ease using MA. Hence, normalization of citation counts is feasible with MA. The citation analyses in MA and Scopus yield uniform results. The JNCS and the PR classes are similar in both databases, and, as a consequence, the evaluation of the researchers’ publication impact is congruent in MA and Scopus. Given the fast development in the last year, we postulate that MA has the potential to be used for full-fledged bibliometric analyses.

Hug, S.E., Ochsner, M. & Brändle, M.P. (2017) Citation analysis with microsoft academic. Scientometrics 111: 371. https://doi.org/10.1007/s11192-017-2247-8

Source: Citation analysis with microsoft academic

Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources

Authors: Tsay M-y, Wu T-l & Tseng L-l

Comment: This article compares several open access systems (Google Scholar (GS), Microsoft Academic (MSA), OAIster, OpenDOAR, arXiv.org and the Astrophysics Data System (ADS)) using the publications of Nobel Laureates in Physics from 2001 to 2013. A short literature review on comparing search engines is given. Both internal and external overlaps are studied. At the time of this work, GS had the highest coverage of this sample, but also a very high percentage of internal overlap (>92%). It covered all items found in the other sources except MSA. ADS and MSA both had coverage just below GS, with ADS having the lowest internal overlap of the three (only slightly higher than arXiv.org, which had no internal overlap).

Abstract: This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.

Tsay M-y, Wu T-l, Tseng L-l (2017) Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources. PLoS ONE 12(12): e0189751. https://doi.org/10.1371/journal.pone.0189751

Source: Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources

Microsoft Academic is one year old: the Phoenix is ready to leave the nest

Authors: Harzing, AW. & Alakangas, S.

Comment: This is the third in a series of articles by the first author investigating the relative citation and publication coverage of Microsoft Academic (MA) within its first year after (re-)launch. Although the studies were relatively small in scale (the citation records of 1 and 145 academics, respectively), they provided strong evidence for the advantages of MA over other databases. In particular, it combines high coverage, like Google Scholar, with structured metadata, like Scopus and Web of Science. These features, together with its fast growth, make MA an excellent alternative for bibliometric and scientometric studies.

Abstract: We investigate the coverage of Microsoft Academic (MA) just over a year after its re-launch. First, we provide a detailed comparison for the first author’s record across the four major data sources: Google Scholar (GS), MA, Scopus and Web of Science (WoS) and show that for the most important academic publications, journal articles and books, GS and MA display very similar publication and citation coverage, leaving both Scopus and WoS far behind, especially in terms of citation counts. A second, large scale, comparison for 145 academics across the five main disciplinary areas confirms that citation coverage for GS and MA is quite similar for four of the five disciplines. MA citation coverage in the Humanities is still substantially lower than GS coverage, reflecting MA’s lower coverage of non-journal publications. However, we shouldn’t forget that MA coverage for the Humanities still dwarfs coverage for this discipline in Scopus and WoS. It would be desirable for other researchers to verify our findings with different samples before drawing a definitive conclusion about MA coverage. However, based on our current findings we suggest that, only one year after its re-launch, MA is rapidly becoming the data source of choice; it appears to be combining the comprehensive coverage across disciplines, displayed by GS, with the more structured approach to data presentation, typical of Scopus and WoS. The Phoenix seems to be ready to leave the nest, all set to start its life into an adulthood of research evaluation.

Harzing, AW. & Alakangas, S. (2017) Microsoft Academic is one year old: the Phoenix is ready to leave the nest. Scientometrics 112: 1887. https://doi.org/10.1007/s11192-017-2454-3

Source: Microsoft Academic is one year old: the Phoenix is ready to leave the nest

Open Access: An Evaluation of its Impact, Obstacles, and Advancements

Author: Rachel A. Miles

Comment: A detailed article providing reviews of OA and impact metrics, and discussion of the misconceptions and misunderstandings around them. A review of OA mandates and policies is also provided. Other interesting discussions include those on altmetrics, the Eigenfactor, SNIP and JOI. An extensive list of potentially useful references is given.

Abstract: Access to research results is imperative in today’s robust digital age, yet access is often prevented by publisher paywalls. Open Access (OA) is the simple idea that all research should be free for all to access, use, and build upon. This paper will focus on three critical areas of the OA landscape: its impact on scholarship and the public, the obstacles to be overcome, and its advancements. The impact of OA actions and initiatives has been difficult to quantify, but the growing number of studies on OA have shown mostly overwhelmingly positive results. Cultural norms within academia, such as the reliance on the journal Impact Factor (IF) to assess the quality of individual research articles, have impeded the progress of OA. Conversely, federal mandates and institutional policies have supported the OA movement by requiring that scholarly publications be deposited into institutional or subject repositories immediately following publication. As information professionals, library and information science (LIS) professionals have a responsibility as practitioners, authors, and editors to support OA and encourage other academics to do the same.

Cite as: Miles, Rachel A. (2016) Open Access: An Evaluation of its Impact, Obstacles, and Advancements. Bibliotekar 58(1-2).

Source: Open Access: An Evaluation of its Impact, Obstacles, and Advancements

Growth of hybrid open access, 2009–2016

Author: Bo-Christer Björk

Notes: This 2017 article estimates the growth, from 2009 to 2016, in hybrid OA journals and in the articles published within them, across 20 publishers. Most interesting is the difficulty experienced in obtaining data, because the hybridity of a journal is not always indicated. The author used previous studies plus more recent data from 15 publishers who agreed to share it, together with data from 5 large publishers. However, the data are not itemised for each publisher.

Abstract: Hybrid Open Access is an intermediate form of OA, where authors pay scholarly publishers to make articles freely accessible within journals, in which reading the content otherwise requires a subscription or pay-per-view. Major scholarly publishers have in recent years started providing the hybrid option for the vast majority of their journals. Since the uptake usually has been low per journal and scattered over thousands of journals, it has been very difficult to obtain an overview of how common hybrid articles are. This study, using the results of earlier studies as well as a variety of methods, measures the evolution of hybrid OA over time. The number of journals offering the hybrid option has increased from around 2,000 in 2009 to almost 10,000 in 2016. The number of individual articles has in the same period grown from an estimated 8,000 in 2009 to 45,000 in 2016. The growth in article numbers has clearly increased since 2014, after some major research funders in Europe started to introduce new centralized payment schemes for the article processing charges (APCs).

https://peerj.com/articles/3878/

Evaluation of Openness in the Activities of Research Organisations and Research Funding Organisations in 2016

Author: Finland’s Ministry of Education and Culture, Open Science and Research Initiative

Notes: An interesting scoring of Finnish research organisations’ progress towards openness, using data retrieved from public websites. A follow-up survey gave organisations the opportunity to correct and supplement the data. The results are also compared with those of other European research organisations. Because of the rapidly changing landscape, the assessment was not repeated in 2017.

Abstract: This evaluation of the openness of Finnish research performing and funding organisations was completed as part of the Open Science and Research Initiative (ATT) by the Ministry of Education and Culture. The target of this evaluation is to assess the openness of operational cultures in research organisations and research funding organisations. The key objectives, against which the assessments are made, are defined in the Open Science and Research Roadmap. More information about the evaluation can be found at openscience.fi/openculture

http://www.doria.fi/handle/10024/127273

Enhancing Institutional Publication Data Using Emergent Open Science Services

Authors: David Walters and Christopher Daley (Brunel University, London)

Notes: An interesting article looking at integrating data sources to assess the OA status and location of OA copies for a single UK university. It focusses on data derived from CORE and Unpaywall, and their combination with other information from university systems.

Abstract: The UK open access (OA) policy landscape simultaneously preferences Gold publishing models (Finch Report, RCUK, COAF) and Green OA through repository usage (HEFCE), creating the possibility of confusion and duplication of effort for academics and support staff. Alongside these policy developments, there has been an increase in open science services that aim to provide global data on OA. These services often exist separately to locally managed institutional systems for recording OA engagement and policy compliance. The aim of this study is to enhance Brunel University London’s local publication data using software which retrieves and processes information from the global open science services of Sherpa REF, CORE, and Unpaywall. The study draws on two classification schemes; a ‘best location’ hierarchy, which enables us to measure publishing trends and whether open access dissemination has taken place, and a relational ‘all locations’ dataset to examine whether individual publications appear across multiple OA dissemination models. Sherpa REF data is also used to indicate possible OA locations from serial policies. Our results find that there is an average of 4.767 permissible open access options available to the authors in our sample each time they publish and that Gold OA publications are replicated, on average, in 3 separate locations. A total of 40% of OA works in the sample are available in both Gold and Green locations. The study considers whether this tendency for duplication is a result of localised manual workflows which are necessarily focused on institutional compliance to meet the Research Excellence Framework 2021 requirements, and suggests that greater interoperability between OA systems and services would facilitate a more efficient transformation to open scholarship.

Source: Enhancing Institutional Publication Data Using Emergent Open Science Services (Publications)

Over 80% of research outputs meet requirements of REF 2021 open access policy – Research England

Author: Research England (née HEFCE)

Notes: An important national survey of progress towards open access in the context of a strong policy and compliance requirement. Interesting both for the claims it makes about levels of OA and for the language and nature of the process by which it is being achieved. Lots of important detail on how metadata is and is not being collected and processed.

Abstract: Sixty-one per cent of research outputs known to be in scope for the REF 2021 are meeting open access deposit, discovery and access requirements, with a further twenty per cent reporting a known exception, a report published today shows. The report details the findings of a survey by the former Higher Education Funding Council for England (HEFCE), the Wellcome Trust, the former Research Councils UK (RCUK) and Jisc. The survey sought to assess how the sector is delivering funders’ open access (OA) policies and to understand some of the challenges the sector faces. The four project partners were also interested in understanding the methods and tools being used across the sector to ensure policy compliance.

Source: Over 80% of research outputs meet requirements of REF 2021 open access policy – Research England