Elsevier journals — some facts

Author: Timothy Gowers
Blog post, April 24, 2014

Comment: This long blog post discusses the author’s attempts, successful in many cases, to obtain the costs of Elsevier journal subscriptions at the UK Russell Group universities. It includes some amusing detailed correspondence with JISC and the universities, along with related discussion of APCs and their impact on subscription costs, and of Elsevier costs at some US universities and in Brazil. The post and its comments also contain some useful data sources and related analysis.

Introduction: A little over two years ago, the Cost of Knowledge boycott of Elsevier journals began. Initially, it seemed to be highly successful, with the number of signatories rapidly reaching 10,000 and including some very high-profile researchers, and Elsevier making a number of concessions, such as dropping support for the Research Works Act and making papers over four years old from several mathematics journals freely available online. It has also contributed to an increased awareness of the issues related to high journal prices and the locking up of articles behind paywalls….

I have come to the conclusion that if it is not possible to bring about a rapid change to the current system, then the next best thing to do, which has the advantage of being a lot easier, is to obtain as much information as possible about it. Part of the problem with trying to explain what is wrong with the system is that there are many highly relevant factual questions to which we do not yet have reliable answers.

Source: Elsevier journals — some facts

Can Microsoft Academic help to assess the citation impact of academic books?

Authors: Kousha K & Thelwall M

Comment: This article compares the coverage and citations of Microsoft Academic (MA) with those of the Book Citation Index (BKCI) and Google Books (GB). It found that, while MA’s coverage of books is still not comprehensive, it finds more citations in some fields than the other two sources. In particular, it has greater coverage than BKCI for some Arts & Humanities fields (though in general it is still biased towards technical fields). MA also seems less sensitive to book editions. MA’s comparison with GB gave mixed results, with each better than the other in different fields, suggesting that they have partly complementary coverage.

Abstract: Despite recent evidence that Microsoft Academic is an extensive source of citation counts for journal articles, it is not known if the same is true for academic books. This paper fills this gap by comparing citations to 16,463 books from 2013-2016 in the Book Citation Index (BKCI) against automatically extracted citations from Microsoft Academic and Google Books in 17 fields. About 60% of the BKCI books had records in Microsoft Academic, varying by year and field. Citation counts from Microsoft Academic were 1.5 to 3.6 times higher than from BKCI in nine subject areas across all years for books indexed by both. Microsoft Academic found more citations than BKCI because it indexes more scholarly publications and combines citations to different editions and chapters. In contrast, BKCI only found more citations than Microsoft Academic for books in three fields from 2013-2014. Microsoft Academic also found more citations than Google Books in six fields for all years. Thus, Microsoft Academic may be a useful source for the impact assessment of books when comprehensive coverage is not essential.

Kousha K, Thelwall M (2018) Can Microsoft Academic help to assess the citation impact of academic books? arXiv.org: arXiv:1808.01474v1.

Source: Can Microsoft Academic help to assess the citation impact of academic books?

Citation analysis with Microsoft Academic

Authors: Hug SE, Ochsner M & Brändle MP

Comment: This article compares citation analyses in Microsoft Academic (MA) and Scopus, using the output of three selected researchers. The results were uniform across the two databases. Some limitations of MA are also pointed out.

Abstract: We explore if and how Microsoft Academic (MA) could be used for bibliometric analyses. First, we examine the Academic Knowledge API (AK API), an interface to access MA data, and compare it to Google Scholar (GS). Second, we perform a comparative citation analysis of researchers by normalizing data from MA and Scopus. We find that MA offers structured and rich metadata, which facilitates data retrieval, handling and processing. In addition, the AK API allows retrieving frequency distributions of citations. We consider these features to be a major advantage of MA over GS. However, we identify four main limitations regarding the available metadata. First, MA does not provide the document type of a publication. Second, the “fields of study” are dynamic, too specific and field hierarchies are incoherent. Third, some publications are assigned to incorrect years. Fourth, the metadata of some publications did not include all authors. Nevertheless, we show that an average-based indicator (i.e. the journal normalized citation score; JNCS) as well as a distribution-based indicator (i.e. percentile rank classes; PR classes) can be calculated with relative ease using MA. Hence, normalization of citation counts is feasible with MA. The citation analyses in MA and Scopus yield uniform results. The JNCS and the PR classes are similar in both databases, and, as a consequence, the evaluation of the researchers’ publication impact is congruent in MA and Scopus. Given the fast development in the last year, we postulate that MA has the potential to be used for full-fledged bibliometric analyses.
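
As a rough illustration of the average-based indicator mentioned in the abstract (a minimal sketch, not the authors’ implementation; the function name and inputs are hypothetical), the journal normalized citation score can be computed as the ratio of an article’s citation count to the mean citation count of the articles in its journal for the same year:

```python
def jncs(article_citations: int, journal_citation_counts: list[int]) -> float:
    """Journal normalized citation score: an article's citations divided by
    the mean citation count of all articles in the same journal and year."""
    journal_mean = sum(journal_citation_counts) / len(journal_citation_counts)
    return article_citations / journal_mean

# An article with 12 citations in a journal whose articles average 6
# citations gets a JNCS of 2.0, i.e. twice the journal average.
```

A JNCS above 1.0 indicates above-average impact relative to the journal; this kind of normalization is what makes citation counts comparable across fields with different citation densities.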

Hug, S.E., Ochsner, M. & Brändle, M.P. (2017) Citation analysis with Microsoft Academic. Scientometrics 111: 371. https://doi.org/10.1007/s11192-017-2247-8

Source: Citation analysis with Microsoft Academic

Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources

Authors: Tsay M-y, Wu T-l & Tseng L-l

Comment: This article compares several open access search engines and repositories (Google Scholar (GS), Microsoft Academic (MSA), OAIster, OpenDOAR, arXiv.org and the Astrophysics Data System (ADS)) using publications of the Nobel Laureates in Physics from 2001 to 2013. A short literature review on comparing search engines is given, and both internal and external overlaps are studied. At the time of this work, GS had the highest coverage of this sample, but also a very high percentage of internal overlap (>92%); it covered all items found in the other sources except MSA. ADS and MSA both had coverage just below GS, with ADS having the lowest internal overlap of the three (just slightly higher than arXiv.org, which had no internal overlap).

Abstract: This study examines the completeness and overlap of coverage in physics of six open access scholarly communication systems, including two search engines (Google Scholar and Microsoft Academic), two aggregate institutional repositories (OAIster and OpenDOAR), and two physics-related open sources (arXiv.org and Astrophysics Data System). The 2001–2013 Nobel Laureates in Physics served as the sample. Bibliographic records of their publications were retrieved and downloaded from each system, and a computer program was developed to perform the analytical tasks of sorting, comparison, elimination, aggregation and statistical calculations. Quantitative analyses and cross-referencing were performed to determine the completeness and overlap of the system coverage of the six open access systems. The results may enable scholars to select an appropriate open access system as an efficient scholarly communication channel, and academic institutions may build institutional repositories or independently create citation index systems in the future. Suggestions on indicators and tools for academic assessment are presented based on the comprehensiveness assessment of each system.
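
The sorting, comparison and elimination steps described in the abstract can be sketched with simple set operations. This is a minimal illustration under the assumption that each bibliographic record has already been reduced to a comparable key (e.g. a normalized title); it is not the authors’ actual program:

```python
def internal_overlap(records: list[str]) -> float:
    """Fraction of retrieved records that duplicate an earlier record
    within the same source (internal overlap)."""
    seen: set[str] = set()
    duplicates = 0
    for record in records:
        if record in seen:
            duplicates += 1
        else:
            seen.add(record)
    return duplicates / len(records)

def external_overlap(source_a: list[str], source_b: list[str]) -> float:
    """Fraction of source A's unique records that also appear in
    source B (external overlap of A with B)."""
    unique_a, unique_b = set(source_a), set(source_b)
    return len(unique_a & unique_b) / len(unique_a)
```

Note that external overlap is asymmetric: a small source may be fully covered by a large one while covering only a fraction of it in return, which is exactly the pattern the study reports for GS versus the specialist sources.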

Tsay M-y, Wu T-l, Tseng L-l (2017) Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources. PLoS ONE 12(12): e0189751. https://doi.org/10.1371/journal.pone.0189751

Source: Completeness and overlap in open access systems: Search engines, aggregate institutional repositories and physics-related open sources

Microsoft Academic is one year old: the Phoenix is ready to leave the nest

Authors: Harzing, A.W. & Alakangas, S.

Comment: This is the third in a series of articles by the first author investigating the relative citation and publication coverage of Microsoft Academic (MA) within its first year after (re-)launch. Although the studies were relatively small in scale (the citation records of 1 and 145 academics), they provide strong evidence for the advantages of MA over other databases. In particular, it combines high coverage, like Google Scholar, with structured metadata, as in Scopus and Web of Science. These qualities, together with its fast growth, make MA an excellent alternative for bibliometric and scientometric studies.

Abstract: We investigate the coverage of Microsoft Academic (MA) just over a year after its re-launch. First, we provide a detailed comparison for the first author’s record across the four major data sources: Google Scholar (GS), MA, Scopus and Web of Science (WoS) and show that for the most important academic publications, journal articles and books, GS and MA display very similar publication and citation coverage, leaving both Scopus and WoS far behind, especially in terms of citation counts. A second, large scale, comparison for 145 academics across the five main disciplinary areas confirms that citation coverage for GS and MA is quite similar for four of the five disciplines. MA citation coverage in the Humanities is still substantially lower than GS coverage, reflecting MA’s lower coverage of non-journal publications. However, we shouldn’t forget that MA coverage for the Humanities still dwarfs coverage for this discipline in Scopus and WoS. It would be desirable for other researchers to verify our findings with different samples before drawing a definitive conclusion about MA coverage. However, based on our current findings we suggest that, only one year after its re-launch, MA is rapidly becoming the data source of choice; it appears to be combining the comprehensive coverage across disciplines, displayed by GS, with the more structured approach to data presentation, typical of Scopus and WoS. The Phoenix seems to be ready to leave the nest, all set to start its life into an adulthood of research evaluation.

Harzing, AW. & Alakangas, S. (2017) Microsoft Academic is one year old: the Phoenix is ready to leave the nest. Scientometrics 112: 1887. https://doi.org/10.1007/s11192-017-2454-3

Source: Microsoft Academic is one year old: the Phoenix is ready to leave the nest

Open Educational Resources and Rhetorical Paradox in the Neoliberal Univers(ity)

Comment: A critique of the aims and achievements of OER in the context of debates on openness, inclusion and equity. The author argues that OER is still part of, and replicates, neoliberal education systems and is not yet disruptive. Disruption may become possible if OER can be developed at local levels, involving students, educators, and those who do not currently have an opportunity to produce knowledge for educational purposes.


Author: Nora Almeida, New York City College of Technology, CUNY

Keywords: Open Educational Resources, Social Justice, Neoliberalism, Pedagogy, Information Access, Digital Education

Abstract

As a phenomenon and a quandary, openness has provoked conversations about inequities within higher education systems, particularly in regards to information access, social inclusion, and pedagogical practice. But whether or not open education can address these inequities, and to what effect, depends on what we mean by “open” and specifically, whether openness reflexively acknowledges the fraught political, economic, and ethical dimensions of higher education and of knowledge production processes. This essay explores the ideological and rhetorical underpinnings of the open educational resource (OER) movement in the context of the neoliberal university. This essay also addresses the conflation of value and values in higher education – particularly how OER production processes and scholarship labor are valued. Lastly, this essay explores whether OER initiatives provide an opportunity to reimagine pedagogical practices, to reconsider authority paradigms, and potentially, to dismantle and redress exclusionary educational practices in and outside of the classroom. Through a critique of neoliberalism as critically limiting, an exploration of autonomy, and a refutation of the precept that OER can magically solve social inequalities in higher education, the author ultimately advocates for a reconsideration of OER in context and argues that educators should prioritize conversations about what openness means within their local educational communities.


Source: Open Educational Resources and Rhetorical Paradox in the Neoliberal Univers(ity) | Journal of Critical Library and Information Studies

Open Access: An Evaluation of its Impact, Obstacles, and Advancements

Author: Rachel A. Miles

Comments: A detailed article reviewing OA and impact metrics, and discussing common misconceptions and misunderstandings about them. A review of OA mandates and policies is also provided. Other interesting discussions cover altmetrics, the Eigenfactor, SNIP and JOI. An extensive list of potentially useful references is given.

Abstract: Access to research results is imperative in today’s robust digital age, yet access is often prevented by publisher paywalls. Open Access (OA) is the simple idea that all research should be free for all to access, use, and build upon. This paper will focus on three critical areas of the OA landscape: its impact on scholarship and the public, the obstacles to be overcome, and its advancements. The impact of OA actions and initiatives has been difficult to quantify, but the growing number of studies on OA have shown mostly overwhelmingly positive results. Cultural norms within academia, such as the reliance on the journal Impact Factor (IF) to assess the quality of individual research articles, have impeded the progress of OA. Conversely, federal mandates and institutional policies have supported the OA movement by requiring that scholarly publications be deposited into institutional or subject repositories immediately following publication. As information professionals, library and information science (LIS) professionals have a responsibility as practitioners, authors, and editors to support OA and encourage other academics to do the same.

Cite as: Miles, Rachel. (2016). Open Access: An Evaluation of its Impact, Obstacles, and Advancements. Bibliotekar 58(1-2).

Source: Open Access: An Evaluation of its Impact, Obstacles, and Advancements

A new methodology for comparing Google Scholar and Scopus

Authors: Henk F. Moed, Judit Bar-Ilan & Gali Halevi

Comments: This article is a small-sample case study comparing metadata from Google Scholar and Scopus. Although the study covers only 36 articles in 12 journals (and the resulting ~7,000 citations), it proposes some interesting methodologies. In particular, the methods for dealing with match-merging, duplicate citations and indexing speed may be of interest.

Abstract: A new methodology is proposed for comparing Google Scholar (GS) with other citation indexes. It focuses on the coverage and citation impact of sources, indexing speed, and data quality, including the effect of duplicate citation counts. The method compares GS with Elsevier’s Scopus, and is applied to a limited set of articles published in 12 journals from six subject fields, so that its findings cannot be generalized to all journals or fields. The study is exploratory, and hypothesis generating rather than hypothesis-testing. It confirms findings on source coverage and citation impact obtained in earlier studies. The ratio of GS over Scopus citation varies across subject fields between 1.0 and 4.0, while Open Access journals in the sample show higher ratios than their non-OA counterparts. The linear correlation between GS and Scopus citation counts at the article level is high: Pearson’s R is in the range of 0.8–0.9. A median Scopus indexing delay of two months compared to GS is largely though not exclusively due to missing cited references in articles in press in Scopus. The effect of double citation counts in GS due to multiple citations with identical or substantially similar meta-data occurs in less than 2% of cases. Pros and cons of article-based and what is termed as concept-based citation indexes are discussed.
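
The duplicate-citation effect mentioned in the abstract (multiple citing records with identical or substantially similar metadata) can be handled by match-merging on a normalized key. The normalization rule below is a hypothetical sketch for illustration, not the method used in the paper:

```python
import re

def citation_key(title: str, year: int) -> tuple[str, int]:
    """Collapse a citing record to a normalized key: lowercased title with
    punctuation removed and whitespace collapsed, plus publication year."""
    normalized = re.sub(r"[^a-z0-9 ]", "", title.lower())
    normalized = re.sub(r"\s+", " ", normalized).strip()
    return normalized, year

def dedupe_citations(records: list[tuple[str, int]]) -> list[tuple[str, int]]:
    """Keep one citing record per normalized (title, year) key, so that
    near-identical records are counted only once."""
    seen: set[tuple[str, int]] = set()
    unique = []
    for title, year in records:
        key = citation_key(title, year)
        if key not in seen:
            seen.add(key)
            unique.append((title, year))
    return unique
```

Records such as “A Study of X.” (2015) and “a study of x” (2015) collapse to the same key and are counted once, which is the kind of double count the paper found to affect fewer than 2% of GS citations.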

Cite as: Moed HF, Bar-Ilan J & Halevi G (2016) A new methodology for comparing Google Scholar and Scopus. Journal of Informetrics 10(2): 533-551.

Source: A new methodology for comparing Google Scholar and Scopus

On impact factors and university rankings: from birth to boycott

Authors: Konstantinos I. Stergiou & Stephan Lessenich

Comments: This is a short article giving a quick literature review and a summary of criticisms of impact factors and university rankings.

Abstract: In this essay we explore parallels in the birth, evolution and final ‘banning’ of journal impact factors (IFs) and university rankings (URs). IFs and what has become popularized as global URs (GURs) were born in 1975 and 2003, respectively, and the obsession with both ‘tools’ has gone global. They have become important instruments for a diverse range of academic and higher education issues (IFs: e.g. for hiring and promoting faculty, giving and denying faculty tenure, distributing research funding, or administering institutional evaluations; URs: e.g. for reforming university/department curricula, faculty recruitment, promotion and wages, funding, student admissions and tuition fees). As a result, both IFs and GURs are being heavily advertised—IFs in publishers’ webpages and GURs in the media as soon as they are released. However, both IFs and GURs have been heavily criticized by the scientific community in recent years. As a result, IFs (which, while originally intended to evaluate journals, were later misapplied in the evaluation of scientific performance) were recently ‘banned’ by different academic stakeholders for use in ‘evaluations’ of individual scientists, individual articles, hiring/promotion and funding proposals. Similarly, URs and GURs have also led to many boycotts throughout the world, probably the most recent being the boycott of the German ‘Centrum fuer Hochschulentwicklung’ (CHE) rankings by German sociologists. Maybe (and hopefully), the recent banning of IFs and URs/GURs are the first steps in a process of academic self-reflection leading to the insight that higher education must urgently take control of its own metrics.

Cite As: Stergiou KI & Lessenich S (2014) On impact factors and university rankings: from birth to boycott. Ethics Sci Environ Polit 13:101-111.

Source: On impact factors and university rankings: from birth to boycott

The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles

Authors: Piwowar H, Priem J, Larivière V, Alperin JP, Matthias L, Norlander B, Farley A, West J & Haustein S

Notes: A study from the Unpaywall founders identifying the OA status of three samples of articles. They estimate that at least 28% of the scholarly literature is OA (as of 2015) and growing, but also identify the complexities of the OA scholarly publishing landscape, finding a large proportion of articles in the category they call Bronze: free to read, but lacking reuse rights or license data.

Abstract

Despite growing interest in Open Access (OA) to scholarly literature, there is an unmet need for large-scale, up-to-date, and reproducible studies assessing the prevalence and characteristics of OA. We address this need using oaDOI, an open online service that determines OA status for 67 million articles. We use three samples, each of 100,000 articles, to investigate OA in three populations: (1) all journal articles assigned a Crossref DOI, (2) recent journal articles indexed in Web of Science, and (3) articles viewed by users of Unpaywall, an open-source browser extension that lets users find OA articles using oaDOI. We estimate that at least 28% of the scholarly literature is OA (19M in total) and that this proportion is growing, driven particularly by growth in Gold and Hybrid. The most recent year analyzed (2015) also has the highest percentage of OA (45%). Because of this growth, and the fact that readers disproportionately access newer articles, we find that Unpaywall users encounter OA quite frequently: 47% of articles they view are OA. Notably, the most common mechanism for OA is not Gold, Green, or Hybrid OA, but rather an under-discussed category we dub Bronze: articles made free-to-read on the publisher website, without an explicit Open license. We also examine the citation impact of OA articles, corroborating the so-called open-access citation advantage: accounting for age and discipline, OA articles receive 18% more citations than average, an effect driven primarily by Green and Hybrid OA. We encourage further research using the free oaDOI service, as a way to inform OA policy and practice.
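
The Gold/Green/Hybrid/Bronze distinction described above can be sketched as a simple decision rule following the paper’s category definitions. The boolean inputs below are hypothetical names for illustration, not the oaDOI data schema:

```python
def classify_oa(free_at_publisher: bool, fully_oa_journal: bool,
                open_license: bool, in_repository: bool) -> str:
    """Assign an OA category following the paper's definitions:
    gold   - published in a fully OA journal;
    hybrid - free under an open license in a subscription journal;
    bronze - free to read at the publisher, but with no open license;
    green  - toll-access at the publisher, with a copy in a repository;
    closed - none of the above."""
    if free_at_publisher:
        if fully_oa_journal:
            return "gold"
        return "hybrid" if open_license else "bronze"
    if in_repository:
        return "green"
    return "closed"
```

The Bronze category falls out of the rule naturally: an article free to read on the publisher site but lacking an explicit open license is neither Gold nor Hybrid, which is why the authors had to coin a new label for it.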

Cite as: Piwowar H, Priem J, Larivière V, Alperin JP, Matthias L, Norlander B, Farley A, West J & Haustein S (2018) The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles. PeerJ 6: e4375. https://peerj.com/articles/4375/

Source: The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles