University rankings: What do they really show? | SpringerLink

Author: Jill Johnes

Comment: This article highlights some issues with university rankings using correlation analysis and principal component analysis. It then suggests an alternative weighting method based on data envelopment analysis (DEA), which allows the weights to be optimised for each individual institution. DEA can further be used to derive performance groupings rather than the usual point estimates. The ideas in this article are closely related to some of the discussions we have already had in the research team. Perhaps we should consider representing/aggregating our data along these lines, but with a specified minimum weight for each indicator (see the sketch below)?
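
A minimal sketch of this kind of "benefit-of-the-doubt" DEA weighting, assuming each institution's indicators are already normalised to a common scale; the indicator values, the number of indicators and the 0.10 minimum weight are all invented for illustration and are not from the paper:

```python
# Benefit-of-the-doubt composite scores in the spirit of DEA: each institution
# gets the weights most favourable to it, subject to a common minimum weight
# per indicator. Illustrative sketch only; the indicator values are made up.
import numpy as np
from scipy.optimize import linprog

indicators = np.array([   # rows = institutions, columns = normalised indicators
    [0.9, 0.4, 0.7],
    [0.5, 0.8, 0.6],
    [0.6, 0.6, 0.9],
])
min_weight = 0.10         # hypothetical floor for every indicator weight

scores = []
for i in range(indicators.shape[0]):
    # maximise w . x_i  <=>  minimise -x_i . w
    res = linprog(
        c=-indicators[i],
        A_ub=indicators,                   # no institution may score above 1
        b_ub=np.ones(indicators.shape[0]),
        bounds=[(min_weight, None)] * indicators.shape[1],
        method="highs",
    )
    scores.append(-res.fun)

# Institutions scoring (close to) 1 form the top performance group, which
# yields groupings rather than a strict point ranking.
print(np.round(scores, 3))
```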

Abstract: University rankings as developed by the media are used by many stakeholders in higher education: students looking for university places; academics looking for university jobs; university managers who need to maintain standing in the competitive arena of student recruitment; and governments who want to know that public funds spent on universities are delivering a world class higher education system. Media rankings deliberately draw attention to the performance of each university relative to all others, and as such they are undeniably simple to use and interpret. But one danger is that they are potentially open to manipulation and gaming because many of the measures underlying the rankings are under the control of the institutions themselves. This paper examines media rankings (constructed from an amalgamation of variables representing performance across numerous dimensions) to reveal the problems with using a composite index to reflect overall performance. It ends with a proposal for an alternative methodology which leads to groupings rather than point estimates.

Source: University rankings: What do they really show? | SpringerLink

The Inevitability of Open Access | Lewis | College & Research Libraries

Author: David W. Lewis

Comment: This is an interesting article to contrast with the results we see for gold OA. Using business theory, the article predicted that gold OA would reach 50% between 2017 and 2021, and 90% by 2025. This of course contradicts the saturation (if real) that we see in gold OA outside Latin America.

Abstract: Open access (OA) is an alternative business model for the publication of scholarly journals. It makes articles freely available to readers on the Internet and covers the costs associated with publication through means other than subscriptions. This article argues that Gold OA, where all of the articles of a journal are available at the time of publication, is a disruptive innovation as defined by business theorist Clayton Christensen. Using methods described by Christensen, we can predict the growth of Gold OA. This analysis suggests that Gold OA could account for 50 percent of the scholarly journal articles sometime between 2017 and 2021, and 90 percent of articles as soon as 2020 and more conservatively by 2025.
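
Christensen-style substitution analysis assumes the share of the new model follows an S-curve, so a projection can be read off by fitting the log of the substitution ratio f/(1-f) linearly against time. A rough sketch under that assumption, with invented share values rather than Lewis's data:

```python
# Rough sketch of a Christensen-style substitution projection: fit the log of
# the substitution ratio f/(1-f) linearly against time and extrapolate.
# The share values below are invented for illustration, not Lewis's data.
import numpy as np

years = np.array([2006, 2008, 2010, 2012])
share = np.array([0.04, 0.06, 0.09, 0.13])      # hypothetical gold OA share

logit = np.log(share / (1 - share))
slope, intercept = np.polyfit(years, logit, 1)

def year_when(target_share):
    """Year at which the fitted S-curve reaches the target share."""
    return (np.log(target_share / (1 - target_share)) - intercept) / slope

print("50% reached around", round(year_when(0.50), 1))
print("90% reached around", round(year_when(0.90), 1))
```

The contrast raised in the comment then comes down to whether the observed gold OA share keeps following the fitted line or flattens out before the extrapolated crossover years.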

Source: The Inevitability of Open Access | Lewis | College & Research Libraries

Empirical analysis and classification of database errors in Scopus and Web of Science – ScienceDirect

Authors: Franceschini F, Maisano D & Mastrogiacomo L

Comment: This article studies various errors in the Scopus and Web of Science (WoS) databases, including citation indexing errors, missing links, missing DOIs, incorrect author names, etc. A manual check was done on a sample of the errors. After classifying them, the authors found that the distributions of errors across categories differ markedly between Scopus and WoS.

Abstract: In the last decade, a growing number of studies focused on the qualitative/quantitative analysis of bibliometric-database errors. Most of these studies relied on the identification and (manual) examination of relatively limited samples of errors.

Using an automated procedure, we collected a large corpus of more than 10,000 errors in the two multidisciplinary databases Scopus and Web of Science (WoS), mainly including articles in the Engineering-Manufacturing field. Based on the manual examination of a portion (of about 10%) of these errors, this paper provides a preliminary analysis and classification, identifying similarities and differences between Scopus and WoS.

The analysis reveals interesting results, such as: (i) although Scopus seems more accurate than WoS, it tends to forget to index more papers, causing the loss of the relevant citations given/obtained, (ii) both databases have relatively serious problems in managing the so-called Online-First articles, and (iii) lack of correlation between databases, regarding the distribution of the errors in several error categories.

The description is supported by practical examples concerning a variety of errors in the Scopus and WoS databases.
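
One simple way to formalise the claim that the two databases distribute their errors differently across categories is a chi-square test of independence on the category-by-database contingency table; the categories and counts below are invented, not the paper's figures:

```python
# Chi-square test of independence on a category-by-database error table.
# Counts and categories are invented for illustration.
import numpy as np
from scipy.stats import chi2_contingency

#                         citation  missing  missing  wrong
#                         indexing  link     DOI      author
scopus_counts = np.array([120,      45,      80,      30])
wos_counts    = np.array([ 60,      90,      20,      70])

table = np.vstack([scopus_counts, wos_counts])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p_value:.3g}")  # small p: distributions differ
```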

Source: Empirical analysis and classification of database errors in Scopus and Web of Science – ScienceDirect

Dimensions: A competitor to Scopus and the Web of Science? – ScienceDirect

Author: Mike Thelwall

Comment: This article compares samples of data from Scopus with Dimensions. It found that citation counts in Dimensions are in line with those obtained from Scopus. The slightly lower citation counts suggest that the coverage of Dimensions may not be much greater than that of Scopus, although this is not explicitly checked.

Abstract: Dimensions is a partly free scholarly database launched by Digital Science in January 2018. Dimensions includes journal articles and citation counts, making it a potential new source of impact data. This article explores the value of Dimensions from an impact assessment perspective with an examination of Food Science research 2008–2018 and a random sample of 10,000 Scopus articles from 2012. The results include high correlations between citation counts from Scopus and Dimensions (0.96 by narrow field in 2012) as well as similar average counts. Almost all Scopus articles with DOIs were found in Dimensions (97% in 2012). Thus, the scholarly database component of Dimensions seems to be a plausible alternative to Scopus and the Web of Science for general citation analyses and for citation data in support of some types of research evaluations.
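
A sketch of the kind of comparison behind these figures, assuming records are matched on DOI; the column names and toy data are assumptions, not Thelwall's actual datasets:

```python
# Match records from the two databases on DOI, then report coverage and the
# Spearman correlation of citation counts. Toy data for illustration only.
import pandas as pd
from scipy.stats import spearmanr

scopus = pd.DataFrame({
    "doi":   ["10.1/a", "10.1/b", "10.1/c", "10.1/d"],
    "cites": [12, 0, 5, 33],
})
dimensions = pd.DataFrame({
    "doi":   ["10.1/a", "10.1/b", "10.1/d"],
    "cites": [11, 1, 30],
})

# Normalise DOIs before matching, as case differences are common.
for df in (scopus, dimensions):
    df["doi"] = df["doi"].str.lower().str.strip()

merged = scopus.merge(dimensions, on="doi", suffixes=("_scopus", "_dim"))
coverage = len(merged) / len(scopus)            # share of Scopus DOIs found
rho, p = spearmanr(merged["cites_scopus"], merged["cites_dim"])
print(f"coverage = {coverage:.0%}, Spearman rho = {rho:.2f}")
```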

Source: Dimensions: A competitor to Scopus and the Web of Science? – ScienceDirect

Accuracy of affiliation information in Microsoft Academic: Implications for institutional level research evaluation

Authors: Ranjbar-Sahraei B, van Eck NJ & de Jong R

Comment: This is a summary of results for a poster presented at the STI 2018 Conference in Leiden. The work compares the research output recorded by Microsoft Academic (MA) and by Web of Science (WoS) for Leiden University. A first-level automated matching is done, revealing differences between MA and WoS. A sample of 100 publications is then drawn from each of the disagreeing parts of the comparison. Manual checking of these samples found that MA contains affiliation errors.

Abstract: In this work, we study the accuracy of affiliation information in Microsoft Academic (MA). To conduct this study, we have considered the full set of publications assigned to Leiden University (LU) as provided by two different data sources: MA and Web of Science (WoS). The results of this study suggest that a considerable number of publications in MA have missing or wrong affiliation information.
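
A sketch of the matching-and-sampling step described above, assuming DOI-based matching (the poster does not spell out the matching key); the publication sets are toy data:

```python
# Compare the two publication sets for one institution, then sample the
# disagreeing parts for manual checking. DOI matching is an assumption.
import random

ma_dois  = {"10.1/a", "10.1/b", "10.1/c"}     # publications MA assigns to LU
wos_dois = {"10.1/b", "10.1/c", "10.1/d"}     # publications WoS assigns to LU

only_in_ma  = ma_dois - wos_dois              # possible wrong MA affiliations
only_in_wos = wos_dois - ma_dois              # possible missing MA affiliations

sample_size = 100
ma_sample  = random.sample(sorted(only_in_ma),  min(sample_size, len(only_in_ma)))
wos_sample = random.sample(sorted(only_in_wos), min(sample_size, len(only_in_wos)))
print(len(only_in_ma), len(only_in_wos), ma_sample, wos_sample)
```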

Source: Accuracy of affiliation information in Microsoft Academic: Implications for institutional level research evaluation

The prevalence of green and grey open access: Where do physical science researchers archive their publications? | SpringerLink

Authors: Li Zhang & Erin Watson

Comment: This paper compares green OA with grey OA (archiving on academic social media or personal/departmental websites) for CIHR-funded research. Data are extracted from WoS, and Google Scholar is used to determine green and grey OA status. The prevalence of grey OA is highlighted, and the low uptake of green OA is shown not to be attributable to publisher policies, since most journals do allow green self-archiving. The takeaway is a suggestion to rethink how open access archiving is achieved, given the high costs of running an institutional repository.

Abstract: The Canadian Institute of Health Research (CIHR) implemented an open access policy for its grant recipients in 2008. We used bibliographic data from the Web of Science to find out how CIHR-funded researchers in the physical sciences self-archived their publications. We also examined the self-archiving policies of the journals in which the researchers published, and compared the citation rates of two different self-archiving approaches: the green open access route (deposit in an institutional or subject repository) and the grey open access route (deposit in an academic social network or personal/departmental website). Only 14% of the articles were openly accessible through the green open access route, while 37% could be accessed through the grey open access route. We cannot ascribe the low uptake of green open access to publishers’ self-archiving policies, as almost all journals allowed self-archiving through the green open access route. Authors deposited 31% of their publications in ResearchGate, the most popular self-archiving option in our study, while they deposited only 2.1% of their publications in institutional repositories, the least popular option. The citation rates of the various self-archiving approaches did not differ significantly. Our results suggest that it may be time to rethink how to achieve open access.
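
The paper does not name the test behind "did not differ significantly", but a non-parametric comparison such as a Kruskal-Wallis test is a common choice for skewed citation counts; a sketch with invented counts:

```python
# Compare citation counts across self-archiving routes with a Kruskal-Wallis
# test. The route groups and citation counts below are invented.
from scipy.stats import kruskal

green_citations  = [3, 7, 0, 12, 5]       # institutional / subject repository
grey_citations   = [4, 9, 1, 10, 6, 2]    # ResearchGate, personal websites
closed_citations = [2, 5, 0, 8]           # no self-archived copy found

stat, p_value = kruskal(green_citations, grey_citations, closed_citations)
print(f"H = {stat:.2f}, p = {p_value:.3f}")   # large p: no significant difference
```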

Source: The prevalence of green and grey open access: Where do physical science researchers archive their publications? | SpringerLink

Article processing charge (APC) for publishing open access articles: the Brazilian scenario | SpringerLink

Authors: Cleusa Pavan & Marcia C. Barbosa

Comment: This article provides a detailed literature review on OA publication in Brazil and tracks its progress over time, split across journals with different APC policies. Again, a caveat may be the sole use of Web of Science data. It is still interesting to note that, although the number of OA publications has increased (in absolute terms rather than percentages), the publication vehicles have also become increasingly endogenic (local), possibly driven by a lack of funds. The authors suggest that policies for funding APCs are required to increase international publications.

Abstract: The article processing charge (APC) provides economic sustainability for scientific journals that publish in open access (OA). In this work, documents published in OA between 2012 and 2016 by authors with Brazilian affiliation are identified, the profile of these publications is analyzed and the cost of APC is estimated. In order to do so, data from 930 journals and 63,847 documents were collected from the Web of Science Core Collection. It was found that 59% of these documents were published in journals that charge APC. The total expenditures for the 5-year period were estimated at approximately USD 36 million, the weighted average cost per document at USD 957.75 and the average cost per journal at USD 1492.27. The profile of these publications shows that journals indexed by SciELO represent 67% of the 63,847 documents. The use of mega-journals increased over the period, which implies an increase in expenditure in publications, since the average APC per journal was USD 2059.77. It was observed that the OA Brazilian scientific production is characterized by an endogenic profile and has a preference for the Gold road with APC. These results suggest that policies for funding charges are required to stimulate a more international attitude.
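
The two averages quoted here are easy to confuse: the per-document figure weights each journal's APC by its document count, while the per-journal figure is a plain mean over journals. A small sketch with invented APCs and counts:

```python
# Total APC expenditure, weighted average cost per document and average cost
# per journal. APC values and document counts are invented for illustration.
import pandas as pd

journals = pd.DataFrame({
    "journal": ["J1", "J2", "J3"],
    "apc_usd": [1500.0, 600.0, 2000.0],   # hypothetical APC per article
    "docs":    [200, 1200, 50],           # documents with Brazilian affiliation
})

total_spend = (journals["apc_usd"] * journals["docs"]).sum()
avg_per_document = total_spend / journals["docs"].sum()
avg_per_journal = journals["apc_usd"].mean()
print(total_spend, round(avg_per_document, 2), round(avg_per_journal, 2))
```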

Source: Article processing charge (APC) for publishing open access articles: the Brazilian scenario | SpringerLink

The History, Deployment, and Future of Institutional Repositories in Public Universities in South Africa – ScienceDirect

Author: Siviwe Bangani

Comment: Another interesting paper, about IRs in South Africa (SA). Web data were collected and interviews were conducted. A detailed history of IRs in SA is given. While many South African universities have signed various international declarations and initiatives on OA, they often do not have an institutional OA policy. Various factors (obstacles and enablers) are listed, and the amount of funding is relatively low compared to other countries. Varying IR sizes, the types of objects held in IRs, multilingual support and its issues, and suggestions for development are presented and discussed.

Abstract: This paper investigates the history, deployment, and content of institutional repositories (IRs) in public universities in South Africa. Some of the local, national and international drivers and enablers that ensure the establishment and survival of the institutional repositories are identified. Lastly, an attempt is made to determine the future of the IRs. Findings include that South African universities were among the first universities in the world to host IRs with the first IR established in 2000. The most prevalent and dominant content in South African public university collections are electronic theses and dissertations (ETDs). There are signs that this is changing as more libraries cover research outputs emanating from the universities. African languages are sparsely represented in IRs in South Africa. The majority of universities in the country signed the Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities, and the Budapest Open Access Initiative. Many of them do not have their own open access policy. The driving factors include the decline in government subsidy, increase in journal subscriptions, depreciation of the South African currency, and addition of the Value Added Tax (VAT) of 14% on electronic resources by the South Africa taxman while the enabling factors include the international open access mandates, the Carnegie Foundation grants, and the National Research Foundation’s statement on open access.

Bangani S (2018) The History, Deployment, and Future of Institutional Repositories in Public Universities in South Africa. The Journal of Academic Librarianship 44(1): 39-51.

Source: The History, Deployment, and Future of Institutional Repositories in Public Universities in South Africa – ScienceDirect

Institutional Repositories in Chinese Open Access Development: Status, Progress, and Challenges – ScienceDirect

Authors: Jing Zhong & Shuyong Jiang

Comment: An interesting paper interrogating institutional repositories (IRs) in China. These IRs were accessed via ROAR, OpenDOAR, SouOA and CHAIR, though many URL links were broken. The article highlights the slow development of OA repositories in China and attributes this to the lack of policy and support at all levels. At the end, the article mentions that the Chinese Academy of Sciences and the National Natural Science Foundation of China released an Open Access policy statement in May 2014, requiring that their funded research papers be made open access in IRs within 12 months of publication. It would be interesting to follow up on whether this has made any significant impact.

Abstract: Open Access (OA) movement in China is developing with its own track and speed. Compared to its western counterparts, it moves slowly. However, it keeps growing. More significantly, it provides open and free resources not only to Chinese scholars, but also to those of China studies around the world. The premise is whether we can find them in an easy and effective fashion. This paper will describe the status of the OA movement in China with a focus on institutional repositories (IR) in Chinese universities and research institutes. We will explore different IR service modules and discuss their coverage, strengths, limitation, and most importantly implications to the East Asian Collection in the US.

Zhong J & Jiang S (2016) Institutional Repositories in Chinese Open Access Development: Status, Progress, and Challenges. The Journal of Academic Librarianship 42(6): 739-744.

Source: Institutional Repositories in Chinese Open Access Development: Status, Progress, and Challenges – ScienceDirect

Can Microsoft Academic help to assess the citation impact of academic books?

Authors: Kousha K & Thelwall M

Comment: This article compares the coverage and citations of Microsoft Academic (MA) with the Book Citation Index (BKCI) and Google Books. It shows that, while MA's coverage of books is still not comprehensive, it finds more citations in some fields than the other two sources. In particular, it has greater coverage than BKCI in some Arts & Humanities fields (though in general it is still biased towards the technical fields). MA also seems less sensitive to book editions. MA's comparison with Google Books gave mixed results, with each better than the other in different fields, suggesting that they have partly complementary coverage.

Abstract: Despite recent evidence that Microsoft Academic is an extensive source of citation counts for journal articles, it is not known if the same is true for academic books. This paper fills this gap by comparing citations to 16,463 books from 2013-2016 in the Book Citation Index (BKCI) against automatically extracted citations from Microsoft Academic and Google Books in 17 fields. About 60% of the BKCI books had records in Microsoft Academic, varying by year and field. Citation counts from Microsoft Academic were 1.5 to 3.6 times higher than from BKCI in nine subject areas across all years for books indexed by both. Microsoft Academic found more citations than BKCI because it indexes more scholarly publications and combines citations to different editions and chapters. In contrast, BKCI only found more citations than Microsoft Academic for books in three fields from 2013-2014. Microsoft Academic also found more citations than Google Books in six fields for all years. Thus, Microsoft Academic may be a useful source for the impact assessment of books when comprehensive coverage is not essential.
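
A sketch of the per-field comparison behind these figures: the share of BKCI books found in MA, and the ratio of MA to BKCI citation counts for books indexed by both. Field names and counts are invented:

```python
# Per-field MA coverage of BKCI books and MA/BKCI citation ratios for books
# indexed by both sources. Toy data for illustration only.
import pandas as pd

books = pd.DataFrame({
    "field":      ["History", "History", "Engineering", "Engineering"],
    "in_ma":      [True, False, True, True],
    "bkci_cites": [2, 1, 10, 4],
    "ma_cites":   [5, None, 22, 9],        # None where MA has no record
})

coverage = books.groupby("field")["in_ma"].mean()      # share of books found in MA
both = books[books["in_ma"]]
ratio = (both.groupby("field")["ma_cites"].sum()
         / both.groupby("field")["bkci_cites"].sum())  # MA/BKCI citation ratio
print(coverage, ratio, sep="\n")
```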

Kousha K, Thelwall M (2018) Can Microsoft Academic help to assess the citation impact of academic books? arXiv.org: arXiv:1808.01474v1.

Source: Can Microsoft Academic help to assess the citation impact of academic books?