South African Journal of Libraries and Information Science

On-line version ISSN 2304-8263
Print version ISSN 0256-8861

SAJLIS vol.84 n.2 Pretoria  2018

http://dx.doi.org/10.7553/84-2-17621 

RESEARCH ARTICLES

 

Navigating the rising metrics tide in the 21st century: Which way for academic librarians in support of researchers in sub-Saharan Africa?

 

 

Omwoyo Bosire Onyancha

Research Professor in the Department of Information Science, University of South Africa. onyanob@unisa.ac.za. ORCID: orcid.org/0000-0002-9232-4939

 

 


ABSTRACT

Academic librarians in sub-Saharan Africa, just like their counterparts in the rest of the world, are under pressure to provide relevant information services to satisfy varied client needs. Research administrators, grant offices, researchers, students and university managers, among others, are increasingly turning to academic librarians for information to meet their myriad research-related information and metrics needs. Sub-Saharan African countries' emphasis on strengthening science, technology and innovation has exerted further pressure on information professionals and other stakeholders to develop and appropriately apply bibliometrics and altmetrics to inform decision-making and policy-formulation processes. Research output and impact metrics are not only evolving quickly; their volume is also unprecedented. This paper focuses on the application of bibliometrics and altmetrics by academic librarians to support researchers in their respective universities. The paper argues that there is a need to up-skill or re-skill the 21st-century sub-Saharan African academic librarian to select and apply appropriate metrics, tools and strategies to support researchers as they execute their responsibilities. The paper offers possible areas of training for academic librarians.

Keywords: Research metrics, academic librarians, sub-Saharan Africa, bibliometrics, altmetrics, research support


 

 

1 Introduction and contextual setting

Why I think that the ordinary librarian should remain involved in the bibliometrics scene: if we can understand bibliometric measures and significant developments in the field then not only will we be able to pass knowledge on to our user community, but it is also a sign that such measures can be understood by all academics who might need to understand them (Delasalle 2017).

The work of academic librarians has evolved over time from being merely that of custodians of books to being active participants in user-centred services (Rice-Lively & Racine 1997; Deegan & Tanner 2002; Olivia 2007; Borgman 2010; Zenke 2012). The changing roles can be attributed to the transformative nature of universities (Rice-Lively & Racine 1997: 32) and/or the emergence of information and communication technologies (ICTs) (Melchionda 2007; Goetsch 2008; Obegi & Nyamboga 2011). Klobas predicted in 1999 that, in the electronic service context, academic librarians would become more prominent as educators, information managers, information management consultants, custodians of information, information providers and publishers, and change agents (Klobas 1999). Of late, academic librarians' support services for research, on the one hand, and for researchers, on the other, have taken centre stage. We have witnessed increased engagement of individuals with knowledge of bibliometrics in supporting research in academic libraries through, for example, the provision of information on research impact and on journals in which authors can publish, and the provision of user education programmes, workshops and seminars on a variety of topics related to bibliometrics. Job advertisements seeking individuals with bibliometrics knowledge, skills and competencies have become common. A simple internet search for job opportunities for individuals with knowledge of bibliometrics yielded the results indicated in Table 1. It is telling that most of the job opportunities in bibliometrics are placed within the library. The responsibilities for these jobs can be summarised as follows:

providing outreach and instructional services related to data management, data literacy, bibliometrics (for example, altmetrics, journal metrics) and other scholarly communication topics;

performing advanced bibliometric and scientometric analyses to provide researchers and administrators with an in-depth understanding of scholarly productivity and emerging indicators of research impact;

collecting, maintaining and analysing in-depth editorial and bibliometric data to report on journal and portfolio performance, collaborating with cross-functional teams as needed; creating and maintaining databases to produce reports on editorial performance; and

performing bibliometric and citation analysis for divisions and labs to help them identify research strengths, benchmark performance, inform research strategy development, locate potential collaborators, prioritise publications to read and demonstrate scholarly impact.

The required bibliometrics-related education and qualifications include:

familiarity with bibliometric tools;

proficiency with bibliographic and citation databases;

demonstration of intermediate to advanced proficiency in Microsoft Excel or other data analysis and visualisation tools;

knowledge of the scholarly publishing landscape;

experience with Endnote, ProCite, Zotero or similar bibliographic database; and

knowledge and experience in research evaluation and unique author identifier systems.

Appendix A provides additional responsibilities and educational qualifications.

 

2 Researchers' information and metrics needs in the 21st century

Before we explore the areas of bibliometrics and altmetrics support for researchers, let us examine researchers' needs in the 21st century. It is widely acknowledged that the changing information needs of users (Kebede 2002; Abubakar 2011) have also been partly instrumental in the changing role of academic librarians (see Mamtora 2013). In a study conducted by Wolff (2016), almost three quarters of university staff in selected universities in Canada believed that the primary responsibility of their libraries was to facilitate access to scholarly materials. Wolff (2016) further found that the university staff in the surveyed universities considered the library an important resource in their efforts to improve their scholarly output and even their impact.

A survey investigating how to bridge the librarian-faculty gap in academic libraries found that 84% of researchers reported that "supporting faculty research" was an essential service that libraries should provide to students and faculty members (Library Journal and Gale CENGAGE Learning 2015). By comparison, only 79% of the librarians felt that "supporting faculty research" was an essential service. Whether this finding implies that academic librarians do not understand the research needs of the faculty, or simply that the librarians are more concerned with supporting students than with supporting faculty members, is a subject that requires further research. Nevertheless, it is promising that most librarians viewed supporting faculty research as one of the essential services, although it was not their top priority.

Besides the need for scholarly materials, library users, including researchers, need central locations (libraries) to "connect with and learn from one another, create and remix, display and discuss their work, and capture and preserve community knowledge" (Andrews et al. 2016: 145). It follows, therefore, that researchers need a library that provides an environment and support services geared towards improving their research activities, including support regarding impact metrics. It has been observed that academics have become increasingly aware of the need to demonstrate the wider impacts of their research (Corrall, Kennan & Afzal 2013; Thelwall 2014; Ravenscroft et al. 2017). Harnad (2001), Harnad, Carr and Brody (2001), Beamish (2006) and Alzahrani (2010: 22) have stated that authors of research articles write and publish for "research impact".

It is safe to conclude, therefore, that researchers are interested not only in the means of maximising their research impact, but also in the metrics that proxy research impact. Consequently, researchers want to know, for example, the journal in which to publish their research for maximum impact. According to Housewright, Schonfeld and Wulfson (2013: 71), researchers prefer journals that are circulated widely and are well read by academics in their fields; journals covering areas immediate to their areas of research; journals with high impact factors; journals that permit academics to publish articles for free, without paying page or article charges; and journals that publish accepted manuscripts with relatively little delay. Mamtora (2013) indicated that new researchers place a high value on keeping up with their research topic, setting up alerts in databases, finding journals and uploading their research outputs into the institutional repository. For their part, established researchers' needs largely revolve around keeping up with their research topic, improving their skills in finding journals in which to publish, and identifying open access journals. Mamtora (2013) further added that contemporary researchers need to be research-information literate and have a good understanding of the research life cycle and good access to information resources. Auckland (2012: 13) opined that researchers' needs differ in "relation to their discipline and/or subject and its culture and praxis, and the stage of their career".

Researchers' eagerness to demonstrate wider impact of their research means that their needs will extend to include knowledge of which tools to use to maximise research visibility; various types of metrics that are used to assess output and impact; sources of bibliometrics/scientometrics data; and how to read, interpret and use different metrics. This brings us to the next topic of discussion, namely metrics support.

 

3 Why academic librarians?

Delasalle (2011: 15) has argued that "librarians have skills and experience in evaluating the quality of information resources, and these are very relevant to researchers whose work is to be evaluated, as well as to those proposing to evaluate research". For their part, Gumpenberger, Wieland and Gorraiz (2012: 175) have outlined several reasons to explain why bibliometrics is an ideal field of activity for modern academic librarians, as follows:

  • Librarians know how to use the major databases efficiently and have access to many of these and their built-in analytical tools.

  • Librarians have experience in data gathering and cleaning as well as in coding and categorising diverse types of documents and, as such, they are able to extract meaningful information for further interpretation.

  • Librarians belong to independent and interdisciplinary institutions. In this position, they provide central services tailored to scientists, research managers and science policy makers.

  • Librarians not only have the opportunity to create and implement a wealth of new services, but can also contribute to the global discipline-specific scientific discourse by participating in projects and collaborations, by attending and organising conferences, and by actively publishing relevant findings.

Furthermore, bibliometrics is a subfield within the broad field of library and information science (Gumpenberger, Wieland & Gorraiz 2012), thus fitting appropriately within the duties and responsibilities of the academic librarian. Roemer and Borchardt (2015b: 31) have added that librarians are already familiar with providing support for bibliometrics tools such as the Web of Science citation indexes. As a result, librarians "serve as natural leaders when it comes to altmetrics, not only due to familiarity with resources, but also because of the relationships they maintain with several disparate groups ... librarians serve as a neutral voice and advocate on behalf of the needs of their community" (Roemer & Borchardt 2015b: 31).

 

4 Bibliometrics and informetrics in academic libraries in sub-Saharan Africa

What is the status of bibliometrics and informetrics in academic libraries in sub-Saharan Africa? This is an area that requires further research, but we can safely say that there have been attempts on the part of libraries and librarians in developed countries to embrace bibliometrics and altmetrics to support researchers in universities (Corrall, Kennan & Afzal 2013). In developing countries, however, there is limited activity in bibliometrics and altmetrics. One of the fundamental steps that academic libraries in the developed world have taken is the development of online instructional materials (commonly called LibGuides) on measuring impact through bibliometrics and altmetrics approaches (Roemer & Borchardt 2015a).

Some academic libraries in sub-Saharan Africa have followed suit and are increasingly becoming engaged in bibliometrics activities (for example, the provision of workshops on the topic). One example is the University of Cape Town (UCT) Libraries, which conducts a series of workshops under the banner 'Savvy Researcher' in which researchers and students are taken through various resources that will assist them in deciding where to publish, in finding the most important journals in a field and the most important papers on a topic, and in learning about the impact of a paper, researcher or department. The libraries of the University of Zululand, the University of Pretoria and Stellenbosch University are among several libraries in South Africa that offer some training on bibliometrics for their information users.

A quick scan of the websites of library schools in sub-Saharan Africa reveals that bibliometrics is absent from most formal curricula taught in library and information science (LIS) schools in the region, and therefore most practising librarians in the region possess limited or no knowledge of informetrics/bibliometrics. Yet, librarians are often called upon to apply these methods and techniques to support researchers and other stakeholders.

So, which way for academic librarians in their pursuit of supporting researchers, research administrators, university administration and other parties interested in research metrics to make informed decisions at universities in sub-Saharan Africa? It is my conviction that academic librarians are required to possess knowledge, skills and expertise in the following areas:

different metrics used to evaluate and measure research output and impact;

sources of bibliometrics, webometrics and altmetrics data; and

support services for researchers in the areas of bibliometrics, webometrics and altmetrics.

A brief explanation and description of each of the abovementioned areas is provided below.

4.1 Metrics used to evaluate and measure research output and impact

Wilsdon and Belfiore (2015: iii) have noted that the metric tide is rising fast and rushing through higher education and research worldwide. A research evaluation metric can be defined as a measuring system that quantifies research output, trends, dynamics or characteristics (Schouten 2010). The term 'research metric' is sometimes used synonymously with 'research indicator', although there are distinct differences between the two. It is worth mentioning that these metrics are many and vary depending on the individuals or institutions that originated them.

Research metrics can be categorised depending on the level at which they are applied (for example, article-, author-, journal-, institutional- or country-level) or format of the metrics (for example, citation-, publication-, web- or social media-based metrics). Andras (2011) outlined research metrics as constituting: a) publication count metrics; b) citation count metrics; c) metrics derived from citation and authorship graphs; and d) market share metrics. Metrics that are often employed to measure scholarly output include the following:

the number of journal articles, conference papers, books and patents a particular author, group of authors, institution and/or country/geographic region produced over a given period;

the number of publications/documents and patents produced on, for example, a given topical issue, in a discipline, or by a country/geographic area;

papers co-authored by a group of authors;

domestic, regional and international collaboration activity;

publications and document types published in a journal or in conference proceedings;

the number of languages in which documents are published;

the frequency of word occurrence and/or co-occurrence in a text;

the number of documents published per subject area;

the relative productivity of a region compared to the rest of the world;

the outputs in top percentiles; and

the length of publications in terms of pages or number of words.
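Several of the output measures listed above reduce to simple aggregations over publication records. The following sketch is illustrative only: the record fields and sample data are hypothetical, and real counts would be drawn from a source such as Scopus, WoS or an institutional repository.

```python
from collections import Counter

# Hypothetical sample records; real data would come from a bibliographic
# database export. Field names are invented for illustration.
publications = [
    {"author": "A", "country": "KE", "year": 2016, "type": "article"},
    {"author": "A", "country": "KE", "year": 2017, "type": "article"},
    {"author": "B", "country": "ZA", "year": 2017, "type": "conference"},
    {"author": "B", "country": "ZA", "year": 2018, "type": "article"},
    {"author": "C", "country": "NG", "year": 2018, "type": "book"},
]

# Output per author over a given period (here 2017-2018)
per_author = Counter(
    p["author"] for p in publications if 2017 <= p["year"] <= 2018
)

# Output per country/geographic region
per_country = Counter(p["country"] for p in publications)

# Output per document type
per_type = Counter(p["type"] for p in publications)

print(per_author)   # author B leads with two papers in the period
print(per_country)
print(per_type)
```

The same pattern extends to counting by subject area, language or journal: each metric is a different grouping key over the same records.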

Citation-based metrics that are often used to measure research impact (hence the phrase 'impact metrics') are perhaps the most researched and emotive. There are varieties and variants of impact metrics, which are applied at different levels:

journals only (for example, journal impact factor, journal immediacy index, cited half-life, citing half-life, SCImago journal rank);

author only (for example, RG score, h-index, g-index, i10-index); some or all of these metrics are applied to analyse the impact of other units such as journals and institutions;

article level only (for example, article influence score);

all units (for example, citation counts, self-citation counts, cited/uncited papers ratio, cites per item, Eigenfactor score, field-weighted citation impact, Scopus SNIP).
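To illustrate how some of the author-level metrics above are derived, the sketch below computes the h-index (the largest h such that the author has h papers with at least h citations each), the g-index (the largest g such that the top g papers together attract at least g² citations) and the i10-index (the number of papers with at least ten citations) from a list of citation counts. The sample counts are invented for illustration.

```python
def h_index(citations):
    """Largest h such that h papers each have at least h citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g papers together have >= g**2 citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(cites, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def i10_index(citations):
    """Number of papers with at least 10 citations (Google Scholar's i10)."""
    return sum(1 for c in citations if c >= 10)

cites = [25, 8, 5, 3, 3, 1, 0]  # hypothetical per-paper citation counts
print(h_index(cites))    # 3: three papers with at least 3 citations each
print(g_index(cites))    # 6: top 6 papers hold 45 citations, 45 >= 36
print(i10_index(cites))  # 1: only one paper has 10 or more citations
```

The example shows why the g-index exceeds the h-index for the same author: it rewards a few highly-cited papers that the h-index ignores.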

In 2010, Priem, Groth and Taraborelli coined the term "altmetrics", which introduced additional (alternative) metrics into the assessment of scholarly communication and research (see Priem, Groth & Taraborelli 2012). Alternative metrics (altmetrics) are defined as the "study and use of scholarly impact measures based on activity in online tools and environments" (Priem, Groth & Taraborelli 2012: 1). Thelwall (2014), in his post on the London School of Economics (LSE) blog, provided what he calls a "partial list of alternative metrics that have been proposed for academic articles or books". The metrics include tweet citations, Facebook post citations, blog citations, general web citations, grey literature citations, online syllabus citations, online presentation syllabuses, discussion forum citations, mainstream media citations, Mendeley, CiteULike or Zotero bookmark counts, library holdings (of books) and citations from Google Books. Altmetrics are categorised into usage (such as HTML views, PDF and XML downloads, and frequency of data access); captures (such as bookmarks in CiteULike and Mendeley, and Mendeley reader counts); mentions (such as in blog posts, news stories, Wikipedia articles, comments and reviews); social media (such as tweets, Facebook posts and likes, and shares on LinkedIn); and citations (such as article citations in online platforms such as Scopus, CrossRef and PubMed Central). For their part, Hoffman and Fodor (2010), as cited in Holmberg (2016), categorised altmetrics into those indicating awareness (for example, unique visits, views, followers, reviews), engagement (for example, number of members, comments, replies, likes, subscribers, active users) and word-of-mouth (how consumers communicate among themselves through, for example, the number of retweets, inlinks and bookmarks).

The webometrics and cybermetrics indicators are many. They include the number of citations from top authors and the number of papers among the top 10% most-cited papers (see Ingwersen 1998; Thelwall 2004; Noruzi 2006; Thelwall 2009; Thelwall 2010). The Webometrics Ranking of World Universities (WRWU) takes four aspects into consideration when assessing each university's performance on the web. Two aspects are specific to research, namely transparency/openness (the number of citations from top authors) and excellence (the number of papers among the top 10% most cited in twenty-six disciplines, using SCImago Journal and Country Rank). Knowledge of these metrics, and of how the various ranking systems operate, is essential if academic librarians are to offer the necessary support. For example, the WRWU places the most emphasis on the visibility/impact ranking of universities (50% weighting), whereby external networks are used as proxies; a 30% weight is placed on excellence/scholarship. It follows, therefore, that universities doing well in terms of external networks, as well as having a greater presence in Scopus, will perform much better than universities that are more highly ranked in terms of Google presence (transparency/openness). It should, however, be noted that different global ranking systems use different weights to rank universities.
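The effect of such weightings can be illustrated with a simple composite score. This is only a sketch: the actual WRWU methodology combines indicator ranks rather than raw scores, only the 50% visibility and 30% excellence weights are stated above, and the remaining 20% split and all the university scores below are hypothetical assumptions made for illustration.

```python
# Hypothetical weights: visibility and excellence follow the weights
# mentioned in the text; the 10%/10% split of the remainder is assumed.
WEIGHTS = {"visibility": 0.50, "excellence": 0.30,
           "transparency": 0.10, "presence": 0.10}

def composite_score(scores):
    """Weighted sum of normalised (0-1) indicator scores."""
    return sum(WEIGHTS[k] * scores.get(k, 0.0) for k in WEIGHTS)

# University A: strong external networks and Scopus-indexed output.
uni_a = {"visibility": 0.9, "excellence": 0.8,
         "transparency": 0.4, "presence": 0.5}
# University B: strong Google presence (transparency) but weaker networks.
uni_b = {"visibility": 0.4, "excellence": 0.3,
         "transparency": 0.95, "presence": 0.9}

print(composite_score(uni_a))  # approximately 0.78
print(composite_score(uni_b))  # approximately 0.48: A ranks higher
```

Because half the weight sits on visibility, University A outranks University B despite B's much stronger transparency score, which is the point made in the text about external networks and Scopus presence.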

4.2 Sources of bibliometrics, webometrics and altmetrics data

The second aspect with which academic librarians are called upon to familiarise themselves is the sources of bibliometrics data. Since the official launch of the Science Citation Index (SCI) in 1964, the number of sources of bibliometrics data has proliferated, culminating in the development of Web 2.0-based platforms that provide altmetrics data. The then Institute for Scientific Information (ISI), later part of Thomson Reuters and now Clarivate Analytics, provided the scientific community with three citation indexes, namely the Science Citation Index (SCI), the Social Sciences Citation Index (SSCI) and the Arts and Humanities Citation Index (AHCI). Later, Web of Science (WoS) introduced two further citation indexes, one for conference proceedings and one for books. Currently, WoS provides a platform for accessing a variety of other sources of bibliometrics data. Clarivate Analytics' Journal Citation Reports (JCR) provides metrics associated with the citation performance of journals indexed in the Clarivate Analytics citation indexes. Citation tracking can also be performed using Elsevier's Scopus, Google Scholar, Publish or Perish (which draws its data from Google Scholar), EBSCO's citing articles feature, IEEE Xplore, the ACM Digital Library, JSTOR, SciFinder, CiteSeer, MathSciNet, CiteULike, the Indian Citation Index, the Chinese Citation Database (CCD) and SCImago. The most commonly used citation data sources worldwide, however, are WoS, Scopus and Google Scholar. WoS and Scopus require a paid subscription and are therefore not available in most libraries in sub-Saharan Africa. Most bibliographic databases (such as the EBSCO-hosted databases) provide data that can be used to measure research output. Institutional repositories may also be used to showcase the research outputs of individual authors, institutions and even countries.

One development that may interest academic librarians in sub-Saharan Africa concerns the establishment of the African Citation Index (ACI), produced by the Council for the Development of Social Science Research in Africa (CODESRIA). This source is likely to provide a paradigm shift in the measurement of research output and impact in Africa due to its specific focus on research produced on the continent. The ACI's mission is "to be a worldwide authoritative and objective source for gauging research evidence in Africa" while its vision is "to organise and monitor research performance and production in Africa". CODESRIA envisages that the ACI will enable stakeholders such as research funders, academic institutions, government offices, researchers and bibliometrics scholars, among others, to do the following:

monitor the use and users, sources and characteristics of African scientific publications;

establish the visibility and productivity of scientists, disciplines, institutions and countries based on high-quality content;

know the core competencies and paradigmatic shifts of knowledge in Africa;

understand the pattern of knowledge exchange and balance of intellectual influence in Africa and internationally;

map scholarly areas and fields to provide information about relevance and spread of research endeavours; and

study the evolution of research/science (CODESRIA 2016).

In terms of altmetrics, several sources of data have emerged since the coining of the term in 2010. Roemer and Borchardt (2015c) provide a list of the major altmetrics tools. Non-academic tools include Facebook, Twitter, YouTube, Amazon, Goodreads, SlideShare and GitHub. Academic tools and peer networks comprise institutional repositories, CiteULike, Mendeley, Academia.edu, ResearchGate and Social Science Research Network (SSRN). Altmetrics harvesting tools include Altmetric.com, ImpactStory, PlumX and Kudos. Each one of these tools provides some unique metrics and most are freely accessible if one has access to the internet.

Webometrics data can be obtained from several web-based sources, including search engines (for example, Google and Yahoo), web analytics services (for example, Alexa) and the Webometrics Ranking of World Universities (WRWU). The WRWU draws its data from Google, Ahrefs or Majestic, Google Scholar and SCImago. A number of web crawlers can also be used to obtain the various metrics associated with the web. Users of search engines are advised to explore how each search engine can be used to obtain webometrics data, as each provides unique features.

4.3 Support services for researchers in the areas of bibliometrics, webometrics and altmetrics

The need for libraries to support research and researchers in sub-Saharan Africa has never been greater. Juxtaposed with the increased demand for accountability on the part of researchers and institutions of higher learning is the provision of timely, accurate and quality data to inform decision-making processes on research-based activities and programmes such as grant seeking, academic promotions, and employment and ranking. There are calls for libraries and librarians to participate more actively, not only in conducting research on metrics, but also in providing metrics data to various stakeholders so that they can make informed decisions.

It is well acknowledged that academic libraries exist to support the core activities of parent institutions, namely teaching and research (Kennan, Corrall & Afzal 2014; Namuleme & Kanzira 2015; Tancheva et al. 2016). Daland and Hidle (2016) have contextualised the phrase "research support" within the library segment to refer to "library services that could increase the efficiency of research" (Daland & Hidle 2016: 63). Similarly, Parker (2012), as cited in Namuleme and Kanzira (2015: 31), has defined research support as "services and facilities which assist in increasing research productivity and scholarship". The services that academic libraries can offer to their users, according to Namuleme and Kanzira (2015), can be classified into two categories, namely, traditional and "new" support services. The former category, according to the authors, would include collection development, selective dissemination of information, current awareness services, information literacy training and open access publishing, while the "new" support services comprise bibliometrics and systematic reviews.

Kostos (2016) observed that librarians are an integral part of scientific research although their support for research is not always obvious, nor is it properly appreciated. The author proceeded, nevertheless, to outline the role that librarians play in research as well as the resultant benefits. The author stated that academic librarians can support research through collection development of what the author calls "strong research materials", providing researchers with catalogues of research materials and flexible physical research spaces, managing institutional repositories, cultivating close relationships with researchers and providing them with researcher-focused services.
Jaguszewski and Williams (2013) report that liaison librarians view the following as their main support services for research: assisting researchers who are branching out into new disciplines but are unfamiliar with key articles, core journals and potential collaborators; identifying discipline-specific researchers to enhance collaboration; creating faculty profiles; information discovery, management, creation and dissemination; data management, including data curation; and education and consultation services for personal information management.

In terms of bibliometrics and altmetrics support, Gumpenberger, Wieland and Gorraiz (2012) have noted that there are several bibliometrics-related areas which a bibliometrics department (either attached to the library or otherwise) can pursue in support of research and researchers at universities. These activities, some of which can apply to academic libraries, include teaching/training, consultancy, expert analyses, organisation of events, development of partnerships and projects, and scientific output. Roemer and Borchardt (2015a) considered the following to be some of the means through which librarians have positioned themselves to "become major players in the development of the field" of bibliometrics:

library collections - for instance, the application of Bradford's law to make decisions on journal selection;

institutional repositories - repositories, besides offering an open access platform for disseminating research, generate valuable metrics that can be used to measure scholarly impact;

relationships with academic populations - academic librarians' relationship with the rest of the university community transcends disciplinary and administrative boundaries. Librarians have a set of relationships with researchers, administrators, students and publishers; relationships that can create trust which in turn can be leveraged via outreach to highlight complex topics that require the input of disparate populations; and

scholarly and professional knowledge.

Academic librarians' knowledge of scholarly communication and the LIS profession gives them a head start on scholars who are yet to acknowledge the value of impact metrics. Roemer and Borchardt (2015b) added that librarians can play an important role in terms of evaluation of, acquisition of and providing access to appropriate bibliometrics and altmetrics tools, and actively engaging in outreach, training and marketing of these tools. Finally, librarians can communicate and advocate for bibliometrics and altmetrics with various stakeholders, including faculties and researchers, graduate and undergraduate students, administrators, publishers and toolmakers.

Academic librarians in sub-Saharan Africa should be able to carry out basic bibliometrics-related activities to support research and researchers. For instance, they could:

create awareness of different metrics-yielding tools and the metrics used to measure research performance among academics;

identify and suggest the most relevant and high-quality journals in which authors can publish their papers. Librarians can also provide explanations on why certain journals are not suitable places to publish (for example, predatory and hijacked journals);

provide information on the type of metrics that can be used for specific research-related purposes;

explain the meaning of the most important metrics that are appropriate for research evaluation of individual authors, for example, h-index, RG score, g-index, scientific output, citations count;

explain the meaning of bibliometrics and other methodologies associated with bibliometrics;

conduct and publish bibliometrics studies to contribute to the body of knowledge in the subject domain;

advise researchers and the management teams in charge of research in their respective universities on matters such as how to maximise the impact of research;

point academic staff to appropriate mainstream citation indexes, databases and any other sources of bibliometrics and altmetrics data;

organise and conduct workshops, seminars or discussion forums on bibliometrics for academics, postgraduate students and research administrators;

purchase and maintain relevant tools such as WoS, JCR and Scopus;

point out the most important papers on a topic to postgraduate students; and

systematically collate and record research outputs of all researchers in a department, school, or college/faculty. This can be done in partnership with the research administration units within the university.
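Two of the author-level indicators named above, the h-index and g-index, can both be derived from a simple list of per-paper citation counts. The sketch below uses hypothetical counts (not drawn from this paper) and implements the standard definitions of the two indices:

```python
def h_index(citations):
    """h-index: the largest h such that h papers have at least h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
    return h


def g_index(citations):
    """g-index: the largest g such that the top g papers together have >= g*g citations."""
    g, total = 0, 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g


# Hypothetical citation counts for one author's seven papers
citations = [25, 8, 5, 3, 3, 1, 0]
print(h_index(citations))  # 3 -> three papers each cited at least 3 times
print(g_index(citations))  # 6 -> top 6 papers hold 45 citations, and 45 >= 36
```

A librarian explaining these metrics can point out how the g-index rewards a few highly cited papers (here the paper with 25 citations) in a way the h-index does not.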

 

5 Strategies to empower academic librarians to navigate the metrics tide

What strategies can academic librarians use to navigate the metrics tide? There are several ways in which academic librarians in sub-Saharan Africa can navigate the 21st-century metrics tide as they apply bibliometrics and altmetrics to support research in their parent organisations. This paper highlights seven fundamental strategies.

One of the strategies is for academic librarians in sub-Saharan Africa to adopt and use bibliometrics and altmetrics sources and tools that are freely accessible, and to bring these to the attention of researchers and other interested parties. It is well acknowledged that most academic libraries in sub-Saharan Africa are underfunded (see Ishola 2014); hence they have not been able to subscribe to major sources of bibliometrics data such as WoS and Scopus. Some of the freely available and commonly used sources of citation data, notwithstanding their shortcomings, include Google Scholar, ResearchGate, Mendeley and Academia.edu. There are also many free data analysis and visualisation tools - examples are BibExcel, UCINET, VOSviewer, Pajek, NetDraw, CiteSpace, TI, TextStat and Sitkis - which must be selected carefully, as each performs a different set of analyses and visualisations.

Academic librarians may also want to initiate peer-to-peer bibliometrics programmes within their libraries (see Sidorko & Yang 2008; Delasalle 2011: 16). The peer-to-peer bibliometrics programmes would provide researchers with a forum to learn from one another, discuss research-related topics and appropriate metrics and tools to measure research output and impact, and plan research events. In addition, librarians may use the programme to provide one-on-one advisory services on bibliometrics and altmetrics. The advisory services can include issues such as where to publish and how to compute and interpret various research metrics.

Academic librarians are strongly encouraged to revise their library instruction programmes to include not only the measurement of research output and impact but also statistical literacy. Traditionally, library instruction programmes adopted what Hollister and Coe (2003) term a "one-hour/one-shot class" and/or the "lecture and demonstration method", in which the services included exposure to academic tools (such as search strategies, database use, citation styles, the library catalogue, evaluating internet resources, finding journal articles, call numbers, reference tools and bibliographic management), aspects of information literacy (for example, information ethics, intellectual property and organising information) and orientation to library resources and services (for example, library tours, library reserves and career guides). Appendix B highlights possible areas of library instruction in bibliometrics and altmetrics; new library instruction programmes may include some or all of these areas. Training researchers on relevant issues pertaining to research dissemination and the measurement of output and impact is premised on the adage "give a man a fish and you feed him for a day; teach a man to fish and you feed him for a lifetime".

Academic librarians' involvement in conducting bibliometrics and altmetrics research will go a long way towards enhancing their knowledge and skills in applying bibliometrics and altmetrics methods, techniques and tools for research support. Several bibliometrics and altmetrics research papers have emanated from academic librarians, sometimes in collaboration with teaching staff, in Nigeria and South Africa (see, for example, Edewor 2013; Ocholla, Ocholla & Onyancha 2013; Ani, Ngulube & Onyancha 2017; Ezema & Onyancha 2017). This trend may explain Nigeria's recent good standing in LIS research output as indexed in Scopus. The strength of academic librarians in the areas of bibliometrics and altmetrics makes them potential high producers of knowledge in this area.

In the short term, academic librarians in sub-Saharan Africa should develop LibGuides that specifically provide researchers with information on various aspects of research impact. These guides should cover the concept of impact, measuring impact, maximising research impact, and upcoming programmes or workshops on impact (see, for example, UCI Libraries 2018).

Furthermore, academic librarians may consider establishing an office or section within the library to deal with the application of bibliometrics and altmetrics in research evaluation. The office would conduct activities such as training researchers on various aspects of maximising research impact, avenues of research dissemination and awareness of predatory and hijacked journals. Other training focus areas would include those listed in Appendix B.

Finally, various authors, such as Rice-Lively and Racine (1997), Brewerton (2012), and Cox, Verbaan and Sen (2012), have strongly advocated for the continuous up-skilling or re-skilling of academic librarians. For instance, Rice-Lively and Racine (1997: 35) have argued that "the changing nature of the environment demands a commitment to lifelong learning with academic librarians being more self-directed and self-motivated to develop new skills that will enable the fullest use of new technology and resources". Doskatsch's (2007: 460) observations regarding the conduct of "regular and frequent reviews of staffing requirements and the roles of professional librarians" call for librarians to improve their skills, knowledge and expertise to offer effective and appropriate support services to researchers. This training should extend to bibliometrics and altmetrics and may cover the aspects highlighted in Appendix B and more. The training can take the form of formal educational programmes (see the content proposed by Schrader 1981), short learning programmes or continuous development programmes. While formal programmes are geared towards librarians attaining a formal qualification, the main objective of short learning and continuous development programmes is to offer opportunities in the non-formal curriculum to enhance academic librarians' competencies, skills and expertise in various subject fields.

 

6 Conclusion

As the nature of research continues to evolve, so must the role of academic libraries in research support (Auckland 2012). The research and information needs of faculties are many, varied and dynamic, requiring a firm understanding of them on the part of academic librarians. Equally varied and evolving are research metrics, which have become increasingly important tools in measuring the value of research in terms of output and impact. Therefore, it follows that appropriate mechanisms and strategies are needed to address the needs of sub-Saharan Africa's researchers in terms of bibliometrics and altmetrics data. As a result, academic librarians in sub-Saharan Africa, as well as their counterparts in the rest of the world, "require a broad overview of researchers' needs across disciplines and the scope to design new services for researchers based on the changing landscape" (Parker 2012 cited in Namuleme & Kanzira 2015: 31). The LIS schools in sub-Saharan Africa are called upon to partner with academic libraries to develop and execute programmes in bibliometrics and altmetrics as well as research evaluation to support different stakeholders interested in the assessment of research output and impact. It is an indisputable fact that academic librarians in sub-Saharan Africa have a big role to play in support for research and researchers. Their active participation in supporting research will result in several benefits such as improved research practices, better informed researchers and increased visibility of research, improved institutional understanding of information assets, better research management, improved coordination of research activities, and a good reputation as an institution for research (Kostos 2016). 
Kostos (2016) further stated that the benefits should include increased potential readership of research, more research income, high quality of research, recruitment and retention of high-quality researchers, more efficient research, more motivated and satisfied researchers, and high research outputs and impacts.

In conclusion, how should academic librarians navigate the rising metrics tide? Academic librarians should be familiar with the most commonly applied metrics, methodologies and tools in bibliometrics, cybermetrics, webometrics and altmetrics. They should also be able to apply each of them depending on the type and nature of the request placed by interested parties, be they academic staff, research administrators, university administrators or grant offices. For instance, a request from a university administrator regarding the webometrics "standing" (ranking) of a university may require the academic librarian to provide such metrics as the overall rank in the geopolitical zone within which the university is located, rank in the world, presence rank, visibility/impact rank, transparency/openness rank and excellence rank. One may go further and provide the specific metrics that led to the ranking of the university in question, for example, the number of citations in Google Scholar (used to measure transparency/openness) and backlinks using Ahrefs and Majestic (or any other sources of data). If the request is about the citation performance of an article, then the most appropriate metrics would be article-level metrics, which could include citation counts, citations per year, citations per author (where co-authored), and altmetrics such as the number of downloads, views, Facebook posts, tweets and retweets, saves, news mentions and readers.
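The article-level ratios just mentioned (citations per year, citations per author) are simple computations over figures a librarian can read off a citation index. A minimal sketch, using hypothetical values and an illustrative (not standardised) function name:

```python
from datetime import date


def article_level_metrics(total_citations, publication_year, n_authors,
                          current_year=None):
    """Derive simple article-level indicators from raw counts.

    The inputs are hypothetical values a librarian would pull from a
    citation index; the formulas are the straightforward ratios the
    text describes, not any index's official definitions.
    """
    current_year = current_year or date.today().year
    years = max(current_year - publication_year, 1)  # avoid division by zero
    return {
        "citations": total_citations,
        "citations_per_year": round(total_citations / years, 2),
        "citations_per_author": round(total_citations / n_authors, 2),
    }


metrics = article_level_metrics(total_citations=48, publication_year=2012,
                                n_authors=3, current_year=2018)
print(metrics)
# {'citations': 48, 'citations_per_year': 8.0, 'citations_per_author': 16.0}
```

Altmetric counts (downloads, tweets, Mendeley readers) would simply be further key-value pairs pulled from the relevant altmetrics provider.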

 

References

Abubakar, B.M. 2011. Academic libraries in Nigeria in the 21st century. Library Philosophy and Practice (e-journal), Paper 446: 1-5. [Online]. http://digitalcommons.unl.edu/libphilprac/446 (2 June 2017).

Alzahrani, S. 2010. The role of editorial boards of scholarly journals on the green and the gold road to open access. DPhil thesis. University of British Columbia.

Andras, P. 2011. Research: metrics, quality, and management implications. Research Evaluation, 20(2): 90-106.

Andrews, A., Downs, A., Morris-Knower, J., Pacion, K. and Wright, S.E. 2016. From 'library as place' to 'library as platform': redesigning the 21st century academic library. In The future of library space. S. Schmehl Hines and K. Moore Crowe, Eds. Bingley, West Yorkshire: Emerald. 145-167. DOI:10.1108/S0732-067120160000036006.

Ani, E.O., Ngulube, P. and Onyancha, O.B. 2017. A study of productivity of academic staff in selected Nigerian universities. Nigerian Libraries: Journal of the Nigerian Library Association, 47(2): 77-94.

Auckland, M. 2012. Re-skilling for research: an investigation into the role and skills of subject and liaison librarians required to effectively support evolving information needs of researchers. Stoke-on-Trent: RLUK. [Online]. http://www.rluk.ac.uk/wp-content/uploads/2014/02/RLUK-Re-skilling.pdf (4 June 2017).

Beamish, P.W. 2006. Publishing international (joint venture) research for impact. Asia Pacific Journal of Management, 23(1): 29-46.

Borgman, C.L. 2010. Research data: who will share what, with whom, when, and why? Paper presented at the Fifth China-North American Library Conference. 8-12 September. Beijing. [Online]. http://sydney.edu.au/research/data_policy/resources/ANDS_Borgman_2010_research_data.pdf (21 May 2015).

Brewerton, A. 2012. Re-skilling for research: investigating the needs of researchers and how library staff can best support them. New Review of Academic Librarianship, 18(1): 96-110.

Corrall, S., Kennan, M.A. and Afzal, W. 2013. Bibliometrics and research data management services: emerging trends in library support for research. Library Trends, 61(3): 636-674.

Council for the Development of Social Science Research in Africa (CODESRIA). 2016. About CODESRIA African Citation Index. Dakar: CODESRIA. [Online]. https://www.codesria.org/spip.php?article2669 (7 November 2018).

Cox, A., Verbaan, E. and Sen, B. 2012. Upskilling liaison librarians for research data management. Ariadne, 30 November. [Online]. http://www.ariadne.ac.uk/issue70/cox-et-al?utm_source=rss&utm_medium=rss&utm_campaign=upskilling-liaison-librarians-for-research-data-management-2 (19 June 2017).

Daland, H.D. and Hidle, K.M.W. 2016. New roles for research librarians: meeting the expectations for research support. Amsterdam: Chandos Publishing.

Deegan, M. and Tanner, S. 2002. Digital futures: strategies for the information age. London: Library Association Publishing.

Delasalle, J. 2011. Research evaluation: bibliometrics and the librarian. SCONUL Focus, 53: 15-19.

Delasalle, J. 2017. Bibliometrics and the academic librarian. A librarian abroad. 29 May. [Online]. https://jennydelasalle.wordpress.com/2017/05/29/bibliometrics-and-the-librarian/ (4 July 2017).

Doskatsch, I. 2007. From flying solo to playing as a team: evolution of academic library services teams at the University of South Australia. Library Management, 28(8/9): 460-473.

Edewor, N. 2013. An analysis of a Nigerian Library and Information Science journal: a bibliometric analysis. Library Philosophy and Practice (e-journal), 1004. [Online]. http://digitalcommons.unl.edu/libphilprac/1004 (4 June 2017).

Ezema, J.I. and Onyancha, O.B. 2017. Citation impact of health and medical journals in Africa: does open accessibility matter? The Electronic Library, 35(5): 934-952. DOI:10.1108/EL-11-2016-0245.

Goetsch, L.A. 2008. Reinventing our work: new and emerging roles for academic librarians. Journal of Library Administration, 48(2): 157-172.

Gumpenberger, C., Wieland, M. and Gorraiz, J. 2012. Bibliometrics practices and activities at the University of Vienna. Library Management, 33(3): 174-183.

Harnad, S. 2001. The self-archiving initiative: freeing the refereed research literature online. Nature, 410: 1024-1025.

Harnad, S., Carr, L. and Brody, T. 2001. How and why to free all refereed research from access- and impact-barriers online, now. High Energy Physics Libraries Webzine. [Online]. http://cogprints.org/1640/1/science.htm (12 October 2015).

Hollister, C.V. and Coe, J. 2003. Current trends vs. traditional models. College & Undergraduate Libraries, 10(2): 49-63.

Holmberg, K. 2016. Altmetrics for information professionals: past, present and future. Amsterdam: Elsevier.

Housewright, R., Schonfeld, R. and Wulfson, K. 2013. UK survey of academics. [Online]. http://www.rluk.ac.uk/wp-content/uploads/2014/02/UK_Survey_of_Academics_2012_FINAL.pdf (2 June 2017).

Ingwersen, P. 1998. The calculation of web impact factors. Journal of Documentation, 54(2): 236-243.

Ishola, B.C. 2014. Funding problems in Nigerian university libraries: fee based library and information services to the rescue, focus on pricing policy. Library Philosophy and Practice (e-journal), 1176. [Online]. http://digitalcommons.unl.edu/libphilprac/1176 (2 June 2017).

Jaguszewski, J.M. and Williams, K. 2013. New roles for new times: transforming liaison roles in research libraries. Washington, DC: Association of Research Libraries.

Kebede, G. 2002. The changing information needs of users in electronic information environments. The Electronic Library, 20(1): 14-21.

Kennan, M.A., Corrall, S. and Afzal, W. 2014. "Making space" in practice and education: research support services in academic libraries. Library Management, 35(8/9): 666-683.

Klobas, J.E. 1999. Networked information resources: electronic opportunities for users and libraries. OCLC Systems, 15(2): 10-11.

Kostos, D. 2016. The role of academic librarians in research. [Online]. http://www.jove.com/blog/2016/08/03/the-role-of-academic-librarians-in-research (15 June 2017).

Library Journal and Gale CENGAGE Learning. 2015. Bridging the librarian-faculty gap in the academic library. [Online]. https://s3.amazonaws.com/WebVault/surveys/LJ_AcademicLibrarySurvey2015_results.pdf (19 June 2017).

Mamtora, J. 2013. Transforming library research services: towards a collaborative partnership. Library Management, 34(4/5): 352-371.

Melchionda, G.M. 2007. Librarians in the age of the internet: their attitude and their roles. New Library World, 108(3/4): 123-140.

Namuleme, R.K. and Kanzira, A.N. 2015. Research support services in academic libraries in Uganda. In: The quest for a deeper meaning of research support. R. Raju, A. Adam, G. Johnson, C. Miller, and J. Pietersen, Eds. Cape Town: University of Cape Town Libraries. DOI:10.15641/0-7992-2526-6.

Noruzi, A. 2006. The web impact factor: a critical review. The Electronic Library, 24(4): 490-500.

Obegi, F.M. and Nyamboga, C.M. 2011. The changing and evolving roles of information professionals in the digital age. Paper presented at the KLA Annual Conference, Mombasa, Kenya. [Online]. https://erepository.mku.ac.ke/handle/123456789/3863 (23 May 2017).

Ocholla, D.N., Ocholla, L. and Onyancha, O.B. 2013. Insight into research publication output of academic librarians in Southern African public universities from 2002 to 2011. African Journal of Libraries, Archives & Information Science, 23(1): 5-22.

Olivia, C. 2007. The role of librarians: past and future. D-Lib Magazine, 15(6): 22-25.

Parker, R. 2012. What the library did next: strengthening our visibility in research support. Paper presented at VALA2012 16th Biennial Conference. 6-9 February 2012. Melbourne, Australia. [Online]. https://www.vala.org.au/vala2012-proceedings (25 May 2015).

Priem, J., Groth, P. and Taraborelli, D. 2012. The altmetrics collection. PLoS ONE, 7(11): e48753. DOI:10.1371/journal.pone.0048753.

Ravenscroft, J., Liakata, M., Clare, A. and Duma, D. 2017. Measuring scientific impact beyond academia: an assessment of existing impact metrics and proposed improvements. PLoS ONE, 12(3): 1-21. DOI:10.1371/journal.pone.0173152.

Rice-Lively, M.L. and Racine, J.D. 1997. The role of academic librarians in the era of information technology. The Journal of Academic Librarianship, 23(1): 31-41.

Roemer, R.C. and Borchardt, R. 2015a. Meaningful metrics: a 21st-century librarian's guide to bibliometrics, altmetrics, and research impact. Chicago, IL: Association of College and Research Libraries.

Roemer, R.C. and Borchardt, R. 2015b. Altmetrics and the role of librarians. Library Technology Report, 51(5): 31-37.

Roemer, R.C. and Borchardt, R. 2015c. Major altmetrics tools. Library Technology Report, 51(5): 11-19.

Schouten, T. 2010. R&D 1: quality and metrics: how to measure and predict software engineering. [Online]. http://www.cs.ru.nl/~ths/sdm1/theo2010/Metrics.ppt (24 May 2017).

Schrader, A.M. 1981. Teaching bibliometrics. Library Trends, 30(1): 151-172.

Sidorko, P.E. and Yang, T.T. 2008. Refocusing for the future: meeting user expectations in a digital age. Library Management, 30(1/2): 6-24.

Tancheva, K., Gessner, G.C., Tang, N., Eldermire, E., Furnas, H., Branchini, D., Steinhart, G. and Foster, N.F. 2016. A day in the life of a (serious) researcher: envisioning the future of the research library. New York: Ithaka. [Online]. http://www.sr.ithaka.org/wp-content/uploads/2016/03/SR_Report_Day_in_the_Life_Researcher030816.pdf (24 May 2017).

Thelwall, M. 2004. Link analysis: an information science approach. London: Elsevier Academic Press.

Thelwall, M. 2009. Introduction to webometrics: quantitative web research for the social sciences. Williston: Morgan & Claypool.

Thelwall, M. 2010. Webometrics. In Encyclopedia of Library and Information Science. 3rd ed. 5634-5643. DOI:10.1081/E-ELIS3-120044480.

Thelwall, M. 2014. Five recommendations for using alternative metrics in the future UK Research Excellence Framework. LSE blog. 23 October. [Online]. http://blogs.lse.ac.uk/impactofsocialsciences/2014/10/23/alternative-metrics-future-uk-research-excellence-framework-thelwall/ (2 June 2017).

UCI Libraries. 2018. Research impacts using citation metrics: Home. [Online]. https://guides.lib.uci.edu/researchimpact-metrics (7 November 2018).

Wilsdon, J. and Belfiore, E. 2015. The metric tide: report of the independent review of the role of metrics in research assessment and management. Bristol: HEFCE. DOI:10.13140/RG.2.1.4929.1363.

Wolff, C. 2016. Canadian Association of Research Libraries faculty survey: executive summary of findings. New York: Ithaka.

Zenke, P. 2012. The future of academic libraries: an interview with Steven Bell. Education Futures. [Online]. https://educationfutures.com/blog/2012/03/the-future-of-academic-libraries-an-interview-with-steven-j-bell/ (19 July 2017).

 

 

Received: 30 May 2018
Accepted: 17 November 2018

 

 

Appendix A

Job description of the Bibliometrics and Research Impact Librarian, University of Waterloo

Bibliometrics and Research Impact Librarian (13 month contract) University of Waterloo Library

The University of Waterloo Library seeks a creative and collegial librarian for the position of Bibliometrics and Research Impact Librarian (contract). Reporting to the Associate University Librarian, Research & Digital Discovery Services, the incumbent is accountable for developing, advancing, implementing and supporting a range of services around scholarly output, research impact and bibliometrics for the University of Waterloo community. Supporting the "Transformational Research" campus Strategic Theme and the "Advancing Research and Scholarship" Library strategic direction, the incumbent works in close collaboration with staff in Institutional Analysis and Planning (IAP), the Office of Research (OR), and colleagues within the Library to develop and support services around measurement of research impact for the campus.

The University of Waterloo Library is a member of the Canadian Association of Research Libraries (CARL), the Association of Research Libraries (ARL), the Coalition for Networked Information (CNI), and the Ontario Council of University Libraries (OCUL). The Library is a member of the Tri-University Group (TUG), along with member libraries at the University of Guelph and Wilfrid Laurier University, serving a combined student population of over 76,000, combined holdings exceeding 5 million, and combined budgets exceeding $45 million. The Tri-University Group manages shared technology and a remote storage facility, and enables collaborative projects across the three library systems.

Qualifications:

Education: An ALA-accredited Master of Library & Information Science degree, or equivalent

Experience:

Demonstrated experience with different bibliometric data sources having citation tracking capabilities (e.g. Web of Science, Scopus, Google Scholar, altmetrics, patent databases)

Knowledge of and/or experience with bibliometric software such as InCites and SciVal

Demonstrated experience with mathematical and statistical methods associated with bibliometrics - see Technical section, below

Excellent communication and diplomacy skills: ability to converse articulately and persuasively with University administrators and faculty members (such as Bibliometrics Working Group members, deans, chairs of department, faculty members and staff in faculties responsible for research impact measurement and bibliometrics)

Ability to take initiative, create opportunities, and develop effective partnerships

Able to work independently and as part of a team

Excellent written and verbal skills for teaching, troubleshooting, procedural documentation and demonstrated experience with report writing

Demonstrated experience with instructional development and delivery

Knowledge of issues related to researcher identification and experience using researcher profiling systems (e.g. ORCID, ResearcherID, etc.)

Data management, manipulation, interpretation and analysis skills

Information analysis and synthesis experience

Project management experience

Reference management software expertise in tools such as RefWorks

"cennica :

Comprehensive knowledge of bibliometrics and other research impact measures

Depth - from the metrics and issues of individual researchers up to and including institutional level metrics

Breadth - knowledge of bibliometrics and other research impact measures and how they impact or are effective/non-effective for all disciplines and sub-disciplines

Understanding of the issues related to appropriate use of bibliometric tools, an awareness of the potential misuse of these tools and misinterpretation of the results

Demonstrated knowledge of basic and derivative metrics (publication counts, citation counts, h-index, journal or discipline normalized metrics, other index metrics)

Knowledge of other methods or approaches to measuring research impact in fields in which standard STEM bibliometrics are not a good fit

Good understanding of citation analysis, network analysis, science mapping and societal impact of research (rationales & methodologies & limitations)

Conversant with issues related to the quantification of research impact

 

Appendix B

Possible areas of library instruction as well as short learning programmes in bibliometrics and altmetrics for librarians and researchers

Sources of informetrics data and search strategies (e.g. sources of informetrics data such as bibliographic databases [e.g. EBSCOhost databases]); citation indexes (e.g. African Citation Index, Web of Science, Scopus, Google Scholar, SciELO); sources of altmetrics data (e.g. ImpactStory, Academia.edu, ResearchGate, Altmetric.com, CiteULike, Mendeley, institutional repositories); social media (e.g. Facebook, Twitter, YouTube, Goodreads).

Most commonly used research output and impact metrics (see the section on metrics used to evaluate and measure research output and impact) and strategies to maximise research impact

Computer-aided software used to analyse and visualise data in informetrics (software used to obtain frequency of occurrence, e.g. BibExcel, TextStat, Microsoft Office Excel); mapping and visualising scholarly communication (e.g. CiteSpace, VOSviewer, Pajek, NetDraw, UCINET, Word Cloud); visualising webometrics data (e.g. SocSciBot); Google citation data extraction software (e.g. Publish or Perish)

Interpreting and validating research metrics and informetrics findings for decision-making and policy-formulation processes in different contexts (e.g. universities, libraries, governments)

How informetrics is applied in different contexts (e.g. research evaluation, research support, selective dissemination of information [alerts] and collection development in libraries, funding and grant seeking, ranking systems, etc.)

Scholarly communication past, present and future (including the sociology of science; open access movement; online publishing and consumption of research, e.g. through blogs and social networking sites)

Statistical literacy: basic statistics for librarians such as descriptive statistics, which are commonly used in bibliometrics and altmetrics studies, correlational statistics as applied in different contexts, interpretation of statistical studies (e.g. indicators most interesting to researchers and other stakeholders, when the statistics were compiled, periodicity of statistics, accessibility of statistics); social network analysis metrics (e.g. centrality, density, proximity, distance measures)

Introduction to informetrics: bibliometrics, scientometrics, webometrics, cybermetrics and altmetrics; descriptive and evaluative informetrics; classic informetrics laws; article-level and author-level metrics; journal-based measurements; science and technology indicators; impact indicators and their meaning for researchers and other stakeholders

Designing and conducting informetrics analyses or studies (step-by-step procedures and design); identification of the problem; selection of items and units of analysis; choice of the appropriate informetrics method or technique - publications count or citation/altmetrics analysis; data collection procedures and methods; data analysis tools; selection of appropriate data presentation methods; etc.

Global ranking systems (Academic Ranking of World Universities [Shanghai], Centre for World University Ranking, Professional Ranking of World Universities, QS World University Rankings, Reuters World's Top 100 Innovative Universities, SCImago Institutions Rankings, Times Higher Education World Reputation Rankings, Times Higher Education [THE] Ranking of World Universities, and Webometrics Ranking of World Universities) and indicators used in ranking world universities
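The statistical literacy area above mentions correlational statistics as applied in bibliometrics and altmetrics studies. Spearman's rank correlation is commonly used in such studies because citation and altmetric counts are highly skewed. A minimal, dependency-free sketch using hypothetical per-article counts (the data and variable names are illustrative only):

```python
def ranks(values):
    """Assign 1-based ranks, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r


def spearman(x, y):
    """Spearman's rho: the Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)


citations = [12, 3, 45, 7, 0, 21]   # hypothetical per-article citation counts
tweets    = [30, 5, 60, 10, 2, 25]  # hypothetical per-article tweet counts
print(round(spearman(citations, tweets), 3))  # 0.943: strong rank agreement
```

In practice a librarian would use a statistical package rather than hand-rolled code, but working through the ranks once helps in interpreting (and explaining to researchers) what a reported rho actually measures.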
