
South African Journal of Science

On-line version ISSN 1996-7489
Print version ISSN 0038-2353

S. Afr. j. sci. vol.111 n.9-10 Pretoria Sep./Oct. 2015

http://dx.doi.org/10.17159/SAJS.2015/A0121 

COMMENTARY

 

South African scholars make Thomson Reuters 'Highly Cited Researchers 2014'

 

 

Makia L. Diko

School of Physical and Mineral Sciences, University of Limpopo, Polokwane, South Africa


 

 

Evaluating individual and institutional scientific performance is an essential component of research assessment, and the outcomes of such evaluations play a key role in institutional research strategies, including funding schemes, staffing and international recognition.1 In a recent communication entitled 'Global trends and opportunities for development of African research universities', featured in the January/February 2015 issue of the South African Journal of Science, Slippers et al.2 observed that:

The demand for demonstrating the relevance and impact of research at higher education institutions is increasing at the same time, particularly in developing nations in which funders are becoming impatient with a perceived lack of results.

In light of the above assertion, one may be forced to ask what constitutes relevant research, or how research impact is assessed. Providing a succinct response to these questions seems to be a tall order. Indeed, the quest for comprehensive criteria to assess scholarly outputs such as research publications continues to dominate academic discourse across the globe.1,3-6 From the affluent 'North' to the developing 'South', the educational systems of several countries1,3-6, including South Africa7-10, continue to grapple with issues around the quality of research output. Efforts at evaluating productivity, scientific impact and research quality are compounded by the apparent lack of consensus, among all related stakeholders (e.g. governments, academic institutions and publication agencies) within and across educational systems, on an acceptable metric system. In addition, variability between metric systems (each with its own pros and cons)6-8 and the continuous emergence of new indices, coupled with inherent differences in socioeconomic and resource potentials between the 'North' and 'South', make the conceptualisation of a globally acceptable definition of 'relevance' and 'impact' of research even more complex. Despite these constraints, academic institutions from the 'South' (especially African universities), in search of best practice in research, are encouraged to benchmark against trends prevailing in the 'North'.11-13

In South Africa, research publications are one of the instruments used to monitor the performance of institutions of higher learning. As an incentive towards increasing research output, the Department of Higher Education and Training (DHET) - through the 'Policy for Measurement of Research Outputs of Public Higher Education Institutions (2003)' - awards subsidies to higher education institutions whose members publish in an approved list of South African journals. This list includes a significant cross-section of journals indexed in the Thomson Reuters Web of Science (formerly known as ISI). In December 2014, the Web of Science published a list of over 3000 researchers from across the world with the most cited publications over an 11-year period (2002-2012).14 This list featured ten researchers with affiliation to a South African university. The aim of this article is to celebrate the South African scholars who, together with their fellow listees, have been deemed the 'most influential scientific minds' by Web of Science14, on the basis of peer recognition through citations. This article also elucidates the global distribution of these researchers, as well as South Africa's performance within the global context. It concludes by interrogating the implications of this 'South African achievement' for the National Research Foundation (NRF) researcher rating system and the current DHET publication subsidy policy.

 

The selection procedure for highly cited researchers

The list of highly cited researchers was drawn from highly cited articles and reviews in science and social sciences journals indexed in the Web of Science Core Collection during the 11-year period 2002-2012. Highly cited papers were defined as those that rank in the top 1% by citations per field, based on data derived from Essential Science Indicators® (ESI). (For more detailed information on the analytical and selection procedures, consult the official website: www.highlycited.com)
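
For readers who want the selection logic spelled out, the short Python sketch below illustrates the kind of per-field, top-1%-by-citations cut described above. The paper records, field labels and the top_percent_by_field helper are illustrative assumptions for exposition only, not Thomson Reuters' actual procedure.

# Minimal sketch, assuming a simple per-field top-1%-by-citations rule as described above.
# All records below are fabricated placeholders for illustration.
from collections import defaultdict

papers = [
    # (paper_id, esi_field, citation_count)
    ("p1", "Environment/Ecology", 412),
    ("p2", "Environment/Ecology", 9),
    ("p3", "Social Sciences - General", 230),
]

def top_percent_by_field(records, percentile=0.01):
    """Return paper IDs in the top `percentile` by citation count within each field."""
    by_field = defaultdict(list)
    for pid, field, cites in records:
        by_field[field].append((cites, pid))
    highly_cited = set()
    for field, items in by_field.items():
        items.sort(reverse=True)                       # most-cited papers first
        cutoff = max(1, round(len(items) * percentile))
        highly_cited.update(pid for _, pid in items[:cutoff])
    return highly_cited

print(top_percent_by_field(papers))  # e.g. {'p1', 'p3'} for this toy input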

With respect to the current study, data analysis and interpretation were based on the total number of individual researchers (F1) as well as on counts that include researchers listed in more than one ESI category or country (F2).15 For the percentage distribution of researchers per country, weightings were computed on the basis of F1 (3073) and F2 (3215).
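
As a worked illustration of these weightings, the snippet below computes a country's share under both denominators. The country counts are those reported in the synopsis that follows; the script itself is only an assumed reconstruction of the arithmetic, not part of the original analysis.

# Worked example of the F1- and F2-based percentage weightings described above.
# F1 = distinct researchers; F2 = listings (researchers named in more than one
# ESI category or country are counted more than once).
F1_TOTAL = 3073
F2_TOTAL = 3215
country_counts = {
    "USA": 1667,
    "UK": 360,
    "Germany": 271,
    "South Africa": 11,   # listing count used in the BRICS comparison below
}
for country, n in country_counts.items():
    print(f"{country}: {n / F1_TOTAL:.2%} (F1 basis), {n / F2_TOTAL:.2%} (F2 basis)")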

 

Synopsis of findings

The 3073 individually cited researchers are drawn from 47 countries. A distribution of these researchers by country is presented in Table 1. The results reveal that the bulk of influential researchers are based in institutions in the United States of America (USA), with a total of 1667 researchers, followed at a distance by the United Kingdom (UK) with 360 and Germany with 271 in second and third place respectively. Saudi Arabia (174), China (161), France (140) and Japan (102) follow in fourth to seventh place respectively, while Canada (89), the Netherlands (82) and Switzerland (68) complete the top ten. South Africa is the sole representative from Africa, ranking 25th out of 47 countries. Based on the F2 criteria, North America contributed 48.8% of the most influential researchers, followed by Europe (46.89%), Asia (15.17%), Oceania (1.11%), Africa (0.3%) and South America (0.25%). In terms of performance relative to the BRICS countries (Brazil, Russia, India, China and South Africa), South Africa, with 11 highly cited researchers, ranked third behind China (161) and India (12), whereas Russia and Brazil had 7 and 5 respectively.

 

 

South Africa's most influential scientific minds

In terms of individual researchers, 10 researchers have an affiliation to a South African university (one of them is listed in two ESI categories). They comprise six South African-based scholars and four others with a secondary affiliation to a South African university (Table 2). The following section briefly presents the shortlist of six, comprising three listees in the Environment/Ecology category, two in Social Sciences - General, and one in the Biology and Biochemistry category. More elaborate bibliographic information is hosted on their respective institutional websites.

David M. Richardson is an A1 NRF-rated researcher16 and leading international scholar in the field of invasion biology. He is currently Director of the Department of Science and Technology (DST)/NRF Centre of Excellence in Invasion Biology at Stellenbosch University. He has published over 307 peer-reviewed articles in scientific journals and books, including chapters in 40 edited books. According to the Web of Science, his works have been cited 11 230 times.

Guy Midgley is a B1 NRF-rated researcher16 and internationally acknowledged expert in the field of biodiversity and global change science. He has published more than 160 articles and papers, of which four have been in the top academic journals Nature, Science, and Nature Climate Change. His academic works have been cited more than 12 000 times.

William J. Bond is a B3 NRF-rated researcher16 and Emeritus Professor in the Department of Botany at the University of Cape Town. His niche areas include: processes influencing vegetation change in the past and present, including fire, vertebrate herbivory, climate extremes, atmospheric CO2 and habitat fragmentation; plant-animal interactions; plant form and function; and biomes.

Lyn Wadley is an A2 NRF-rated researcher16 and Honorary Professor of Archaeology, affiliated jointly with the Archaeology Department (University of the Witwatersrand) and the Institute for Human Evolution. She is also the Director of Ancient Cognition and Culture in the Africa Research Unit at the University of the Witwatersrand. The group's research focuses on issues of cognition and culture in the Middle Stone Age of southern Africa.

Rachel K. Jewkes is an A2 NRF-rated researcher16 and Director of the Medical Research Council's Gender and Health Research Unit. She is Honorary Professor in the Faculty of Health Sciences, School of Public Health (University of the Witwatersrand). Her research focuses on the interface of gender inequity and gender-based violence and health, particularly HIV. She has authored well over 100 articles in peer-reviewed journals and over 20 book chapters.

Nicola J. Mulder is a B3 NRF-rated researcher16 and Head of the Computational Biology Group at the University of Cape Town. Her main research interests lie in the areas of infectious diseases and human genetics, with particular emphasis on the molecular biology of the pathogen Mycobacterium tuberculosis. Under her leadership, the Computational Biology Group generated over 30 publications in 2011.

 

Implications for South Africa

From a South African perspective, the implications of this recent achievement for NRF ratings and the DHET publication subsidy policy are interrogated here. Rather than pronouncing on right or wrong, this article seeks to generate questions that may encourage and sustain more rigorous debate around these topical issues. It is anticipated that this paper may serve as an impetus towards a more in-depth appraisal by all related stakeholders.

Implications on National Research Foundation rating

The six South African-based researchers have NRF ratings between A1 and B3.16 The NRF rating system is an international benchmarking process through which individuals who exemplify the highest standards of research, as well as those demonstrating strong potential as researchers, are identified by an extensive network of South African and international peer reviewers. Ratings are based on the quality and impact of recent research outputs (over an 8-year period).16 Taking into consideration the recent achievement of the six aforementioned researchers, the obvious questions would be:

  • Are their NRF ratings a true reflection of the quality and impact of their research outputs?
  • Is it possible for a researcher in the top 1% of their field globally to be classified as B rated?
  • How consistent is the rating system in terms of meeting its objective - to identify, encourage and celebrate research excellence through quality and impact of research output?
  • How significantly different are the phrases: 'all reviewers', 'overwhelming majority of reviewers' and 'most reviewers', as applied in the description of NRF categories?
  • Alternatively, does the NRF have a comparably more rigorous and reliable evaluation scheme that needs to be projected and adopted by the rest of the world?

Implications on the publication subsidy policy

What really matters: quality or quantity? The publication subsidy is an invaluable source of institutional support from government. In addition, the financial benefits to individual researchers cannot be overemphasised. However, beyond these direct benefits, several studies have suggested negative impacts of the current DHET publication subsidy policy on the quality of research output7-10 (notably with regard to journal articles). Although top-ranking, high-impact-factor journals form part of the accredited list, researchers are not compelled to publish in them. This is further exacerbated by the option of publishing in low-impact journals, often characterised by a comparatively less rigorous review process and a shorter turnaround period (i.e. from initial submission to publication).

In terms of the current DHET remuneration policy, emphasis is placed on units of publication - somewhat synonymous with quantity. For example, a journal article published by a single researcher affiliated to a South African higher education institution is worth one unit. Irrespective of the type of journal (high or low impact, local or international), the researcher is entitled to one unit and the associated financial gains. Where two or more authors with affiliations to the same or different South African institutions are involved, the single unit is shared among them. Similarly, where two South African-based researchers co-publish with two other researchers without affiliation to a South African university, the former are only entitled to 0.5 units (i.e. 0.25 each). Based on the above provision, Woodiwiss10 argued that international collaboration is seriously discouraged. Furthermore, Jeenah and Pouris8 posited that the quest for financial gain tends to encourage quantity at the expense of quality. According to Valerie Mizrahi17, in a lecture on 'The practice of research and publication in the South African context' during the University of Cape Town's Library Research Week (12 May 2014), the current DHET policy 'penalises collaboration' and is 'open to abuse as a numbers game'. In light of these challenges, the sole question one may pose is: Is it not time to introduce another variable, for example a Quality Index, into the calculation of research units and the corresponding financial reward? Such a variable may take into consideration key quality control elements such as the journal impact factor, journal ranking or the number of non-self-citations an article receives over the conventional 2-year cycle prior to the release of funds by the DHET. A toy illustration of this unit arithmetic is sketched below.
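
The following Python sketch makes the per-author unit split described above concrete and adds a hypothetical quality multiplier in the spirit of the Quality Index suggestion. The quality_index values are purely illustrative assumptions and do not correspond to any existing DHET weighting.

# Toy illustration of the DHET per-author unit split described in the text,
# extended with a hypothetical quality multiplier (not part of current policy).
def dhet_units(total_authors, sa_affiliated_authors, quality_index=1.0):
    """One publication unit split equally across all authors; only the shares of
    South African-affiliated authors are claimable, optionally scaled by a
    hypothetical quality multiplier."""
    per_author_share = 1.0 / total_authors
    return sa_affiliated_authors * per_author_share * quality_index

# Example from the text: two SA-based authors co-publish with two external authors.
print(dhet_units(total_authors=4, sa_affiliated_authors=2))     # -> 0.5 (0.25 each)
# With an assumed multiplier rewarding publication in a high-impact journal:
print(dhet_units(4, 2, quality_index=1.5))                      # -> 0.75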

In a nutshell, as we ponder these issues and more, let us continue to reflect on the sentiments of Salmi13, as captured in his book, The challenge of establishing world-class universities:

...institutions will inevitably, from here on out, be increasingly subject to comparisons and rankings, and those deemed to be the best in these rankings of research universities will continue to be considered the best in the world.

 

Acknowledgement

Data on the Highly Cited Researchers 2014 are freely available on the website www.highlycited.com. Excerpts of bibliographic data presented herein are courtesy of the researchers' institutional websites.

 

References

1. Sahel J. Quality versus quantity: Assessing individual research performance. Sci Transl Med. 2011;3(84cm13):1-4.

2. Slippers B, Vogel C, Fioramonti L. Global trends and opportunities for development of African research universities. S Afr J Sci. 2015;111(1/2), Art. #a0093, 4 pages. http://dx.doi.org/10.17159/sajs.2015/a0093

3. Paul JR. Measuring research quality: The United Kingdom government's research assessment exercise. Eur J Inf Syst. 2008;17:324-329. http://dx.doi.org/10.1057/ejis.2008.31

4. Schreiber M. Twenty Hirsch index variants and other indicators giving more or less preference to highly cited papers. Ann Phys. 2010;522:536-554. http://dx.doi.org/10.1002/andp.201000046

5. Derrick GE, Haynes A, Chapman S, Hall WD. The association between four citation metrics and peer rankings of research influence of Australian researchers in six fields of public health. PLoS ONE. 2011;6(4):e18521.

6. Dodson MV, Duarte M, Dias LA. SP-index: The measure of the scientific production of researchers. Biochem Biophys Res Commun. 2012;425:701-702. http://dx.doi.org/10.1016/j.bbrc.2012.07.161

7. Jacobs D. Analysis of scientific research in selected institutions in South Africa: A bibliometric study. SA Jnl Libs Info Sci. 2006;72(1):72-77.

8. Jeenah M, Pouris A. South African research in the context of Africa and globally. S Afr J Sci. 2008;104(9/10):351-354.

9. Schulze S. Academic research at a South African higher education institution: Quality issues. S Afr J Higher Educ. 2008;22(3):629-643.

10. Woodiwiss AJ. Publication subsidies: Challenges and dilemmas facing South African researchers. Cardiovasc J Afr. 2012;23(8):421-427.

11. Teferra D, Altbach PG. African higher education: Challenges for the 21st century. High Educ. 2004;47:21-50. http://dx.doi.org/10.1023/B:HIGH.0000009822.49980.30

12. Sawyerr A. African universities and the challenge of research capacity development. J Higher Educ Afr. 2004;2(1):211-240.

13. Salmi J. Directions in development - human development: The challenge of establishing world-class universities. Washington DC: The World Bank; 2009.

14. Thomson Reuters. Welcome to Highly Cited Researchers [webpage on the Internet]. c2014 [cited 2015 Mar 25]. Available from: http://highlycited.com

15. Bornmann L, Bauer J. Which of the world's institutions employ the most highly cited researchers? An analysis of the data from highlycited.com. J Assoc Inf Sci Technol. 2015 January 08. http://dx.doi.org/10.1002/asi.23396

16. National Research Foundation. NRF Rating [webpage on the Internet]. c2015 [cited 2015 Mar 25]. Available from: http://www.nrf.ac.za/rating

17. Mizrahi V. The practice of research and publication in the South African context. Lecture by Prof. Valerie Mizrahi, University of Cape Town, 2014 May 12. Available from: http://researchcommonsblog.uct.ac.za/wp-content/uploads/2014/05/Practice-research-publication-SA-context_12-05-14_mizrahi.pdf

 

 

Correspondence:
Makia Diko
School of Physical and Mineral
Sciences, University of Limpopo
Private Bag X1106
Sovenga 0727
South Africa
Email: dikom73@gmail.com
