
South African Journal of Science

On-line version ISSN 1996-7489
Print version ISSN 0038-2353

S. Afr. J. Sci. vol.118 no.1-2 Pretoria Jan./Feb. 2022

http://dx.doi.org/10.17159/sajs.2022/12260 

PERSPECTIVE

 

NRF ratings and h-index for engineers: Are we missing the point?

 

 

Charles J. MacRobert; Theo J. Stergianos

Department of Civil Engineering, Stellenbosch University, Stellenbosch, South Africa


 

 


ABSTRACT

Significance:

For the few rated engineers, NRF ratings show a close correlation with h-index.

Publication of journal articles and the citation thereof may be poor indicators of engineering research impact.

Keywords: bibliometrics, Hirsch's h-index, NRF rating, research administration


 

 

Introduction

Polanyi1 showed how scientific progress relies on a system of 'mutual control', or simply the means by which 'scientists keep watch over each other' to prevent resources being grossly wasted. In our present context, two ways in which this 'mutual control' is exercised are through peer-reviewed ratings from the South African National Research Foundation (NRF) and Hirsch's bibliometric h-index.2

Application for an NRF rating requires individuals to prepare portfolios. These portfolios are peer reviewed and individuals are given an A-rating (leading international researchers), B-rating (internationally acclaimed researchers) or C-rating (established researchers). The subsidiary ratings for young researchers are not considered here. Whilst guidelines for what can be included in these portfolios have been tailored to specific disciplines, emphasis across all disciplines is placed on peer-reviewed journal papers.3 Other outputs (e.g. confidential reports of applied research for industry in engineering) require considerably more justification and are therefore assessed with greater subjectivity. Nevertheless, this subjectivity is perhaps the main advantage of the NRF-rating system as reviewers can assess individuals holistically and in context.4

Hirsch's h-index (where h is the number of publications each cited at least h times) is significantly easier to obtain as it is based purely on bibliometrics. The h-index is advantageous as it measures the impact of a body of work better than relying on total publications or total citations. Although h-indices are based on a formula, they can differ based on the database used. For instance, Google Scholar h-indices tend to be higher than Scopus h-indices, as the former's database is generated by an algorithm scouring the web, whereas the latter is based on a smaller curated database. The main disadvantage of the h-index is that it is a metric that does not consider the context of a researcher, which can both unduly inflate an individual's h-index and fail to capture an individual's impact.4
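Hirsch's definition can be sketched in a few lines of Python (an illustrative function, not part of the study's tooling): sort citation counts in descending order and find the largest rank h at which a paper still has at least h citations.

```python
def h_index(citations):
    """Hirsch's h-index: the largest h such that the researcher has
    h publications each cited at least h times."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # the top `rank` papers each have >= rank citations
        else:
            break
    return h

# Example: five papers with these citation counts; three papers
# are each cited at least three times, so h = 3.
print(h_index([25, 8, 5, 3, 0]))  # → 3
```

Note that the result depends entirely on the citation counts fed in, which is why, as noted above, Google Scholar and Scopus can return different h-indices for the same individual.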

Despite differences, it is unsurprising that a relationship exists between the two metrics. Johnson4 compared 614 NRF-rated biological scientists to their respective h-indices in 2020 and found fairly distinct ranges of h-index associated with each NRF rating. In 2013, Fedderke5 also found a significant statistical relationship between NRF ratings and h-index for 1932 scholars across various disciplines, including engineering. Nevertheless, it is considered worthwhile reassessing this relationship for engineers as citation practices are rapidly changing4 and engineering faculties increasingly emphasise research, particularly in promotion criteria.

 

Method

Following the same methodology as Johnson4, the latest NRF ratings (23 March 2021) were downloaded from the NRF website (www.nrf.ac.za) and Scopus h-indices were obtained for individuals listing 'Engineering sciences' as one of their primary disciplines (accessed over 16/17 August 2021). Sub-disciplines were not considered, as it was questionable whether this would add significantly to the debate; however, future work may want to consider this aspect. Unlike in Johnson4, differences in h-index between NRF ratings were tested using a single-factor ANOVA test. Similarities between engineers and biologists were tested using two-sample t-tests assuming equal variance. Pearson correlation was used to test whether h-index increased with time from date of rating.
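For readers unfamiliar with the last of these tests, the Pearson correlation coefficient can be computed in a few lines of pure Python. The data below are hypothetical, for illustration only; they are not the study's data.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Covariance term divided by the product of the standard-deviation terms
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical example: years since rating vs. h-index for five researchers
years = [1, 3, 5, 8, 10]
h_vals = [12, 15, 14, 20, 19]
print(round(pearson_r(years, h_vals), 2))
```

An r near 0 indicates no linear relationship between h-index and time since rating; values like the r = 0.04 to 0.4 reported below indicate weak relationships.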

 

Results

The latest NRF ratings (23 March 2021) included 385 rated individuals with 'Engineering sciences' as one of their primary disciplines. In the 2020 study4, a total of 644 rated scientists listed 'Biological sciences' as a primary discipline. In the latest ratings, 'Biological sciences' researchers are the most rated (18%), followed closely by 'Social sciences' also at 18%; the rest of the top five are 'Humanities' at 16%, 'Health sciences' at 12% and Engineering at 9%.

Figure 1 depicts ranges of h-index for each of the three main NRF ratings for both biologists and engineers. It is clear that, similarly to biologists, h-indices for engineers fall into distinct ranges for each rating (F(2, 301) = 81, p<0.001). Johnson suggested h-index norms for biologists were about 5-20 for C-ratings, 20-40 for B-ratings and >40 for A-ratings.4 These h-index norms are shown in Figure 1 by green shading and the percentage of individuals falling within these norms is indicated above these bars. While the majority of engineers also fell within these norms, a number of A- and B-rated engineers fell below the suggested norms. Consequently, in all cases, the mean h-index, per rating, was statistically lower for engineers than for biologists (p <0.05 in all three cases). This difference was 19 for A-ratings, 5 for B-ratings and 4 for C-ratings.
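Johnson's suggested norms can be expressed as a simple classifier. This is a hypothetical sketch for illustration; the band boundaries follow the norms for biologists quoted above, and the treatment of the 20 and 40 boundaries is an assumption.

```python
def norm_band(h):
    """Classify an h-index into Johnson's suggested norms for biologists:
    roughly 5-20 for a C-rating, 20-40 for a B-rating, >40 for an A-rating."""
    if h > 40:
        return "A"
    if h >= 20:
        return "B"
    if h >= 5:
        return "C"
    return "below C norm"

print([norm_band(h) for h in [3, 12, 25, 50]])  # → ['below C norm', 'C', 'B', 'A']
```

The finding above is that many A- and B-rated engineers would fall into a lower band than their rating under these biologist-derived norms.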

Johnson4 reported significant differences in ή-index, per rating, with time since rating. However, our analysis of biologists showed a weak relationship between h-index and time since rating, with r = 0.2 for A-ratings, r = 0.04 for B-ratings and r = 0.2 for C-ratings. For engineers, the relationship between h-index, per rating, and time since rating was not significantly better, with r = 0.4 for A-ratings, r = 0.2 for B-ratings and r = 0.2 for C-ratings.

 

Discussion

A striking result was how few engineers were rated. Ascertaining whether this result reflects vastly different numbers of individuals in these fields is difficult. Estimating this from the number of graduates in each field is problematic. Firstly, disciplines are not similarly captured, and, secondly, scientists, whether studying humans or nature, are far more likely to do research compared to engineers who typically enter applied engineering fields. Considering the number of academics in each field is also problematic as scientific research is often not university-bound. Engineering faculties also focus on producing engineers and academics may not feel an equal need to create knowledge compared to those in more pure fields. While a more detailed study could be undertaken, we hope this discrepancy is not due to higher student to academic ratios in engineering, or fewer research-active engineers, or engineering research being inferior, but that it reflects fewer engineers applying for rating. This may change particularly as university promotion criteria increasingly prioritise research and academics seek to demonstrate their worth.

The low number of rated engineers raises the question: Are engineers missing out? NRF ratings are a means of allocating public resources to research and engineers may well be missing out. However, engineering research, being more applied, is more likely to obtain private funding as the commercial utility is clearer. Consequently, the question we rather focus on is: Are we missing the point? Returning to Polanyi, scientific worth is a function of three factors, which vary with domain: exactitude, systematic importance and intrinsic interest of the subject matter.1 While engineering may be deficient in the first two factors, it adequately makes up with 'intrinsic interest', especially for the layperson. Nevertheless, engineering researchers need to be kept in check, so we now consider whether an NRF rating or an h-index is the correct measure.

A key driver of engineering research is developing methods for practising engineers to solve problems.6 While dissemination of these methods is done in journals, dissemination is more often via conferences, standards, guidelines and trade magazines (so-called grey literature). While tracking grey literature has been attempted7-9, doing so is largely impractical. Tracking engineering research within real-world engineering is almost impossible as outputs (i.e. designs, reports and drawings) are 'invisible' to the public and rarely use standardised referencing conventions. Research-active engineers may therefore feel it is impossible to show research impact within the guidelines suggested for rating. The strong correlation between NRF ratings and h-indices will likely reinforce this perception.

As alluded to in the previous paragraph, the users of engineering research ought to be practising engineers. There is a growing body of work that shows that journal papers are a poor means of knowledge transfer between academia and the applied world, particularly within engineering.10,11 Practising engineers are also contributing less to journals - either by way of original research or by discussion - compared to decades past.12 Consulting co-workers and supervisors is favoured13, with some suggesting that it is only as problem complexity increases that journal articles are considered14. However, it is more likely that textbooks are referred to than original research papers. Increasingly, engineers are also turning to social media for information.15 Fraser et al.10 contend, and show for Australian engineers, that this lack of trust in journals is due to a perceived research-practice gap. That is, research by academics is perceived to have little or no practical relevance. Fraser et al.10 argue that the measure of engineering research impact should be the practical use of research. Interestingly, medical practitioners are shown to have a much higher view of journal publications10, suggesting medical researchers are much better at articulating the practical significance of their work.

Research-active engineers seemingly face a dilemma of demonstrating the worth of their research. Much engineering research ends up in grey literature, which is hard to index and track. It is also questionable whether high h-indices reflect practising engineers using cited research, given evidence that practising engineers rarely consider journals and the 'invisible' nature of engineering outputs. NRF ratings, in principle, seem the better metric as they take a more holistic view of engineers, but guidance documents emphasise peer-reviewed journal articles and in practice show close agreement to bibliometric patterns present in other disciplines.

Engineers strive to be pragmatic and propose solutions to all problems. In our view, research-active engineers should be able to demonstrate a close relationship with practising engineers. They should show how they are actively approached by industry to solve complex problems and show how they have solved these problems. Solving problems is fundamentally what research is all about.16 Complex problems are rarely tackled alone and teams of peers contribute and critique each other along the way. Engineers should demonstrate how they are striving to bring about knowledge transfer, not simply through traditional media (e.g. trade magazines, courses and discussion) but also through digital platforms. Altmetrics are increasingly able to track research impact across digital platforms.17 Peer-reviewed journal articles remain a key way to formalise research, but impact needs to be measured through more than just citations. Citations in textbooks, guidelines and standards should receive emphasis. Industry recognition through invited lectures, prizes and fellowship to voluntary associations should also be heavily weighted as these provide evidence of peer review and standing. In our view, basing an engineer's research impact on a few well-cited papers is a simplification of what is entailed in engineering research.

 

Acknowledgements

Chris James helped shape views expressed in this paper and reviewed a draft.

Competing interests

We have no competing interests to declare.

 

References

1. Polanyi M. The tacit dimension. Chicago, IL: University of Chicago Press; 2009.         [ Links ]

2. Hirsch JE. An index to quantify an individual's scientific research output. Proc Natl Acad Sci USA. 2005;102(46):16569-16572. https://doi.org/10.1073/pnas.0507655102        [ Links ]

3. South African National Research Foundation. Key research areas and types of research outputs [document on the Internet]. c2020 [cited 2021 Sep 01]. Available from: https://www.nrf.ac.za/sites/default/files/documents/11_Key%20Research%20Areas%20and%20Types%20of%20Research%20Outputs_Rating%20Call%202021_Sept%202020.pdf        [ Links ]

4. Johnson SD. Peer review versus the h-index for evaluation of individual researchers in the biological sciences. S Afr J Sci. 2020;116(9/10), Art. #8700. https://doi.org/10.17159/sajs.2020/8700        [ Links ]

5. Fedderke JW. The objectivity of national research foundation peer review in South Africa assessed against bibliometric indexes. Scientometrics. 2013;97(2):177-206. https://doi.org/10.1007/s11192-013-0981-0        [ Links ]

6. Poulos HG. Simplicity - a desirable end-point of geotechnical research. Ground Engineering. 1982;15(7).         [ Links ]

7. Cooper K, Marsolek W, Riegelman A, Farrell S, Kelly J. Grey literature: Use, creation, and citation habits of faculty researchers across disciplines. J Librariansh Inf Sci. 2019;7(1), eP2314. https://doi.org/10.7710/2162-3309.2314        [ Links ]

8. Musser L. Preserving the digital record of science and engineering: The challenge of new forms of grey literature. Issues Sci Technol Librariansh. 2016;83. https://doi.org/10.5062/F4251G69        [ Links ]

9. McMinn HS, Fleming K. Tracking the use of engineering conference papers: Citation influence of the Stapp Car Crash Conference. Collection Building. 2011;30(2):76-85. https://doi.org/10.1108/01604951111127443        [ Links ]

10. Fraser K, Tseng T-LB, Deng X. The ongoing education of engineering practitioners: How do they perceive the usefulness of academic research? Eur J Eng Educ. 2018;43(6):860-878. https://doi.org/10.1080/03043797.2018.1450847        [ Links ]

11. Fothergill A. Knowledge transfer between researchers and practitioners. Nat Hazards Rev. 2000;1(2):91-98. https://doi.org/10.1061/(ASCE)1527-6988(2000)1:2(91)        [ Links ]

12. James CS. Editorial. Proc Inst Civil Eng-Wat Manag. 2010;163(5):217-218. https://doi.org/10.1680/wama.2010.163.5.217        [ Links ]

13. Anderson CJ, Glassman M, McAfee R, Pinelli T. An investigation of factors affecting how engineers and scientists seek information. J Eng Technol Manag. 2001;18(2):131-155. https://doi.org/10.1016/S0923-4748(01)00032-7        [ Links ]

14. Leckie GJ, Pettigrew KE, Sylvain C. Modeling the information seeking of professionals: A general model derived from research on engineers, health care professionals, and lawyers. Libr Q. 1996;66(2):161-193. https://www.jstor.org/stable/4309109        [ Links ]

15. Murphy G, Salomone S. Using social media to facilitate knowledge transfer in complex engineering environments: A primer for educators. Eur J Eng Educ. 2013;38(1):70-84. https://doi.org/10.1080/03043797.2012.742871        [ Links ]

16. Cross H. Engineers and ivory towers. New York: McGraw-Hill; 1952.         [ Links ]

17. Wikipedia. Altmetrics [webpage on the Internet]. No date [cited 2021 Sep 01]. Available from: https://en.wikipedia.org/wiki/Altmetrics        [ Links ]

 

 

Correspondence:
Charles MacRobert
Email: macrobert@sun.ac.za

PUBLISHED: 27 January 2022

Creative Commons License All the content of this journal, except where otherwise noted, is licensed under a Creative Commons License