African Journal of Health Professions Education

On-line version ISSN 2078-5127

Afr. J. Health Prof. Educ. (Online) vol.14 no.4 Pretoria Dec. 2022

http://dx.doi.org/10.7196/AJHPE.2022.v14i4.1594 

RESEARCH

 

Reporting quality of Master of Medicine (MMed) mini-dissertations using the STROBE checklist

 

 

E S Grossman

PhD; East London and Port Elizabeth Health Resource Centres, Faculty of Health Sciences, Walter Sisulu University, Mthatha, South Africa


 

 


ABSTRACT

BACKGROUND. The 2011 Health Professions Council of South Africa (HPCSA) directive to make a research component compulsory for specialist registration has been decried in some circles as encouraging low-quality research.
OBJECTIVE. To assess the reporting quality of South African (SA) MMed mini-dissertations using the STROBE checklist.
METHODS. A total of 100 monograph-format mini-dissertations reporting retrospective observational research were extracted from a pool of 335 mini-dissertations. Analysis of each was undertaken using a 24-point STROBE Statement checklist. Scoring was as follows: 1 = the item was compliant with STROBE recommendations; 0.5 = partially described; and 0 = not addressed at all. Satisfactory compliance was set at 66%, thus a STROBE score of 17-24 was considered satisfactory. Data were entered into an Excel spreadsheet and analysed descriptively.
RESULTS. STROBE item compliance for individual mini-dissertations was at a mean of 83.1%; range 50-97%; median 85% and mode 89%. Sixteen mini-dissertations were non-compliant, scoring below 17 as per the set threshold of 66%. This indicates an 84% satisfactory sample. Only Item 5 (Key settings and locations) was at 100% compliance. The four lowest scores were for STROBE items (9) Bias (29.5%); (10) Study size/power analysis (52%); (1) Title (69%) and (14) Missing data (69%).
CONCLUSION. The majority of sampled mini-dissertations, evaluated as per STROBE recommendations, are transparently reported, allowing the reader to follow what was planned, done and found, and which conclusions were drawn. As such, the results confer a measure of reporting quality on the SA MMed research endeavour. The use of dissertation templates, commonly using STROBE-type headings and prompts, might have contributed to the good scores obtained. Importantly, areas of weakness in the writing of the SA MMed mini-dissertations have been highlighted and show which items require attention.


 

 

The 2011 HPCSA directive to make a research component compulsory for specialist registration came at a critical time in the SA higher education environment. Since 1994, a cascade of policy-driven changes linked to the country's transformation, research and economic strategy, has resulted in pressures on universities to do more with less. Such pressures have had a knock-on effect on postgraduate training in general, which clinical specialist research training has not escaped.

Recent data[1] have shown that between 2010 and 2016, the number of health instructional and research staff declined from 2 901 to 2 485, translating to a loss of 500 full-time equivalent researchers. In terms of research capacity, the health sciences have the lowest proportion of staff with PhDs. Paradoxically, Faculty of Health Sciences (FHS) postgraduate numbers have risen, with a doubling of health sciences PhDs and a 60% increase in MSc graduates in the same time period.[1] The above has contributed towards stretching both FHS research resources and supervisory capacity.

When exploring the registrar research niche within the academic milieu, MMed research and research supervision was at a low base in 2011. For historical reasons, the FHS suffered a two-decade backlog of clinical researchers,[2] which could have contributed to a less than 10% MMed completion rate experienced prior to 2010.[3] By 2016, research-active MMed/MDent candidature rose to 50% of the FHS postgraduate cohort at a research-intensive university (S Benn, personal communication, March 2016) and MMed graduate numbers escalated from 359 in 2010 to 642 in 2018 (J Mouton, personal communication, June 2020). Graduate numbers are set to soar even higher with the latest HPCSA requirement now specifying an examined and passed MMed or a published accredited journal article, rather than the previous loosely worded 'research component', as the required research piece.[4]

What impact, then, do the decline in FHS research staff and ever-rising postgraduate numbers have on the MMed candidate embarking on the research component?

When implemented, neither the Colleges of Medicine of South Africa (CMSA) nor the HPCSA made any allowance for universities being unequally resourced to meet the 2011 MMed research requirement.[5] Aldous et al.[6] argue that when meeting the requirements of the HPCSA MMed mandate, three of the difficulties training universities encounter are: acquiring research supervisors who fulfil regulatory stipulations; providing sufficient research supervision time; and allocating ringfenced time for specialist trainees to do research and attend methods courses. Elsewhere, the local FHS postgraduate research supervision culture is described as being 'haphazard, impersonal, pressurized and mechanistic'.[7] Lastly, registrars felt that MMed research performance would benefit from improved research training and support.[8] Pertinent to the latter are the special needs unique to the MMed postgraduate which are often overlooked when providing research support: registrars are novice researchers; each candidate requires a clinically-based research project; they are time-poor with erratic windows of research opportunity; they have an overstretched clinical load and must deal with study and examination commitments during their specialist training. Given the challenges that the FHS currently face in terms of research capacity, it is unsurprising that some clinical academics allege that the introduction of a research requirement for specialist registration is counterproductive: leading to poorly conducted research, the teaching of inferior science,[9] polluting the research pool and low-quality research which undermines research reliability.[10] While these allegations appear emotive, they merit deeper investigation, bearing in mind the dominant policy discourses taking place in many jurisdictions across the globe on the pressures of massification on the quality of higher education.

Research 'quality' is an elusive notion and not easily captured in any quantitative metric, with the gold standard to assess quality in research being peer review.[1] While conversion of an MMed-type dissertation into a publication is widely considered a sign of research quality in countries as diverse as France,[11] Egypt,[12] Peru[13] and Turkey,[14] there is ample evidence to show that poor quality is not the only reason for accredited journals to reject a research submission. Reasons for manuscript rejection are numerous: the topic is not of interest to the research readership; the journal has recently published on the topic; the submission is unlikely to make a significant contribution to knowledge; a negative result; reviewer and editor bias or subjectivity; conflicts of interest; the significance of the research is not yet apparent; whether the research field is 'hot' or not; and geographical bias, among others.[15-17] Hence, publication as a validation of rigorous research is not a realistic option when assessing the worth of an MMed research project.

Fortunately, another measure of research quality is available, based on the way evidence is presented in the research study. The raison d'être of this measure is that the merit and potential impact of a scientific study can only be judged if the reader can determine exactly how the study was conducted and what was found. To this end, several reporting guidelines have been developed, depending on the study design used, and consolidated under the umbrella of the EQUATOR network (Enhancing the QUAlity and Transparency Of health Research), to improve transparency and accuracy of communicating medical research.[18] STROBE (STrengthening the Reporting of OBservational studies in Epidemiology) was developed in 2004 and is used as a guideline for reporting observational studies, specifically cohort, case-control and cross-sectional studies.[19] While the guidelines were developed primarily to evaluate journal publications,[20] STROBE has been used to assess the quality of Indian postgraduate medical degrees,[21] Chinese public health dissertations[22] and the methods section of Indian dental postgraduate dissertations.[23]

The aim of this investigation was to assess the reporting quality of SA MMed mini-dissertations using recommended STROBE guidelines to support or refute previous claims of 'shoddy' research.[9,10] This current investigation is based on two previous studies[21,22] which provide a validated method and a suitable comparison of results. To obtain sample uniformity, all selected mini-dissertations were limited to monograph format and were retrospective in focus. The latter requirement was set to overcome any confounders linked to cost.[12]

 

Methods

The data source for this current study consisted of 335 MMed mini-dissertations, collected for previous use.[4,24,25] Briefly, all mini-dissertations were downloaded from local and global electronic theses and dissertation databases (www.netd.ac.za; www.ndltd.org) and library repositories of the eight SA specialist training universities. From this pool, monograph-format mini-dissertations reporting observational research of retrospective study design were extracted, resulting in 100 mini-dissertations for analysis (Fig. 1). Dissertations were assessed using the STROBE Statement checklist of 22 items,[26] which, after extensive piloting, underwent minor adjustment for data collection as follows:

The title and abstract (STROBE 1) were separated to give a score for each of the two recommended items.

Variables and outcomes were split in STROBE 7 to isolate the two recommendations.

STROBE 14 was divided to give one score each for describing participants and mentioning missing data.

STROBE 22 on funding was excluded as being redundant to the MMed university degree.

 

 

Hence, no departure from STROBE statements occurred. Rather, three items were separated to independently record important subcomponents within the mini-dissertation. Scoring was as follows: 1 = the item was compliant with STROBE recommendations; 0.5 = partially described; and 0 = not addressed at all. Thus, 24 was the highest score a dissertation could obtain and 0 the lowest. Scoring consistency was ensured by closely following a guide which offers a detailed, exampled explanation for each STROBE item during assessment.[27] There was no penalty if an item did not appear in the precise location or order as reflected in the checklist,[20,27] and no piloted data were used in the final analysis.

Satisfactory compliance was set at 66%, as done elsewhere.[21] Therefore, a STROBE score of 17-24 was considered a satisfactory indicator of mini-dissertation reporting quality, while a score below 17 was considered unsatisfactory.
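
To make the scoring rubric concrete, the short Python sketch below encodes the 24-item rating scheme (1, 0.5 or 0 per item), the 17/24 satisfactory cut-off and the descriptive summary statistics reported later. It is an illustrative reconstruction only, not the author's actual analysis, and the item ratings shown are invented.

from statistics import mean, median, mode

ITEMS = 24       # adapted STROBE checklist: 24 scored items
THRESHOLD = 17   # scores of 17-24 deemed satisfactory, per the 66% compliance cut-off

def strobe_total(item_ratings):
    # Each of the 24 items is rated 1 (compliant), 0.5 (partially described) or 0 (not addressed)
    assert len(item_ratings) == ITEMS
    assert all(r in (0, 0.5, 1) for r in item_ratings)
    return sum(item_ratings)

def is_satisfactory(total):
    # A mini-dissertation scoring 17-24 is satisfactory; below 17 is unsatisfactory
    return total >= THRESHOLD

# Hypothetical ratings for three mini-dissertations (invented, for illustration only)
sample = [
    [1] * 20 + [0.5] * 4,             # total 22 -> satisfactory
    [1] * 12 + [0.5] * 8 + [0] * 4,   # total 16 -> unsatisfactory
    [1] * 21 + [0.5] * 2 + [0] * 1,   # total 22 -> satisfactory
]

totals = [strobe_total(r) for r in sample]
percentages = [round(100 * t / ITEMS, 1) for t in totals]
print("Totals:", totals)
print("Satisfactory:", sum(is_satisfactory(t) for t in totals), "of", len(totals))
print("Mean / median / mode % compliance:",
      round(mean(percentages), 1), median(percentages), mode(percentages))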

Data analysis

The data were entered into an Excel spreadsheet (Microsoft, USA) and analysed descriptively, and 20% of the monographs were reassessed at the end of the study for intra-rater reliability. The kappa value obtained was 0.74, indicating scoring consistency.
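
As a companion sketch of the intra-rater reliability check, Cohen's kappa can be computed on the re-scored subsample. The snippet below uses scikit-learn's cohen_kappa_score with the three rating levels coded as categories (0, 0.5, 1 mapped to 0, 1, 2); the ratings are invented for illustration, and the choice of library is an assumption rather than a description of the author's workflow.

from sklearn.metrics import cohen_kappa_score

# Code each STROBE rating as an integer category: 0 -> 0, 0.5 -> 1, 1 -> 2
CODE = {0: 0, 0.5: 1, 1: 2}

# Hypothetical first-pass and re-scored ratings for the reassessed 20% subsample (invented data)
first_pass  = [1, 1, 0.5, 0, 1, 0.5, 1, 1, 0, 1, 0.5, 1]
second_pass = [1, 1, 0.5, 0.5, 1, 0.5, 1, 1, 0, 1, 0, 1]

kappa = cohen_kappa_score([CODE[r] for r in first_pass],
                          [CODE[r] for r in second_pass])
print(f"Cohen's kappa: {kappa:.2f}")  # ~0.71 here; values above 0.6 are usually read as substantial agreement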

Ethical approval

All mini-dissertations are in the public domain. Ethics approval for the study was obtained from the Walter Sisulu University, Faculty of Health Sciences, Postgraduate Education, Training, Research and Ethics Unit: Human Research Committee Clearance Certificate (ref. no. 32/2019).

 

Results

The breakdown of the 100 analysed mini-dissertations by university and college is given in Table 1. Unsurprisingly, mini-dissertations from the four largest Health Sciences Faculties predominate, as does the College of Obstetrics and Gynaecology, which required a research component for specialisation prior to the HPCSA decree. Fig. 2 shows the spread of the sampled mini-dissertations according to year of acceptance.

 

 

Sixteen mini-dissertations were non-compliant, with a total score below 17 as per the set threshold of 66%, which translates to an 84% satisfactory sample. When analysing total item scores (Table 2), compliance was at a mean of 83.1%; range 50-97%; median 85% and mode 89%. Table 3 shows comments and scoring for the nine STROBE items which fell below the compliance mean of 83.1%. In addition, two further findings regarding STROBE Items 5, 7 and 15 are noteworthy. All mini-dissertations provided detailed information on the settings, locations and time periods of data collection (Item 5); this was the only item to score 100% for the sample.

Item 7 (Outcomes) scored just below the mean at 82% in the Methods section. This rose to 90% when Outcomes was further enlarged upon in the Results (Item 15), indicating a deeper understanding of this research concept as the study progressed.

 

Discussion

The mean compliance to individual STROBE items in the sampled SA MMed monographs was 83.1%. This is in contrast to 27.37%[21] and 74.79%[22] for Indian postgraduate medical students and Chinese Master of Public Health dissertations, respectively (Table 2). Although reporting outcomes between the current study and the other three publications are not entirely equivalent, comparisons can be made. While SA registrars are uniquely non-compliant with Items 1 (Title) and 3 (Objectives) and exceptionally compliant with Item 5 (Settings and locations), poor compliance in Items 10, 14, 17 and 19 of the current study can be found in one or more of the other three studies. The results suggest that SA registrars report observational studies that clearly present what was planned, done and found in the study. However, it must be borne in mind that while clarity of reporting is a prerequisite to research evaluation, the STROBE checklist is not an instrument to evaluate the quality of observational research as such,[20] nor was it designed to assess unpublished work such as the mini-dissertation.[21,22]

That said, the aim of the current study was to ascertain whether the allegations of 'shoddy science' and similar claims[9,10] are justified, by using the STROBE checklist to assess the reporting quality of SA MMed mini-dissertations. With the sampled observational mini-dissertations being 84% compliant in important aspects of research reporting, our MMed registrars have clearly met the STROBE requirements of research transparency and reporting quality, which could serve as a rejoinder to MMed research naysayers. However, in the interest of prudence, it is necessary to delve a little deeper into the registrar research learning environment to probe alternative causes for the high STROBE scores achieved.

Thesis and dissertation templates are extensively used, globally and across all academic disciplines, to apply structure, assist development and explore the research process through directed writing.[28] Templates follow Faculty regulations, guidelines and formats for the presentation of higher research degrees. They usually include mandatory administrative sections for the front matter; the body of the text, with appropriate headings; and the back matter, covering references and appendices. By using prompts and sub-headings throughout, many of which meet EQUATOR principles, the FHS postgraduate is able, via the template, to apply structure to the research project, further develop the topic, explore research strategies and become acquainted with previously unknown research concepts and terms. Novice researchers, such as registrars, are often daunted by the prospect of 'doing the MMed' - they literally do not know where or how to start research in a meaningful way and are unfamiliar with research terminology. In this, the templates cater to an educational need by facilitating a streamlined learning process.[6] Thus, the question which requires answering is: 'Are the high STROBE scores obtained in this current study an indication of robust research learning on the part of registrars, or of the particular clinical or university template, or indeed, the quality of the template itself?' This study cannot provide an answer to the question and suggests that further exploration of the matter is required.

But there remains a further proviso which must be considered when critically assessing the high STROBE marks obtained in this study against claims of 'poorly conducted research'[9,10] on the one hand and the use of templates on the other. This proviso falls within the ambit of hyper-structured student research projects (HSSRP), an alternative mode of research supervision. HSSRP have evolved, within a depleted supervisory pool, to manage large numbers of students and specific research agendas,[29] as is currently experienced by SA FHS. HSSRP take a utilitarian approach, placing institutional interests first by delivering on institutional requirements of efficient postgraduate throughput to ensure the success of a particular learning programme. Pertinently, defined report writing to produce an acceptable research report via templates is one of the hallmarks of HSSRP. However, HSSRP have raised questions about the graduateness and professional preparedness of the postgraduate, as well as critical reflection on whether HSSRP meet the needs of academic and professional requirements. Indeed, some contend that such structured, controlled project management (or 'spoon-feeding') influences both the quality of researchers being trained and the value of the research produced.[30] Thus, the concerns of the MMed research naysayers[9,10] have merit and cannot be completely dismissed when considering their argument.

A previous Zambian study was prescient in suggesting that MMed dissertation quality would benefit from utilising reviewing guidelines under the EQUATOR umbrella at the planning stages of dissertation research studies.[31] The results of the current study support this opinion, thereby demonstrating the relevance and role of STROBE and other reporting guidelines in boosting research reporting quality on the African continent. In addition, this study has underscored areas of weakness in the writing of the SA MMed mini-dissertation and provided direction for items which require attention. Further, interesting commonalities between the current study and the other three reports[21-23] have been highlighted. Finally, comparison of mean overall compliance scores implies that the reporting quality of the SA MMed mini-dissertation is on a par with, if not better than, that reported elsewhere,[21-23] despite being conducted in a research resource-challenged academic environment with stretched supervisory capacity.

Study limitations

The extraction of data and the reporting appraisal were done by a single investigator.

Study strengths

This is the first report to examine the reporting quality of MMed mini-dissertations or equivalent outside of India and China.

 

Conclusion

This study has shown that SA MMed mini-dissertations reporting retrospective observational research in monograph format are 83.1% compliant when assessed with the STROBE Statement checklist. The results indicate that the sampled studies have been transparently reported, allowing the reader to follow what was planned, done and found, and which conclusions were drawn. As such, the findings confer a measure of reporting quality on the SA MMed research endeavour, refuting the claims of shoddy research made elsewhere. However, just as reporting quality does not imply research quality in SA mini-dissertations, so the sentiments of poor science, polluting the research pool and low-quality research cannot be held true of the MMed dissertation in the absence of evidence, examples or proof. The purpose of having reporting guidelines in medical research is to encourage total transparency, accurate reporting and easier assessment of the validity of reported research findings. Perhaps the time has come to make greater use of the EQUATOR armamentarium to ensure reporting quality when embarking on the MMed research study, rather than dismissing the compulsory HPCSA research component for specialist registration out of hand.

Declaration. None.

Acknowledgements. None.

Author contributions. Sole author.

Funding. None.

Conflicts of interest. None.

 

References

1. Mouton J, Basson I, Blanckenberg J, et al. The state of the South African research enterprise. DST-NRF Centre of Excellence in Scientometrics and Science, Technology and Innovation Policy, Stellenbosch University, 2019. http://www0.sun.ac.za/crest/wp-content/uploads/2019/08/state-of-the-South-African-research-enterprise.pdf

2. Grossman ES, Naidoo S. Academic-service partnerships, research and the dental academic. J Dent Educ 2012;76(9):1226-1233. http://repository.uwc.ac.za/xmlui/handle/10566/1205

3. Grossman ES, Cleaton-Jones PE. Pipelines or pipe dreams? PhD production in a South African dental research institute 1954-2006. PiE 2011;29(3):111-125. https://www.ajol.info/index.php/pie/article/view/76980

4. Grossman ES. Publication rate of 309 Master of Medicine dissertations submitted between 1996-2017: Can our registrars fulfil HPCSA Form 57 MED amendments? S Afr Med J 2020;110(4):302-307. https://doi.org/10.7196/SAMJ.2020.v110i4.14339

5. Szabo CP, Ramlall S. Research competency and specialist registration: Quo vadis? S Afr Med J 2016;106(12):1183-1185. https://doi.org/10.7196/samj.2016.v106i12.11217

6. Aldous CM, Adhikari M, Rout CC. The research component of specialist registration - a question of alligators and swamps? A personal view. S Afr Med J 2015;105(1):21-22. https://doi.org/10.7196/SAMJ.8732

7. Heyns T, Bresser P, Buys T, et al. Twelve tips for supervisors to move towards person centered research supervision in health care sciences. Med Teach 2019. https://doi.org/10.1080/0142159X.2018.1533241

8. Kisansa ME, Lubinga MH. Strategic and focused solutions to challenges faced by medical postgraduate students in performing research at a South African university. S Afr J High Educ 2020;34(2):59-73. https://doi.org/10.20853/34-2-3666

9. Biccard BM, Dyer RA, Swanevelder JL, et al. Is the HPCSA requirement for a research mini-dissertation for specialist registration the best option? South African J Anaesth Analg 2017;23(4):4-6.

10. Rodseth RN, Wise R, Bishop D. Polluting the well. South African J Anaesth Analg 2017;23(6):5.

11. Chassagnon G, Dangouloff-Ros V, Vilgrain V, et al. Academic productivity of French radiology residents: Where do we stand? Diagn Interv Imaging 2016;97(2):211-218. https://doi.org/10.1016/j.diii.2015.08.001

12. Nour-Eldien H, Mansour NM, Abdulmajeed AA. Master's and doctoral theses in family medicine and their publication output, Suez Canal University, Egypt. J Family Med Prim Care 2015;4(2):162-167. https://doi.org/10.4103/2249-4863.154622

13. Arriola-Quiroz I, Curioso WH, Cruz-Encarnacion M, et al. Characteristics and publication patterns of theses from a Peruvian medical school. Health Info Libr J 2010;27(2):148-154. https://doi.org/10.1111/j.1471-1842.2010.00878.x

14. Koca K, Ekinci S, Akpancar S, et al. An analysis of orthopaedic theses in Turkey: Evidence levels and publication rates. Acta Orthop Traumatol Turc 2016;50(5):562-566. https://doi.org/10.1016/j.aott.2016.03.001

15. Young SN. Bias in the research literature and conflict of interest: An issue for publishers, editors, reviewers and authors, and it is not just about the money. J Psychiatry Neurosci 2009;34(6):412-417.

16. Fathelrahman AI. Rejection of good manuscripts: Possible reasons, consequences and solutions. J Clinic Res Bioeth 2015;6(1):204. https://doi.org/10.4172/2155-9627.1000204

17. International Science Council. Advisory Note on Bias in Science Publishing. https://council.science/publications/advisory-note-on-bias-in-science-publishing/ (accessed 17 October 2022).

18. EQUATOR Network. UK Equator Centre, Centre for Statistics in Medicine, University of Oxford. http://equator-network.org (accessed 17 October 2022).

19. STROBE Statement. http://www.strobe-statement.org/ (accessed 17 October 2022).

20. von Elm E, Altman DG, Egger M, et al. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: Guidelines for reporting observational studies. PLoS Med 2007;4(10):1623-1627. https://doi.org/10.1371/journal.pmed.0040296

21. Bhawalkar JS, Jadhav SL, Banerjee A, et al. Research trends in post-graduate medical students, Pune. Ann Med Health Sci Res 2014;4(3):355-360. https://doi.org/10.4103/2141-9248.133459

22. Dai S, Zhou X, Xu H, et al. Evaluation of the reporting quality of observational studies in master of public health dissertations in China. BMC Med Res Methodol 2020;20:230. https://doi.org/10.1186/s12874-020-01116-6

23. Shirahatti RV, Hegde-Shetiya S. Review of research designs and statistical methods employed in dental postgraduate dissertations. Indian J Dent Res 2015;26(4):427-434.

24. Grossman ES. How long does it take a registrar to complete the compulsory research project enabling specialist registration? S Afr Med J 2019;109(4):254-258. https://doi.org/10.7196/SAMJ.2019.v109i4.13377

25. Grossman ES. Content analysis of the South African MMed mini-dissertation. Afr J Health Prof Educ 2020;12(2):56-61. https://doi.org/10.7196/AJHPE.2020.v12i2.1227

26. von Elm E, Altman DG, Egger M, et al. STROBE Statement checklist. https://www.strobe-statement.org/fileadmin/Strobe/uploads/checklists/STROBE_checklist_v4_combined.pdf (accessed 17 October 2022).

27. Vandenbroucke JP, von Elm E, Altman DG, et al. Strengthening the Reporting of Observational Studies in Epidemiology (STROBE): Explanation and elaboration. PLoS Med 2007;4(10):1628-1654. https://doi.org/10.1371/journal.pmed.0040297

28. Singh S. An intervention to assist students with writing their dissertations and theses. S Afr J High Educ 2011;25(5):1020-1030. https://www.ajol.info/index.php/sajhe/article/view/80701

29. Steyn R. Ethical dilemmas associated with hyper-structured student research projects. S Afr J High Educ 2020;34(1):231-248. https://doi.org/10.20853/34-1-3095

30. Boehe DM. Supervisory styles: A contingency framework. Stud High Educ 2016;41(3):399-414. https://doi.org/10.1080/03075079.2014.927853

31. Ahmed Y, Kanyengo CW, Akakandelwa A. Mapping postgraduate research at the University of Zambia: A review of dissertations for the Master of Medicine programme. Med J Zambia 2010;37(2):52-57. https://www.ajol.info/index.php/mjz/article/view/75655

 

 

Correspondence:
E S Grossman
grossmane@gmail.com

Accepted 18 January 2022
