
South African Journal of Surgery

Online ISSN 2078-5151
Print ISSN 0038-2361

S. Afr. j. surg. vol.57 no.4 Cape Town Dec. 2019

http://dx.doi.org/10.17159/2078-5151/2019/v57n4a3045 

GENERAL SURGERY

 

An audit of the outcomes of the College of Surgeons general surgery final examinations

 

 

M Kahn I; D Kahn I; J Klopper I; P Navsaria I, II

I Department of Surgery, Groote Schuur Hospital, University of Cape Town, Cape Town, South Africa
II Trauma Centre, Groote Schuur Hospital, University of Cape Town, Cape Town, South Africa


 

 


ABSTRACT

BACKGROUND: An audit of the Fellowship of the College of Surgeons (FCS) of South Africa examination results has not previously been performed. The purpose of this study was to review the results and to determine any predictors of outcome (pass or fail).
METHODS: The results of the FCS(SA) final examinations from October 2005 to October 2014, inclusive, were retrieved from the Colleges of Medicine of South Africa database. The current format of the examination consists of two written essay papers, an objective structured clinical examination (OSCE), two clinical cases and two oral examinations. These were retrospectively reviewed and analysed, and predictors of failure or success were determined.
RESULTS: During the 10-year study period, 472 candidates attempted the examination. A total of 388 (82%) candidates were successful in the written component and were subsequently invited to participate in the clinical component of the examination. Overall, 296 (63%) candidates passed and 176 (37%) failed. There were 51 candidates who were invited to the oral examinations despite an average of less than 50% in the two papers, and 34 (67%) of them failed the overall examination. Similarly, 126 candidates were invited having failed one of the two papers, of whom 81 (64%) ultimately failed. A total of 49 candidates failed the OSCE, and 82% of these failed overall. There were strong correlations between the averages of the papers and the orals (Spearman ρ = 0.51), the papers and the cases (Spearman ρ = 0.50), and the papers and the OSCE (Spearman ρ = 0.55).
CONCLUSION: The written papers are the main determinant of invitation to the second part of the examination. Candidates with marginal scores in the written component had an overall failure rate of 67%. Failing one paper and passing the other resulted in an overall failure rate of 64%, and failing the OSCE resulted in an overall failure rate of 82%. Given the high failure rate of candidates with marginal scores and the inter-examination variability of the papers, it might be prudent to revisit both the process of selecting candidates for invitation and the decision to continue with the long-form written component.

Keywords: College of Surgeons, final examinations, outcomes


 

 

Introduction

The manner in which both undergraduate and postgraduate medical trainees are assessed has been subject to extensive scrutiny and review in recent years.1-5 The traditional examination format of written essay-type papers, clinical cases and oral viva voce examinations has been criticised by educationalists and found to be significantly flawed.6,7 These examination formats are highly subjective and lack the psychometric properties of a good examination, such as reliability, validity, applicability and acceptability.

The Colleges of Medicine of South Africa (CMSA) has been appointed by the Health Professions Council of South Africa (HPCSA) as the sole examining body for postgraduate medical specialists. The College of Surgeons of South Africa (CSSA) is one of the constituent Colleges of the CMSA and is responsible for the assessment of surgical trainees.

Graduates need to pass the FCS(SA) final examination of the CSSA to be awarded the Fellowship of the College of Surgeons (FCS) in order to practise independently as a specialist in general surgery. In this study, we analysed the success and failure rates in the final fellowship examination and attempted to determine predictors of either outcome.8

 

Methods

The results for all components of the FCS(SA) final examination were made available by the CMSA for the period 2005-2014. The component results included individual percentage scores for two written papers, two oral examinations, two clinical cases and an OSCE. Candidates invited to the clinical component of the examination were identified by the presence of results for these components.

Analysis was performed using IPython for scientific computing. Normality of numerical variables was assessed using the Shapiro-Wilk test and quantile-quantile plots. Normally distributed variables were analysed with parametric tests; in all other cases nonparametric tests were employed. An alpha value of 0.05 was chosen to indicate statistical significance, corresponding to a 95% confidence level.
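
As an illustration of this workflow, the following is a minimal sketch in Python using pandas and SciPy (the authors do not specify their libraries); the file name fcs_results.csv and the column names paper_1 and passed are hypothetical, not taken from the study data.

# Minimal sketch of the described workflow: assess normality with the
# Shapiro-Wilk test, then select a parametric or nonparametric comparison.
# The file and column names below are hypothetical.
import pandas as pd
from scipy import stats

ALPHA = 0.05  # significance level used in the study


def compare_groups(marks_a, marks_b):
    """Compare two sets of marks, choosing the test based on normality."""
    normal = (stats.shapiro(marks_a).pvalue > ALPHA
              and stats.shapiro(marks_b).pvalue > ALPHA)
    if normal:
        return "t-test", stats.ttest_ind(marks_a, marks_b).pvalue
    return "Mann-Whitney U", stats.mannwhitneyu(marks_a, marks_b).pvalue


results = pd.read_csv("fcs_results.csv")              # hypothetical data file
passing = results.loc[results["passed"], "paper_1"]   # Paper I marks, passed overall
failing = results.loc[~results["passed"], "paper_1"]  # Paper I marks, failed overall
test_used, p_value = compare_groups(passing, failing)
print(f"{test_used}: p = {p_value:.4f} (significant if p < {ALPHA})")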

 

Results

Results were available for a total of 19 examination sittings over the period July 2005 to October 2014, comprising 472 candidate attempts (some candidates attempted the examination more than once) (Table 1).

 

 

The number of candidates writing remained fairly constant at approximately 40 per year, with the exception of 2010 and 2011, when there were almost double the usual number (n = 76 and n = 79, respectively). The College of Surgeons examinations take place in May and October each year. The numbers of candidates in May and October were usually similar, except in 2008, 2009 and 2011, when there were significantly more candidates writing in October, and in 2007, when there were more candidates in May. A total of 296 (63%) of the 472 candidates successfully completed the examination and 176 (37%) failed. The average pass rate remained fairly constant at 60% (range 45-76%) each year. A total of 388 (82%) candidates were successful in the written component of the examination and were subsequently invited to participate in the oral/clinical component (Table 2). The proportion of candidates invited to the oral/clinical component each year remained fairly constant at approximately 80% (range 66-100%).

The median mark achieved in Paper I was 55% (IQR 12%) and remained fairly constant each year during the study period, ranging from 50% (IQR 10.25%) to 58% (IQR 13%). The median mark achieved in Paper II also remained fairly constant each year, ranging from 50% (IQR 13%) to 57.5% (IQR 11%).

The median mark in the General Surgery/Surgical Pathology oral (Oral I) was 55% (IQR 15%) and remained consistent each year during the study period, ranging from 55% (IQR 15%) to 60% (IQR 20%). The median mark in the Surgical Anatomy/Operative Surgery oral (Oral II) was also 55% (IQR 15%) and remained consistent each year, ranging from 55% (IQR 14.25%) to 60% (IQR 20%).

The marks achieved in the OSCE are shown in Figure 1. The median mark in the OSCE was 60% (IQR 13%) and remained consistent each year during the study period, ranging from 55.5% (IQR 15.75%) to 65% (IQR 10.5%).

The median mark for the Clinical Cases was 58.75% (IQR 12.5%), and the marks each year remained consistent during the study period, ranging from 53.75% (IQR 9.4%) to 62.5% (IQR 12.5%).

The College regulations stipulate that candidates who achieve less than 50% in the papers may still be invited to the orals/clinicals if the combined mark in the two papers is greater than 50%. Seventy-four candidates who achieved less than 50% in Paper I were invited to the orals, and 41 (55%) failed the overall examination. A total of 56 candidates who achieved less than 50% in Paper II were invited to the orals, and 41 (64%) failed the overall examination. There were 51 candidates who were invited to the oral examinations despite an average of less than 50% in the two papers, and 34 (67%) of them failed the overall examination. Similarly, 126 candidates were invited having failed one of the two papers, of whom 81 (64.3%) ultimately failed. A total of 49 candidates failed the OSCE, and 40 (82%) of these failed overall.

We compared the marks achieved in the various components of the examination. There was a positive correlation between the marks achieved in Paper I and Paper II (Spearman ρ = 0.57). Similarly, there was a positive correlation between the marks achieved in Oral I and Oral II (Spearman ρ = 0.39), and between the marks awarded in the two Clinical Cases (Case I versus Case II; Spearman ρ = 0.37). There was a positive correlation between the average marks achieved in the papers and in the orals (Spearman ρ = 0.51), between the average marks in the papers and in the clinical cases (Spearman ρ = 0.50), and between the average marks in the papers and the OSCE (Spearman ρ = 0.55).
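
A minimal sketch of how these rank correlations could be computed, assuming the component marks are held in a pandas DataFrame; the file and column names below (paper_1, oral_1, case_1, osce, and so on) are hypothetical rather than taken from the study dataset.

# Sketch of the pairwise Spearman correlations reported above.
# File and column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

results = pd.read_csv("fcs_results.csv")
results["papers_avg"] = results[["paper_1", "paper_2"]].mean(axis=1)
results["orals_avg"] = results[["oral_1", "oral_2"]].mean(axis=1)
results["cases_avg"] = results[["case_1", "case_2"]].mean(axis=1)

pairs = [
    ("paper_1", "paper_2"),        # reported rho = 0.57
    ("oral_1", "oral_2"),          # reported rho = 0.39
    ("case_1", "case_2"),          # reported rho = 0.37
    ("papers_avg", "orals_avg"),   # reported rho = 0.51
    ("papers_avg", "cases_avg"),   # reported rho = 0.50
    ("papers_avg", "osce"),        # reported rho = 0.55
]
for a, b in pairs:
    rho, p = spearmanr(results[a], results[b], nan_policy="omit")
    print(f"{a} vs {b}: Spearman rho = {rho:.2f}, p = {p:.4f}")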

 

Discussion

There was a total of 472 attempts spanning 19 examination sittings. The years 2010 and 2011 saw a notably high number of entries (76 and 79, respectively). Entries have since declined, with only 45 candidates entering in 2014. With an overall pass rate of 63%, this decline is also reflected in the actual number of individuals qualifying as surgeons, with only 25 doing so in 2014.

Of the 472 attempts at the examination, each of which began with two long-form written papers, 388 (82.2%) resulted in an invitation to the second part of the examination, comprising two oral examinations, the clinical cases and an OSCE. Invitations are based on performance in the written papers, and the invitation rate was quite variable (minimum 56% in 2006, maximum 100% in 2012). Among those invited, the minimum score was 40% for Paper I, 45% for Paper II and 46% for the average of the two papers. Among those not invited, the maximum score was 58% for Paper I, 60% for Paper II and 54% for the average of the two papers.

The decision to invite candidates with low scores in their papers resulted in a high failure rate. Twenty-eight candidates failed both papers but scored more than 45% and were still invited; 24 (86%) of these ultimately failed. There were 51 candidates who were invited to the oral examinations despite an average of less than 50% in the two papers, and 34 (67%) of them failed the overall examination. Similarly, 126 candidates were invited having failed one of the two papers, of whom 81 (64%) ultimately failed. A total of 49 candidates failed the OSCE, and 82% of these failed overall. There was a statistically significant difference between the 19 examination sittings for both papers, the first clinical case and the OSCE. This was most marked for the two papers, suggesting inconsistency in the difficulty of the papers. This is out of keeping with the results for the orals and the second clinical case, further suggesting a lack of consistency in the papers.
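
The article does not name the test used for this between-sitting comparison. One plausible approach, consistent with the nonparametric methods described earlier, is a Kruskal-Wallis test across the 19 sittings, sketched below; the choice of test, the file name and the column names (sitting, paper_1) are assumptions, not the authors' confirmed method.

# Hedged sketch: test whether Paper I marks differ across examination sittings
# using a Kruskal-Wallis test. This is an assumed, not confirmed, choice of test;
# the file and column names are hypothetical.
import pandas as pd
from scipy.stats import kruskal

results = pd.read_csv("fcs_results.csv")
groups = [g["paper_1"].dropna().values for _, g in results.groupby("sitting")]
stat, p = kruskal(*groups)
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.4f}")  # p < 0.05 suggests difficulty varied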

 

Study limitations

It was not possible to determine the number of repeat candidates, since entrants are assigned a new unique number for each examination sitting and attempts could therefore not be linked to individuals. Similarly, demographic data such as gender and racial classification were considered confidential and were not supplied by the CMSA.

 

Conclusions

The written papers are the main determinant of invitation to the second part of the examination. Candidates with marginal scores in the written component had an overall failure rate of 67%. Failing one paper and passing the other resulted in an overall failure rate of 64%, and failing the OSCE resulted in an overall failure rate of 82%. Given the high failure rate of candidates with marginal scores and the inter-examination variability of the papers, it might be prudent to revisit both the process of selecting candidates for invitation and the decision to continue with the long-form written component.

 

Recommendations

There is a major shortage of surgical providers in South Africa, especially in the public sector.9 There are significant disparities in the number and distribution of general surgeons in the country. There is one hospital per 100 000 population, and there are 186 hospital beds, 41 surgical beds, 1.7 specialist general surgeons, 2.9 non-specialist general surgeons and 3.6 operating theatres per 100 000 population. These figures fall far below international recommendations, as well as below those of developed countries such as the UK and USA. There are 6 specialist general surgeons per 100 000 insured population working in the private sector, which is comparable with the United States (US).

South Africa produces around 50 general surgeons per year, and over the last 5 years the college examination pass rate has increased from 60% to 80%. In comparison with the South African College of Surgeons, the Australian College had a 97% pass rate, resulting in a larger number of surgical registrars or residents qualifying each year. Maintaining the current ratio of 1.78 surgeons per 100 000 population to 2030 will, at the current output, result in a deficit of 352 surgeons. If a ratio of 5 general surgeons per 100 000 is to be achieved in the next 15 years, a further 2 600 specialist surgeons need to be trained. South Africa currently produces an average of 50 surgeons annually; if the output were increased to 150 per year, these numbers could be achieved in roughly 17 years. The surgical training programmes therefore need to treble their output of specialists, or this goal will remain beyond reach.
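
As a back-of-the-envelope check of the projection above (the 2 600-surgeon shortfall and the annual outputs of 50 and 150 are taken from the text; everything else is simple arithmetic):

# Worked check of the workforce projection quoted above.
additional_surgeons_needed = 2600  # to reach 5 per 100 000 population (from the text)
current_output_per_year = 50       # average current annual output (from the text)
trebled_output_per_year = 150      # proposed trebled output (from the text)

print(additional_surgeons_needed / trebled_output_per_year)  # ~17.3, i.e. "roughly 17 years"
print(additional_surgeons_needed / current_output_per_year)  # 52 years at the current output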

The number of trainees qualifying as surgeons in South Africa is not meeting the needs of the country. A formal assessment of this requirement is necessary, and strategies should be developed to increase the number of trainee posts.

Declaration of conflict of interest

The authors declare no conflict of interest.

Ethics approval

UCT HREC No. 840 / 2016

Orcid

P Navsaria http://orcid.org/0000-0002-5152-3317

 

REFERENCES

1. Van der Vleuten CPM. The assessment of professional competence: developments, research, and practical implications. Adv Health Sci Educ. 1996;1:41-67.

2. Evgenios E. Assessment methods in surgical training in the United Kingdom. J Educ Eval Health. 2013;10:6-7.

3. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet. 2001;357:945-9. PMID: 11289364

4. Norcini J, McKinley D. Assessment methods in medical education. Teaching and Teacher Education. 2007;23:239-250. doi:10.1016/j.tate.2006.12.021

5. Al-Wardy NM. Assessment methods in undergraduate medical education. SQU Med J. 2010;10:203-9. PMID: 21509230

6. Sloan DA, Donnelly MB, Schwartz RW, Felts JL, Blue AV, Strodel WE. The use of the objective structured clinical examination (OSCE) for evaluation and instruction in graduate medical education. J Surg Res. 1996;63:225-30. PMID: 8661202

7. Martin JA, Regehr G, Reznick R, MacRae H, Murnaghan J, Hutchison C, Brown M. Objective structured assessment of technical skill (OSATS) for surgical residents. Br J Surg. 1997;84:273-8. PMID: 9052454

8. CMSA. Guidelines for the FCS(SA) final examinations [Online]. 2018. Available from: https://www.cmsa.co.za [Accessed January 2018].

9. Dell A. Global surgery - socioeconomic and geographic maldistribution of surgical resources. University of Cape Town; 2016. http://hdl.handle.net/11427/22796

 

 

Correspondence:
Pradeep H Navsaria
pradeep.navsaria@uct.ac.za

All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons Attribution License.