African Journal of Health Professions Education

On-line version ISSN 2078-5127

Afr. J. Health Prof. Educ. (Online) vol.15 no.3 Pretoria Sep. 2023

http://dx.doi.org/10.7196/AJHPE.2023.v15i3.890 

RESEARCH

 

Surgical videos used for face-to-face and virtual oral assessment: Experiences of examiners and trainees

 

 

K J BaatjesI; W ConradieII; J M EdgeII; E ArcherIII

IPhD (Surgery); Division of Surgery, Department of Surgical Sciences, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa
IIMMed (Surgery); Division of Surgery, Department of Surgical Sciences, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa
IIIPhD (HPE); Simulation and Clinical Skills Unit, Centre for Health Professions Education, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, South Africa

Correspondence

 

 


ABSTRACT

BACKGROUND. Varied assessment strategies are required when deciding whether a surgeon is competent to graduate. Despite doubts about its reliability and the challenges of standardising examiner practices, the oral examination remains an important assessment method in surgical exit examinations. Structured oral examinations may facilitate the measurement of course outcomes.
OBJECTIVES. To explore the experiences of surgical trainees and examiners using a video-assisted, mock, structured oral examination (SOE) as an assessment tool.
METHODS. This descriptive study incorporated procedural videos in a case-based SOE format. One group of trainees had face-to-face contact with the examiner, and the other was assessed by a remote examiner on an online platform (Microsoft Teams). After the SOE, a focus group interview was conducted with the surgical trainees and individual interviews with the examiners.
RESULTS. Themes were developed from the interview transcripts. These themes centred around the use of videos in this examination format and technical issues during the SOE. Further themes highlighted the standardisation of questions and preparation of examiners.
CONCLUSION. Overall, procedural videos as part of the mock SOE were experienced as valuable. Adding video recordings to the online platform posed administrative and technical challenges. However, the trainees and the examiners could log in from peripheral clinical training sites, which was experienced as an advantage. This study provides a glimpse into the application of procedural videos during SOEs as an assessment tool from the perspective of surgical trainees and examiners. Efforts should focus on standardisation of the examination format, optimising technical issues and improving examiner preparation.


 

 

Multiple and diverse skills are required for a surgeon to be competent. It is critical that surgical training programmes not only create opportunities for trainees to acquire these competencies, but also provide assurance that the competencies are assessed validly and reliably. The oral examination (viva) is a standard assessment method to test knowledge, insight and clinical reasoning.[1] There are, however, concerns regarding its reliability and validity, as it is prone to bias and to variation in content and degree of difficulty. Efforts to provide structure to oral examinations may aid in measuring achievement of the course outcomes in a reproducible manner among candidates,[1,2] but require examination tools that are acceptable to both the examiner and the candidate.

The Colleges of Medicine of South Africa (www.cmsa.co.za) is the national examining body for medical professions in South Africa (SA). The fellowship examination for the College of Surgeons of SA currently consists of two 3-hour written papers comprising multiple-choice questions (MCQs). One examination assesses aspects of general surgery and surgical pathology, and the other anatomy and operative technique. These written examinations are followed by an oral assessment that includes clinical cases and an objective structured clinical examination (OSCE). Assessing the competence of a surgeon is complex. The College of Surgeons of SA has been considering different means of assessing work-based competence - this study explores a novel method. Surgical educators at our institute continuously strive to incorporate innovative pedagogical formats for teaching and assessment in surgical training.

Mock assessments can assist trainees in preparing for final examinations. They help trainees to practise their examination technique and facilitate valuable feedback, allowing trainees to measure their performance, development and progress in preparation for the exit examination.[3] Educational programmes from different disciplines have successfully implemented mock oral examinations to simulate the exit examinations.[4-7] The structured oral examination (SOE) is based on a clinical case with predefined questions and goals.[1] It improves the reliability and validity of the assessment[8] by limiting the subjectivity bias associated with traditional unstructured oral examinations.

Surgical educators should be encouraged to lead innovative pedagogical formats for teaching and assessment in surgical training. Video recordings, as part of a clinical scenario, can complement the SOE method by testing depth of knowledge and clinical application using visual cues. This adjunct to clinical assessment is structured and reproducible, and can be applied remotely with the use of virtual platforms. Such a method would be particularly valuable in broad geographical regions, such as southern Africa, where long-distance travel to examination venues adds to cost. The COVID-19 pandemic afforded opportunities for innovation in virtual assessment, allowing mock oral examinations to be conducted across regional areas.[6] Endoscopic surgery videos have been used for evaluation and educational purposes, but few studies have used videos of open surgery. Moore et al.[9] used a GoPro to record open operations, making use of the footage to give feedback to surgical trainees.

The current study aimed to explore the experiences of surgical trainees and examiners using video-assisted mock SOEs, both in a face-to-face and online format. Evaluating the addition of video recordings in an SOE assessment may provide insights into broader aspects of postgraduate surgical education.

 

Method

This descriptive study was performed in the Division of Surgery, Stellenbosch University (SU) and Tygerberg Academic Hospital, Cape Town, SA. The mock oral assessment took a case-based format using procedural videos in SOEs. These videos were selected based on the examiners' preferences, with some using their own previously recorded procedures and others using recordings available online. Informal instructions were provided to examiners, requesting that all candidates be asked pre-set standard questions about the recorded operative procedures. As each examiner had videos of different operative procedures, all questions were unique. However, the focus was generally on the relevant anatomy, operative technique and hypothetical complications. An example of a question asked was: 'Which nerve is being identified at this point in the procedure? If you cannot find it at this location, where else can you look for it?'

The SOE was to be paused at the same video time frames for each candidate to allow for questions and discussion. No marks were allocated, and only verbal feedback was provided.

All doctors enrolled in the surgical programme as trainees, and specialist surgeons who were registered as examiners at SU were invited to participate in the study. The principal researcher (KJB) introduced the study at a weekly departmental meeting. This introduction was followed by a detailed email comprising the study aims and an invitation to join as a research participant. All participants were invited to a voluntary mock case-based oral assessment using procedural videos, with a follow-up interview to gauge their experiences in their respective roles of trainee and surgeon examiner.

Convenience sampling was used, as all those who responded to the invitation were included in the study. Participants were assigned to two groups of trainees and two groups of surgeon examiners, similar in size. One group of trainees had face-to-face contact with the examiner for the mock case-based oral assessment using procedural videos, and the other group was assessed on the online platform, Microsoft Teams (Microsoft Corp., USA). The examiners and the trainees had variable levels of experience. To minimise bias, they were assigned to examination rooms as they arrived at the event.

Ethical approval was granted by SU's Health Research Ethics Committee (ref. no. N20/09/090), and informed consent was obtained from all participants.

Data collection

Demographic information collected from trainees included the 'number of years since qualification as a doctor' and 'year of training in surgery'. For the examiners, the 'number of years since qualification as a specialist' and their 'experience as assessors' were documented. After the mock assessment, one focus group interview was held with the trainees to explore their experiences of the assessment process. Eight individual face-to-face interviews were conducted with the examiners, based on their availability. These interviews were performed by a research team member, independent of the Division of Surgery. The interviews and focus group sessions were conducted in English, recorded and then transcribed by an independent transcriber.

Data analysis

Qualitative data were generated using individual interviews and focus groups. The transcribed interviews were analysed following the six-phase process by Braun and Clarke.[10] The research team worked together to reach a consensus on the codes and themes developed from the data.

 

Results

Nine of the 21 consultants (43%) and 15 of the 29 trainees (52%) agreed to participate in the study. On the day of the SOE, several of them were called to clinical duties, and the final participants comprised 8 consultants and 12 trainees (38% and 41%, respectively). They were divided into two groups. Nine trainees (75%) had been in the 5-year training programme for >3 years.

The following themes were developed: (i) usefulness of videos in examining and teaching; (ii) technical issues; (iii) standardisation; and (iv) preparation of examiners before the examination.

Theme 1: Usefulness of videos in examining and teaching

The examiners and the trainees were positive about using procedural videos as part of the mock oral assessments. Feedback on the advantages, as well as limitations, is summarised (Table 1).

 

 

Theme 2: Technical issues

All the interviewees provided comments on the technical aspects involved in the assessment, with examples (Table 2). It was clear that the online platform with the use of video recordings posed additional administrative and technical challenges. An unexpected finding was the difficulty trainees had in orientating themselves to the anatomy when a video started midway through a procedure or was zoomed in on the operative field.

 

 

Theme 3: Standardisation

Examiners commented more on this topic than the trainees (Table 3). Oral examinations are often perceived as unfair, and there was a general perception that the use of videos increased the standardisation of the assessment process, especially when an assessment rubric was used.

 

 

Theme 4: Preparation of examiners before the examination

The examiners' comments relating to their preparation for the SOE using procedural videos elucidated the need for attention to this aspect. The trainees were not aware of all the 'behind-the-scenes' arrangements, and therefore did not comment on this theme. The examiners' views are highlighted in Table 4.

 

 

Discussion

The assessment of surgical trainees should test multiple competencies and encompass instruments that can assess the range of surgical knowledge and insight. Despite reservations about its lack of reliability and the challenges of standardising examiner practices, the oral examination remains an important assessment method in the surgical postgraduate context.[11] This study explored the experiences and perspectives of surgical trainees and examiners of the video-assisted mock SOE as a tool to assess knowledge and clinical reasoning.

The fellowship examination of the College of Surgeons of SA is a composite of assessments. It consists of written and oral examinations, with clinical cases and an OSCE conducted in face-to-face and virtual formats, including enquiry into procedures and operative technique. This study simulated both examination formats and evaluated the addition of video recordings to an SOE as part of a clinical scenario. This adjunct to clinical assessment is more structured and reproducible, and can assess clinical application using visual prompts. Potential barriers to implementing this additional assessment method (i.e. the use of video recordings) may include access to suitable clinical scenarios and experienced examiners. The impact on the continuation of clinical services during assessments, screen fatigue among users, and network and streaming problems may also limit its use.[7]

In the study, examiners and trainees experienced procedural videos as part of the mock oral assessments as valuable. The trainees further viewed the videos as an opportunity to practise answering procedural questions in preparation for real examination circumstances. Trainees requested to have similar sessions serially throughout their 5 years of training. They found the feedback after the SOE useful to benchmark their development and performance, demonstrating a willingness to take responsibility for their learning.[12]

Technical issues related to connectivity on the online platform and audiovisual disturbances during the video recordings were encountered. However, the benefit of the online platform was that candidates and examiners could log in remotely from different sites, a key feature in our wide geographical training area. There were several issues regarding the quality of the video material. For example, the trainees noted that it was difficult to orientate the anatomy when a recording did not start at the beginning of the operation and/or was zoomed in on the operative field. The importance of creating scientifically accurate videos of high quality as part of video libraries has been argued by other authors in the field[13] and is a significant consideration going forward.

The issue of standardisation of the process and questions posed was raised by all the examiners. They felt that adding consistency to the questions and having a standard assessment rubric might lessen the perception of unfair questioning. Other comments emphasised the importance of having appropriate procedural material and following standard approaches to the oral examination. The importance of thorough examiner training before examinations has been emphasised by Preusche et al.[14] and was similarly noted by our examiners, who felt that training and guidelines would clarify expectations and examination practice.

The small sample size, single execution site and once-off nature of the assessment are some of the limitations of this study. Furthermore, the fact that the examiners are also the trainers and were known personally to the trainees might also have been a hindrance during the assessment. Some of these examiners were also required to assess a procedure they performed infrequently. The main strength of the article is the contribution to the body of evidence that relates to the use of videos during oral assessments. Using videos in oral assessments can support the reliability of such evaluations.

Future explorations should include the experiences of a larger cohort of examiners and trainees, with assessment across multiple national examination centres. Standardisation of video-assisted SOEs for trainees and examiners will require emphasis on preparation and technical aspects.

 

Conclusion

Continuous review and adaptation of education modalities and assessment formats are essential in surgical education. This study provides a promising glimpse into the use and application of procedural videos during SOEs as an assessment tool from the perspective of surgical trainees and examiners.

Efforts should now concentrate on the standardisation of the examination format, optimising technical issues and improving examiner preparation.

Declaration. None.

Acknowledgements. We thank the participants for their time and the patients who allowed their procedures to be recorded.

Author contributions. In preparation of the manuscript, all authors contributed to the design of the research project, but KJB led the process. EA did the interviews. All the authors were part of the data analysis process. In terms of write-up, KJB wrote the introduction and background, EA wrote the methodology and most of the findings and all authors worked on the discussion and conclusion, after KJB completed the first version of the latter two sections. WC and JME contributed to the final preparation of the document. All authors read and approved the manuscript.

Funding. This research was supported by the Faculty of Surgical Trainers and Association for the Study of Medical Education Research Grants (project no. FST/ASME/20/006).

Conflicts of interest. None.

 

References

1. Davis MH, Karunathilake I. The place of the oral examination in today's assessment systems. Med Teach 2005;27(4):294-297. https://doi.org/10.1080/01421590500126437

2. Imran M, Doshi C, Kharadi D. Structured and unstructured viva voce assessment: A double-blind, randomised, comparative evaluation of medical students. Int J Health Sci (Qassim) 2019;13(2):3-9.

3. Bose MM, Gijselaers WH. Why supervisors should promote feedback-seeking behaviour in medical residency. Med Teach 2013;35(11). https://doi.org/10.3109/0142159X.2013.803059

4. Balsam LB. Zoom into the future with the virtual mock oral examination. JTCVS Open 2020;3:138-139. https://doi.org/10.1016/j.xjon.2020.08.007

5. Chaurasia AR, Page BR, Walker AJ, et al. Lessons to learn from a successful virtual mock oral examination pilot experience. Adv Radiat Oncol 2021;6(1):100534. https://doi.org/10.1016/j.adro.2020.07.011

6. Goodman JF, Saini P, Straughan AJ, Badger CD, Thakkar P, Zapanta PE. The virtual mock oral examination: A multi-institutional study of resident and faculty receptiveness. OTO Open 2021;5(1):2473974X21997392. https://doi.org/10.1177/2473974X21997392

7. Zemela MS, Malgor RD, Smith BK, Smeds MR. Feasibility and acceptability of virtual mock oral examinations for senior vascular surgery trainees and implications for the certifying exam. Ann Vasc Surg 2021;76:28-37. https://doi.org/10.1016/j.avsg.2021.03.005

8. Memon MA, Joughin GR, Memon B. Oral assessment and postgraduate medical examinations: Establishing conditions for validity, reliability and fairness. Adv Health Sci Educ 2010;15(2):277-289. https://doi.org/10.1007/s10459-008-9111-9

9. Moore MD, Abelson JS, O'Mahoney P, Bagautdinov I, Yeo H, Watkins AC. Using GoPro to give video-assisted operative feedback for surgery residents: A feasibility and utility assessment. J Surg Educ 2018;75(2):497-502. https://doi.org/10.1016/j.jsurg.2017.07.024

10. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006;3:77-101. https://doi.org/10.1191/1478088706qp063oa

11. Darzi A, Mackay S. Assessment of surgical competence. Qual Saf Health Care 2001;10(Suppl 2):ii64-ii69. https://doi.org/10.1136/qhc.0100064

12. Wang L, Khalaf AT, Lei D, et al. Structured oral examination as an effective assessment tool in lab-based physiology learning sessions. Adv Physiol Educ 2020;44(3):453-458. https://doi.org/10.1152/advan.00059.2020

13. Mota P, Carvalho N, Carvalho-Dias E, João Costa M, Correia-Pinto J, Lima E. Video-based surgical learning: Improving trainee education and preparation for surgery. J Surg Educ 2018;75(3):828-835. https://doi.org/10.1016/j.jsurg.2017.09.027

14. Preusche I, Schmidts M, Wagner-Menghin M. Twelve tips for designing and implementing a structured rater training in OSCEs. Med Teach 2012;34(5):368-372. https://doi.org/10.3109/0142159X.2012.652705

 

 

Correspondence:
K J Baatjes
kbaatjes@sun.ac.za

Accepted 17 May 2023

Creative Commons License: All the content of this journal, except where otherwise identified, is licensed under a Creative Commons License.