
African Journal of Health Professions Education

On-line version ISSN 2078-5127

Afr. J. Health Prof. Educ. (Online) vol.15 no.4 Pretoria Dec. 2023

http://dx.doi.org/10.7196/AJHPE.2023.v15i4.1670 

RESEARCH

 

The use of the Delphi technique as part of the process of developing a tool to evaluate physiotherapy clinical education programmes

 

 

V Naidoo (I); A Stewart (I); D Maleka (II)

(I) PhD. Department of Physiotherapy, School of Therapeutic Sciences, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, South Africa
(II) PhD. Department of Physiotherapy, Sefako Makgatho Health Sciences University, Pretoria, South Africa


 

 


ABSTRACT

Objectively evaluating the quality and effectiveness of a physiotherapy clinical education programme has to date been unsuccessful, owing to the complexity of such programmes and the lack of a standardised tool. We undertook to develop a standardised programme evaluation tool and used the Delphi method to obtain consensus (set at 80%) on the face and content validity of the items and domains of the tool, the scoring system and a name for the tool. Academics, clinical physiotherapists and clinical physiotherapy educators participated in the Delphi rounds. Three Delphi rounds ensued: in round 1, a 71% response rate was obtained and 49 questions reached 80% consensus; in round 2, a 91% response rate was obtained and 59 questions reached 80% consensus; in round 3, a 42% response rate was obtained, several names were suggested and the scoring system was established. The provisional tool of 85 items ultimately emerged as the Vaneshveri Naidoo Clinical Programme Evaluation Tool (VN-CPET).


 

 

World Physiotherapy[1] advocates that physiotherapy undergraduate or entry-level education include training at a university-level higher education institution for a minimum of 4 years, and that programmes be independently validated and accredited, ensuring full legislative and professional recognition. Physiotherapy education is underpinned by a strong clinical education component, where hands-on learning takes place under supervision in a clinical milieu.[2-5]

The physiotherapy undergraduate programme in South Africa (SA) is a 4-year degree programme and successful completion results in a Bachelor's degree. There are 8 universities in SA that offer a physiotherapy degree programme, which is regularly reviewed by peers under the auspices of the Health Professions Council of SA.[6] The clinical phase, with a prescribed minimum of 1 000 hours of clinical practice, involves contextual learning, where students develop, apply, enhance and integrate technical and non-technical skills learnt on real patients/clients in various settings: hospitals (provincial, district, tertiary, quaternary, private and specialised units), primary healthcare clinics, patients' homes and schools (including special needs schools). Clinical education is therefore the unique, pragmatic learning of the undergraduate physiotherapy student within a community of practice.[7] It is central to the curriculum and the value of clinical education is undisputed by all scholars.[3,8-11]

Peer-review evaluation is an important, albeit subjective and incomplete, quality assurance process. The structure and processes of clinical education are not fully reviewed in our current review system. Adding to this gap is the inability to define quality in clinical education.[2,5,12] A systematic review by McCallum et al.[3] shows that the evidence to define quality and best practice in physiotherapy clinical education remains inconclusive. High-quality clinical education requires comprehensive monitoring and evaluation, using a reliable and validated tool.[13]

Given that clinical education is a core component of a physiotherapy undergraduate curriculum enabling the development of a competent professional physiotherapist, it is concerning that there is no tool that evaluates the clinical education component of such a curriculum. It is not known whether the goals, objectives and outcomes of a clinical education programme are being achieved, nor are its limitations known. Therefore, currently, the clinical education component of a physiotherapy undergraduate curriculum cannot be objectively and independently evaluated.

We thus undertook to develop a physiotherapy clinical evaluation tool to evaluate all components of undergraduate physiotherapy clinical education, as identified and prioritised by the research participants. This process involved several consecutive steps over three phases.

In phase 1 of the larger study, 14 focus group discussions (FGDs) were held with all stakeholders involved in physiotherapy clinical education: academics, clinical physiotherapists (including new graduates) and clinical physiotherapy educators (the participants are described in this article). The transcribed FGDs underwent inductive thematic analysis, which yielded three themes (governance; academic structure; and operational structure) and 131 items (the preliminary tool). These items were then used in phase 2 of the study (reported in this article) to determine the face and content validity of the preliminary tool, using the Delphi technique.

The Delphi technique is a method used to obtain consensus from experts on non-formalised knowledge of the profession.[14] Experts are defined as informed individuals who are knowledgeable about the topic under discussion.[14,15] There is no standard way to assess consensus.[10,14] Three Delphi rounds are generally recommended.[14,15] The first Delphi round comprises unstructured, open-ended questions.[15] This round is followed by more specific questions in the second and subsequent rounds, using a 4- or 7-point Likert scale.[14,15] However, variations of this technique have been reported, and are referred to as the modified Delphi technique.[14,15] The Delphi technique enables academic expertise to be combined with practitioners' perspectives and experiences, and was the method used to obtain consensus in this phase of our study.

Therefore, the aim of phase 2 of our study was to establish the face and content validity of the preliminary tool to evaluate a physiotherapy clinical education programme, develop a scoring system and suggest a name for the tool.

 

Method

Purposive sampling was used to recruit key stakeholders involved in physiotherapy education. The participants comprised academics from 7 of the 8 SA universities that offer physiotherapy degree programmes, clinical physiotherapists and physiotherapy clinical educators. All participants had also been involved in phase 1 of the development of the tool, i.e. the FGDs, all of which were facilitated by the first author (VN).

Procedure

Fig. 1 outlines the procedure of the study.

Participants signed informed consent prior to taking part in the study.

The invitation to participate, the information letter and the preliminary tool of 131 questions were emailed and sent via WhatsApp to all participants who had taken part in phase 1 of the study (the FGDs). They were given 3 weeks to respond to each of the Delphi rounds, and had the opportunity to comment on the wording of the items in all the rounds. Three Delphi rounds ensued, using a modified approach.[14,15]

In the first Delphi round, 81 participants were invited to take part. They were requested to assign each question to one of three options: (i) questions that must be included (must include); (ii) questions that could possibly be included (possibly include); and (iii) questions that must be excluded (exclude). Three reminder emails and WhatsApp messages were sent to all participants until the return date was reached. After each Delphi round, the first two authors (VN and AS) reviewed the comments.

In Delphi round 2, the 56 participants who had taken part in round 1 were emailed the second draft of the tool, in which 81 items remained. This draft excluded the questions that had obtained 80% consensus in round 1; only the questions obtaining <80% consensus were carried forward. The participants were requested to make a binary decision on each question; hence, only two options were available: include or exclude.

The aim of Delphi round 3 was to determine a scoring system for each question and to suggest a name for the tool. Sixty-four questions were emailed to all 79 participants, rather than only to those who had completed round 2, to counteract participant fatigue[14,16] and because the information required in this round was not dependent on the first two rounds.

The participants were presented with the questions and the proposed scoring options, and had to indicate whether they agreed or disagreed with each. If they disagreed with the scoring, they were requested to suggest an alternative scoring preference.

Frequencies and percentages were used to analyse the Delphi rounds, with consensus set at 80%.[17,18]
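The consensus rule lends itself to a simple per-item tally. The following Python sketch is illustrative only and is not the authors' analysis code: the item wording, the vote data and the reached_consensus helper are hypothetical, but the 80% threshold and the must include/possibly include/exclude options follow the procedure described above.

```python
from collections import Counter

CONSENSUS_THRESHOLD = 0.80  # consensus level set a priori in this study


def reached_consensus(votes, option="must include"):
    """True if at least 80% of respondents chose the given option."""
    return Counter(votes)[option] / len(votes) >= CONSENSUS_THRESHOLD


# Hypothetical round 1 responses for two items (five respondents shown
# for brevity; the real round had 56 respondents and 131 items).
round1_votes = {
    "Q1 (a governance policy exists)": ["must include"] * 5,
    "Q2 (students attend home visits)": ["must include"] * 3
    + ["possibly include", "exclude"],
}

retained = [q for q, v in round1_votes.items() if reached_consensus(v)]
carried_to_round2 = [q for q in round1_votes if q not in retained]

print("Kept after round 1:", retained)            # Q1: 5/5 = 100%
print("Re-rated in round 2:", carried_to_round2)  # Q2: 3/5 = 60%
```

Under this rule, items reaching the threshold are retained as they stand, while the remainder are re-rated under the binary include/exclude options of round 2, mirroring the second draft described above.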

Ethical approval

Ethical approval for the study was obtained from the Human Research Ethics Committee, University of the Witwatersrand, Johannesburg (ref. no. M210160).

 

Results

This study established the face and content validity of the preliminary tool (whose items were generated by the FGDs in phase 1 of our study), as well as a scoring method and a name for the tool.

Delphi rounds

Two email addresses were invalid and therefore 79 invitations were distributed in round 1. Fifty-six participants (participation rate 71%) returned the questionnaire; 49 questions obtained 80% consensus (Table 1) and remained in the tool: 8 questions in section 1 (governance), 23 questions in section 2A (academic structure) and 18 questions in section 2B (operational structure).

Table 1 also outlines the items that obtained 80% consensus in Delphi round 2. Fifty-one participants (participation rate 91%) returned the questionnaire, and 59 questions obtained 80% consensus in this round.

Thirty-three participants (participation rate 42%) responded in Delphi round 3 (Table 1). Appendix 1 (https://www.samedical.org/file/2105) is an example of the suggested scoring for the different sections (governance; academic structure; operational structure) of the tool circulated in Delphi round 3. The feedback received from the participants was reviewed and is summarised in Appendix 2 (https://www.samedical.org/file/2105). The scoring options were then reviewed and modified by the authors where required, as per participant feedback. Note that the scoring options do not provide an answer to each question; rather, they impose a self-assessed judgement on the answer provided by the person completing the tool.

The completion of the Delphi rounds resulted in 85 questions (Fig. 1), a name for the tool (Vaneshveri Naidoo Clinical Programme Evaluation Tool (VN-CPET)) and a scoring system, which is a self-evaluative system (Appendix 1: https://www.samedical.org/file/2105).

 

Discussion

The Delphi methodology is well known in health research and is widely used to obtain input or consensus from groups of experts where there is either doubt or a dearth of evidence.[14,15,17,19] We used the Delphi process to obtain consensus on items to be included in our programme evaluation tool, as it allowed us to cover a substantial group of participants from various geographical locations.[14] Furthermore, it enabled participant anonymity; iteration with controlled, structured feedback; expert input; and statistical aggregation of group responses.[14,17] The most common definition of consensus is percent agreement, with 75% agreement typically used.[17] Trevelyan and Robinson[14] assert that consensus must not be confused with agreement or stability of response: consensus measures the extent to which participants agree with one another, agreement measures the extent to which participants agree with the statement under consideration, and stability measures internal reliability.[14] Three Delphi rounds have been recommended to prevent participant fatigue, with an a priori consensus criterion.[14,18] Our consensus level was set a priori at 80%[19] over three rounds. Participant fatigue was noticed in round 3, with a response rate of 42% (33/79), compared with 71% (56/79) and 91% (51/56) in rounds 1 and 2, respectively. Throughout the data collection process in this phase of the study, the tool was scanned for redundancy and clarity: redundant questions were removed, questions were rephrased and a few questions were split into separate questions, as per participants' feedback in the Delphi rounds.
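This three-way distinction can be made concrete with a small numerical sketch. The ratings below are hypothetical and use a 4-point Likert scale purely for illustration (our rounds used include/exclude options); the three operationalisations shown are one common way of computing the measures that Trevelyan and Robinson[14] distinguish.

```python
from statistics import mode

# Hypothetical 4-point Likert ratings (1 = strongly disagree ... 4 = strongly
# agree) for one item, from the same ten participants in two rounds.
round_a = [4, 4, 4, 3, 4, 4, 2, 4, 4, 4]
round_b = [4, 4, 4, 4, 4, 4, 2, 4, 4, 4]

# Consensus: the extent to which participants agree with one another
# (here, the share choosing the modal rating).
consensus = round_a.count(mode(round_a)) / len(round_a)  # 0.80

# Agreement: the extent to which participants agree with the statement
# itself (here, the share rating it 3 or 4).
agreement = sum(r >= 3 for r in round_a) / len(round_a)  # 0.90

# Stability: the consistency of individual responses across rounds.
stability = sum(a == b for a, b in zip(round_a, round_b)) / len(round_a)  # 0.90

print(f"consensus={consensus:.0%} agreement={agreement:.0%} stability={stability:.0%}")
```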

The Delphi process permitted key stakeholders involved in physiotherapy clinical education to identify the components required to govern a clinical education programme (governance), i.e. the legislative framework of policies and procedures that guide the programme.[12,20] Academic processes refer to the teaching and learning strategies implemented by the academic institution to ensure clinical readiness; they also incorporate quality assurance measures to ensure high-quality clinical education and the overall monitoring and evaluation of the programme. Operational structure refers to the processes (e.g. clinical orientation; clinical supervision) that occur at the clinical site to facilitate the clinical teaching and learning of students.

The complexity of a physiotherapy clinical education programme is reflected in the wide range of influences and important elements of clinical education that this tool has captured. The strengths and weaknesses of a physiotherapy clinical education programme can now be easily and objectively identified, using this comprehensive, standardised tool (VN-CPET). The rigorous mixed-methods process, and the Delphi method in particular, was key to identifying the pertinent components of the tool.

 

Conclusion

One hundred and thirty-one items were reduced to 85 items following consensus using the Delphi methodology. The tool was named the VN-CPET, and a self-evaluative scoring system unfolded through the Delphi process. The tool captures a wide range of influences that contribute to the complexity of a physiotherapy clinical education programme, which can now be objectively evaluated using a standardised tool (VN-CPET).

Key messages

What is already known on this topic

Scholars have struggled to measure the effectiveness and quality of a physiotherapy clinical education programme due to the diversity and complexity of such a programme and the paucity of research in this area. Suggestions were made regarding what constitutes quality in clinical education; however, no consensus was reached. Therefore, the quality and effectiveness of a physiotherapy clinical education programme could not be evaluated.

What this study adds

Our study has developed a validated and standardised monitoring, evaluation and quality assurance tool that can be used to evaluate the length, breadth and depth, i.e. the multidimensional nature, of a physiotherapy clinical education programme. Furthermore, it aligns with the CIPP framework (context, input, process and product),[21] which appears to be the gold-standard evaluative framework for clinical education. It can also be used to determine the local responsiveness and international competitiveness of a clinical education programme; with some modification, it can be used to benchmark against international physiotherapy clinical education programmes and to evaluate other health science courses.

Study limitations

There is subjectivity in the scoring component of the tool. A high attrition rate was noted in Delphi round 3. The context of our study is limited to physiotherapy in SA.

Declaration. None.

Acknowledgements. Sincere thanks to the study participants and the funders of this study.

Author contributions. All authors contributed equally to the conceptualisation and writing of the manuscript.

Funding. The National Research Foundation (NRF), SA Society of Physiotherapy (SASP) and Wits Faculty Research Committee (FRC).

Conflicts of interest. None.

 

References

1. World Physiotherapy. Education Policy Statement, 2020. https://world.physio/ (accessed 3 June 2020).

2. Jette DU, Nelson L, Palaima L, Wetherbee E. How do we improve quality in clinical education? Examination of structures, processes, and outcomes. J Phys Ther Educ 2014;28:6-12. https://doi.org/10.1097/00001416-201400001-00004

3. McCallum CA, Mosher PD, Jacobson PJ, Gallivan SP, Giuffre SM. Quality in physical therapist clinical education: A systematic review. Phys Ther 2013;93(10):1298-1311. https://doi.org/10.2522/ptj.20120410

4. Meyer IS, Louw A, Ernstzen D. Physiotherapy students' perceptions of the dual role of the clinical educator as mentor and assessor: Influence on the teaching-learning relationship. S Afr J Physiother 2017;73(1):1-9. https://doi.org/10.4102/sajp.v73i1.349

5. Strohschein J, Hagler P, May L. Assessing the need for change in clinical education practices. Phys Ther 2002;82(2):160-172. https://doi.org/10.1093/ptj/82.2.160

6. Health Professions Council of South Africa. Professional boards: Physiotherapy, podiatry and biokinetics. https://www.hpcsa.co.za/ (accessed 13 August 2020).

7. Wenger E. Communities of Practice and Social Learning Systems: The Career of a Concept in Social Learning Systems and Communities of Practice. London: Springer, 2010:179-198.

8. Baldry Currens JA, Bithell CP. Clinical education: Listening to different perspectives. Physiotherapy 2000;86(12):645-653. https://doi.org/10.1016/S0031-9406(05)61302-8

9. Chetty V, Maddocks S, Cobbing S, et al. Physiotherapy clinical education at a South African university. Afr J Health Professions Educ 2018;10(1):13. https://doi.org/10.7196/ajhpe.2018.v10i1.987

10. Delany C, Bragge P. A study of physiotherapy students' and clinical educators' perceptions of learning and teaching. Med Teach 2009;31(9):e402-e411. https://doi.org/10.1080/01421590902832970

11. Higgs J. Managing clinical education: The programme. Physiotherapy 1993;79(4):239-246. https://doi.org/10.1016/S0031-9406(10)60705-5

12. Stachura K, Garven F, Reed M. Quality assurance: Measuring the quality of clinical education provision. Physiotherapy 2000;86(3):117-126. https://doi.org/10.1016/S0031-9406(05)61154-6

13. Eseryel D. Approaches to evaluation and training: Theory and practice. J Educ Technol Soc 2002;5(2):93-98.

14. Trevelyan EG, Robinson N. Delphi methodology in health research: How to do it? Eur J Integr Med 2015;7(4):423-428. https://doi.org/10.1016/j.eujim.2015.07.002

15. Powell C. The Delphi technique: Myths and realities. J Adv Nurs 2003;41(4):376-382. https://doi.org/10.1046/j.1365-2648.2003.02537.x

16. Hsu C, Sandford B. The Delphi technique: Making sense of consensus. Pract Assess Res Eval 2007;12(10):1-7. https://doi.org/10.7275/pdz9-th90

17. Diamond IR, Grant RC, Feldman BM, et al. Defining consensus: A systematic review recommends methodologic criteria for reporting of Delphi studies. J Clin Epidemiol 2014;67(4):401-409. https://doi.org/10.1016/j.jclinepi.2013.12.002

18. Maleka DM, Stewart AV, Hale LA. The development of a community reintegration outcome measure to assess people with stroke living in low socioeconomic areas. J Disabil Rehabil 2017;3:11-24. https://doi.org/10.5348/D05-2017-26-OA-2

19. Okoli C, Pawlowski SD. The Delphi method as a research tool: An example, design considerations and applications. Inform Manage 2004;42(1):15-29. https://doi.org/10.1016/j.im.2003.11.002

20. Bigdeli M, Rouffy B, Downs Lane B, et al. Health systems governance: The missing links. BMJ Global Health 2020;5(8):1-5. https://doi.org/10.1136/bmjgh-2020-002533

21. Stufflebeam DL. International Handbook of Educational Evaluation. Dordrecht: Kluwer Academic Publishers, 2003:31-62.

 

 

Correspondence:
V Naidoo
vaneshveri.naidoo@wits.ac.za

Accepted 19 July 2023
