
African Journal of Health Professions Education

On-line version ISSN 2078-5127

Afr. J. Health Prof. Educ. (Online) vol.14 n.3 Pretoria Sep. 2022

http://dx.doi.org/10.7196/AJHPE.2022.v14i3.1530 

RESEARCH

 

From implementation to revising simulation integration into undergraduate physiotherapy training

 

 

A van der MerweI; R Y BarnesI; M J LabuschagneII

IMSc (Physiotherapy), PhD (Physiotherapy); Department of Physiotherapy, Faculty of Health Sciences, University of the Free State, Bloemfontein, South Africa
IIMB ChB, MMed (Ophthalmology), PhD (HPE) Clinical Skills Unit, Faculty of Health Sciences, University of the Free State, Bloemfontein, South Africa


 

 


ABSTRACT

BACKGROUND. Careful consideration of an increasingly underprepared tertiary student population, the limited use of simulation in South African (SA) healthcare education and a changing healthcare education milieu is required from SA educators when implementing, evaluating and revising simulation integration.
OBJECTIVES. To develop a conceptual framework for the integration of simulation in the SA undergraduate physiotherapy programme.
METHODS. A non-experimental descriptive research design was used. A purposive sample of 15 healthcare educationalists from SA and abroad were approached to participate in a modified Delphi survey, informed by the results obtained from a systematic review identifying simulation integration framework elements. Data were analysed as percentages, with feedback provided to panel members following each round.
RESULTS. Data saturation was achieved after round 3, with a response rate of 73.3% (n=11). The main findings suggested that student preparation prior to simulation-based learning experiences (SBLEs) should include orientation to SBLE logistics and expectations (73%), and could include informal assessment of theory (64%). Inclusion of the feedback/debriefing process (82%), methods (100%) and timing (85%) as part of student and educator preparation were also deemed essential. Panel members agreed that programme evaluation in line with stakeholder feedback (92%) is vital for guiding adjustments to the programme that is integrating simulation.
CONCLUSION. The developed conceptual framework indicates the necessity of student and educator preparation to ensure optimal SBLE participation and outcome achievement. Programme sustainability should be ensured through programme evaluation and adjustment, in line with stakeholder feedback, best practice and accrediting professional body guidelines.


 

 

The current secondary education system in South Africa (SA) seems to fall short in preparing students adequately for tertiary education, resulting in students who struggle with basic language and comprehension skills.[1,2] Matters are further complicated by healthcare students exhibiting a variety of learning qualities, preferences and styles,[3] requiring a multimodal approach to teaching and learning. The value of simulation as a teaching strategy lies in the ability of the method to address the various learning needs and experience levels of the current student population, making learning a contextualised and interactive process that provides hands-on, student-centred education in a more realistic environment.[4,5]

If simulation is to be optimally implemented, educator competence is vital. Competence refers to the educator's ability to perform his or her role in the selection and use of simulation modalities[6] in line with best practice, proficiency in the art of debriefing[5,7] and constructive feedback.[8] As simulation-based education (SBE) is still evolving in sub-Saharan African healthcare education, SA healthcare educators are expected to fulfil the majority of, if not all, roles related to the implementation of simulation-based learning experiences (SBLEs), owing to the lack of human resources.[9,10] Additionally, SA healthcare educators view the lack of trained educators as a barrier to the integration of simulation,[10] which includes limited training in debriefing methods. In accordance with best practice, constructive feedback and debriefing following engagement in an SBLE require a trained facilitator, who is able to ensure optimal implementation of the educational method and, most importantly, the facilitation of learning.[5,9] In the SA healthcare education setting, where the use of simulation is still in its infancy,[10] educator and facilitator competency with regard to the implementation of simulation should be ensured, and this requires attention.

Taking into account the state of the current secondary schooling system and its apparent failure to produce independently thinking and reasoning school leavers,[11] coupled with increased access to tertiary education,[1] the reality is that not all undergraduate students might be prepared for training in a simulated environment. Cognitive load theory, which is based on cognitive learning theory, guides simulation integration by aiming to structure the volume, complexity and design of an SBLE according to students' experience level, thereby preventing cognitive overload, which is not conducive to learning.[7] The detrimental effects of students being underprepared for the simulated environment have been reported by Welman and Spies,[12] and therefore facilitators are expected to prepare students prior to participation in an SBLE for optimal learning.

The Delphi survey in the present study aimed to obtain expert consensus regarding elements to be included in a conceptual framework for the integration of simulation in the SA undergraduate physiotherapy programme. Secondly, the study aimed to develop the conceptual framework. The planning theme of the framework has been published previously.[13] For the purpose of this article, the implementation, evaluation and revision themes are explored in detail.

 

Methods

Design

A descriptive research design using a modified Delphi survey was used. Statements obtained from a preceding systematic review were included in the questionnaire, to which experts were required to respond on a three-point Likert scale with options 'essential', 'useful' and 'not applicable'.

Sampling and participants

Fifteen national and international healthcare educationalists in physiotherapy and/or other healthcare fields, as well as healthcare simulation experts, were purposively sampled. In order to provide a more contextualised view of SA's unique environment and educational challenges, the majority of panel members were South Africans.

Data collection

Panel members were informed of the study aim and procedure. To increase content validity, a document explaining the SA undergraduate physiotherapy context was provided to panel members. Panel members were informed that data would be kept confidential and that they would remain anonymous to one another. Informed consent was obtained from all panel members prior to participation.

The closed-ended statements posed to panel members contained all the elements included in published frameworks focusing on curricular simulation integration that were identified during a systematic review performed by the principal researcher (AvdM). A modified Delphi survey was therefore used, as the closed-ended statements informing the survey were presented to the panel to indicate the value of the inclusion of each element in the conceptual framework.

An online survey research tool was used to distribute the Delphi survey, with a 2-week completion deadline per round. A continuous iteration process throughout the survey aimed to achieve shared understanding on the topic, illustrated as consensus reached on the inclusion or exclusion of each element presented in the statements. Subsequent survey rounds included statements failing to achieve 70% consensus, and incorporated panel members' comments.[14] Statements achieving consensus were removed from subsequent rounds. The Delphi survey continued until either consensus was achieved regarding the inclusion or exclusion of each statement, or data saturation was achieved. Data were analysed by the researcher (AvdM), after which an authors' consensus meeting was held to limit bias[14] and to ensure that all comments and suggestions were accurately incorporated during the subsequent rounds.

Data analysis

Consensus was defined as 70% or more of panel members agreeing on the inclusion or exclusion of a statement.[14,15] Stability was declared when individual panel member selections remained similar across survey rounds, with suggestions provided for the specific statement not resulting in further content or contextual changes, additions or omissions.[15] Either a convergence of panel members' opinions or individual response stability per statement was defined as data saturation being achieved.
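The consensus rule described above can be sketched as a short script. This is an illustrative reconstruction, not the authors' actual analysis code: the `round_outcome` function and the grouping of 'essential' and 'useful' responses as votes for inclusion are assumptions made for the sketch.

```python
# Hypothetical sketch of the Delphi consensus rule: a statement reaches
# consensus when >= 70% of panel members agree on its inclusion or exclusion.
# Treating 'essential' + 'useful' as votes for inclusion is an assumption.
from collections import Counter

CONSENSUS_THRESHOLD = 0.70

def round_outcome(responses):
    """Classify one statement's responses for a single Delphi round.

    responses: list of 'essential' | 'useful' | 'not applicable'
    Returns 'include', 'exclude' or 'no consensus' (carried to next round).
    """
    n = len(responses)
    counts = Counter(responses)
    include_share = (counts['essential'] + counts['useful']) / n
    exclude_share = counts['not applicable'] / n
    if include_share >= CONSENSUS_THRESHOLD:
        return 'include'
    if exclude_share >= CONSENSUS_THRESHOLD:
        return 'exclude'
    return 'no consensus'
```

With the final panel of 11 members, for example, 8 of 11 responses in agreement (72.7%) clears the 70% threshold, while 7 of 11 (63.6%) does not and the statement would be carried forward to the next round.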

Pilot study

One healthcare educationalist experienced in both SBE and the Delphi process was included in the pilot study, with only minor grammatical changes to the survey required.

Ethical approval

Following approval from the Health Sciences Research Ethics Committee of the Faculty of Health Sciences at the University of the Free State (ref. no. HSREC 108/2017), the first survey round was developed.

 

Results

Data saturation was declared after survey round 3. Four panel members dropped out during the Delphi survey, yielding an overall response rate of 73.3% (n=11). One panel member dropped out during round 1, two during round 2 and one during round 3. Reasons for dropout were not explored. In the final survey round, 63.6% (n=7) of panel members were South African, and 36.4% (n=4) international experts.

Data per statement were analysed as percentages to assess whether consensus had been achieved. Panel members provided limited justification regarding selected options or opinions related to statements, preventing content analysis of comments. Feedback to participants therefore included only the summary of statements achieving consensus, as the provision of statistical results with no supporting information could have yielded less accurate results.[15]

The following four themes, with supporting sections, emerged from the data: planning (n=12); implementation (n=3); evaluation (n=2); and revision (n=1). For the purposes of this article, statements achieving both consensus (Table 1) and stability (Table 2) relating to the implementation, evaluation and revision themes have been explored.

According to the panel members, students should be orientated regarding SBLE expectations and logistics (73%) and the feedback/debriefing process (82%) before participating in the SBLE. Statements relating to student participation in informal formative assessments prior to only technical skills-based SBLEs (75%) were viewed as not applicable, with theoretical testing prior to educator-identified SBLEs (64%) or pre- and post-SBLE participation (45.5%) remaining in dissensus. It was only deemed useful for students to set individual goals applicable to each SBLE (83%). In contrast, no consensus was reached on the statement that students should be encouraged to revisit the individual learning outcomes and individual goals for each SBLE (64%). Panel members were of the opinion that identification of both the debriefing method (100%) and timing (85%) was essential prior to the SBLE, with debriefing being managed by a trained facilitator (85%).

Panel members agreed that programme evaluation should guide programme revision (92%), in line with feedback from both educators (100%) and students (100%), although the inclusion of informal evaluations (64%) and formative assessments (45.5%) for evaluation purposes did not reach consensus. Panel members viewed the alignment of the programme with national regulatory professional body requirements (77%), expert consensus and current literature (77%) as tools to validate the programme integrating simulation.

The data collected during the Delphi survey enabled the development of a conceptual framework for the integration of simulation in the SA undergraduate physiotherapy programme (Fig. 1).

 

Discussion

Student preparation through briefing and theoretical preparation is essential to enhance meaningful learning during an SBLE.[4] Briefing, from the authors' perspective, encompasses the preparation of all parties - students, facilitators, educators and support staff - involved in the planning and organisation of the SBLE.

As SA students enrolled in healthcare programmes originate from diverse backgrounds, and considering the limited SBLE integration in national healthcare education, orientation to the environment, learning objectives and outcomes, technology and learning methods is essential. Additionally, students might not be accustomed to the process of debriefing, or might not know how to receive and process constructive feedback. Student briefing at the beginning of the module and prior to the SBLE aims to guide the achievement of learning objectives.[6] Briefing further aims to establish a safe learning environment based on trust, integrity and respect,[5] and is therefore essential for preparing students for SBLEs. Prepared students will be comfortable in the simulated environment,[6] which in turn will enable more authentic engagement with the simulated scenario, and optimise learning.[5,16] The authors support the Delphi panel members' opinions regarding the inclusion of two student briefing sessions. One general briefing session should occur at the beginning of the module, and may be in written or recorded format[8] to allow for standardisation, save educator time and allow for follow-up clarification by students; the second, specific briefing should be provided face to face prior to each SBLE.[8]

 


The provision of basic theoretical knowledge required for participation in the SBLE was regarded as essential by some panel members, but the statement did not achieve consensus. It is essential for students to be prepared prior to engaging in an SBLE, which should include theoretical preparation to ensure that participants are equipped with basic knowledge and skills to achieve SBLE objectives[5,8] and to prevent cognitive overload. One panel member stated that SBLEs enable facilitators to identify theoretical needs, which could explain why the posed statement achieved only stability, as theory can be built upon or revisited throughout a module and programme, according to the needs that are identified.

All but one statement relating to the need for assessments before or after students participate in SBLEs remained in dissensus. A systematic review conducted by the principal researcher (AvdM) revealed that only frameworks relating to practical skills-based training included a mandatory pre- and/or post-test section,[16,17] which might be due to practical and procedural skills receiving more attention in the literature than nontechnical skills assessment.[18] Panel members were of the opinion that completing preparatory, informal formative assessments was required prior to practical skills-based SBLEs, but varied opinions were elicited as to when and for which SBLEs these assessments should be used. Informal knowledge testing prior to or after SBLEs is not meant to be interpreted as formal formative or summative assessment, but rather as part of the student preparation process. Even following concept clarification and rephrasing during the Delphi survey, this might have been unclear to some panel members.

In the authors' opinion, the inclusion of informal preparatory assessments would be best left to the educator's discretion, as pre-testing is only one method of encouraging student preparation.[5,8] A variety of other methods of student preparation, such as pre-reading texts and audiovisual material, are available, which could have contributed to statements not achieving consensus.

The majority of panel members were of the opinion that it could be useful to encourage students to set their own individual goals for the SBLEs prior to engaging in them. According to Arnett's[19] definition, most SA undergraduate healthcare students can be categorised as emerging adults who are still developing the skills and traits of adult learning, and it is not guaranteed that they will mature into adult learners, as demonstrated in a study on postgraduate nursing students.[3] As SA undergraduate students could be classified as emerging adult learners, requiring them to reflect on what they do not know might present challenges. The authors are of the opinion that the inclusion of individual goal setting might not be applicable to the conceptual framework for simulation integration, as some students are still developing their self-regulation skills. However, considering that self-regulation is an expected graduate attribute, students should be encouraged to individually revisit the learning objectives of each SBLE, thereby encouraging self-reflection and taking responsibility for their own learning.

Effective debriefing is viewed as the most important component of learning after an SBLE,[5,9,20] especially after immersive SBLEs, therefore requiring the use of formally trained[20] facilitators during the debriefing session. Panel members, in agreement with published literature, noted the necessity of student debriefing being led by a facilitator trained in the art. The facilitator fulfils the important role of facilitating the student in identifying inconsistencies between practical or skills experience and theoretical understanding, thereby promoting the conceptualisation of adjustments required for future implementation.

Unfortunately, emerging adult learners generally struggle with critical thinking skills,[21] leaving facilitators to assist students to develop such skills by guiding them in a positive reflection exercise, as reflection encourages critical thinking.[22] The diversity of the SA student population challenges the debriefing process. Not only does the facilitator have to be skilled in the process of debriefing, but also must have the ability to engage with a variety of students in a constructive manner to create a safe space for discussion and exploration. Most SA healthcare educators are still novices in the field of debriefing and simulation. Therefore, accredited courses for the development of debriefing skills are advised to ensure a standardised approach to debriefing. The debriefing process should explore students' experiences and thought processes, and facilitators should be encouraged to avoid regressing to a didactic lecturing approach.

Not only is facilitator training in the art of debriefing required, but training and preparation for the debriefing method and tools to be used when facilitating an SBLE debrief were confirmed as essential by panel members and the literature.[9,20] Simulation design best practices acknowledge that a debriefing method and timing should be identified during the SBLE design phase - a phase that might not occur during curriculum planning, but only later when individual SBLEs are designed.[8,23] The SBLE team composition must also be identified prior to the learning experience, as it has a direct effect on the type of debriefing procedure employed.

Debriefing may be done at three levels, namely individual, team or class level, and careful attention should be paid to ensure that the debriefing method aligns with the purpose and intended objectives of the SBLE.[23] Taking into account the time challenges in an already full programme, consideration should be given to incorporating team debriefing, where appropriate. However, some learning may be lost when debriefing in larger groups, and this should be acknowledged. The authors therefore suggest customising the level of debriefing for each SBLE.

Results obtained from the Delphi survey were similar to findings of previously published studies, and highlight the importance of programme evaluation[5,16] to prove the relevance and effect of simulation-based programmes, as required by administrators and funders. Unger and Hanekom[24] advocate for programme evaluation to safeguard the responsible use of resources, and evaluated the impact of a renewed curriculum on the skills and attributes displayed by graduating students. The present authors support the inclusion of programme evaluation strategies, and agree with the recommendation made by the Association for Simulated Practice in Healthcare that an SBE expert at the training institution oversee the design of a programme that integrates simulation. In light of the shortage of SA healthcare educators, it is proposed that interdepartmental collaboration among SBE experts be encouraged, to assist the development and evaluation of simulation integration and drive integration in the respective departments.

According to panel members, student, educator and facilitator feedback regarding satisfaction and perceived learning gained from the programme is essential, and aims to ensure the sustainability of such a programme.[7,16] Supporting evidence of the benefits of introducing simulation in a programme, based on improved results compared with traditional education strategy results, is required to advocate for programme continuation despite the cost involved. Delphi panel members regarded using informal evaluation tools (e.g. satisfaction surveys, verbal feedback or written reflection) for programme evaluation only as useful, with no consensus reached. However, when selecting programme evaluation methods, educators are advised that the choice of method should be based on the information required by stakeholders to motivate for continued funding. Therefore, a variety of evaluation formats may be used during the programme evaluation process. Quantitative methods (e.g. questionnaires and surveys) may be used to evaluate cost-effectiveness and satisfaction with the programme, whereas qualitative data obtained by means of personnel and student interviews could provide information on the perceived clinical relevance of the programme.

Delphi panel members considered the use of summative assessment results for programme evaluation to be useful, while statements on formative assessment only reached stability by the final survey round. It should be considered that several factors, including clinical training, experience and theoretical teaching, have an impact on a student's performance in a summative assessment. Assessments of a formative nature would not provide sufficient information regarding the attainment of learning outcomes, as the learning process is continuous and develops towards the desired outcomes, whereas summative assessments provide a specific measurement of outcome achievement. It has been stated that student assessments indicating learning outcome attainment are required for programme evaluation,[5] although the extent to which SBLEs have been integrated into a programme might influence the chosen evaluation tools.

Owing to the varied use of SBLEs in healthcare education, the authors view the design of programme evaluation methods incorporating feedback from all involved stakeholders, and possibly summative assessments, as essential during the planning phase to ensure comprehensive programme evaluation.

Owing to the relatively sparse governance of SBE by an overarching organisation,[4] with none currently reported in SA,[10] it is deemed essential by both the present authors and panel members to consult national regulating professional body requirements to validate the implemented programme.[17] Requirements, as expected by the Health Professions Council of SA and the South African Qualifications Authority, should inform simulation practice standards to maximise the potential and acceptance of simulation use in healthcare education. Panel members were in agreement with the available literature,[3,4,17] considering it imperative to also incorporate best practice guidelines, current literature and expert consensus when aiming to validate a programme following simulation integration.

Feedback regarding stakeholder satisfaction with SBLEs is integral to identify programme limitations.[8] To ensure stakeholder satisfaction and educational improvement, programme adjustments should be made according to feedback from all stakeholders involved.[4,7]

New technologies and best practice guidelines are essential when adjusting a programme that integrates simulation. As technology continues to develop exponentially, healthcare simulation communities are required to adapt their processes[17] and training to ensure safe, high-quality education. These improvements may relate to less expensive equipment being produced, or improved availability of equipment that is more suited to the educator's individual requirements.

Description of the conceptual framework development

As illustrated in Fig. 1, the conceptual framework depicts three themes that influence one another, with solid-line arrows in the figure indicating the cause-and-effect relationships[25] between themes. Elements encased in solid boxes depict the themes in which those elements are most suited.[25] The fourth theme, revision, is encased by dashed lines, with dashed lines linking revision to all themes, as potential adjustments[26] may be required and applicable to any theme or individual element; the theme of revision therefore feeds into all sections. A back-and-forth movement between evaluation and revision is represented by means of two-directional arrows, as adjustments brought about by evaluation would require re-evaluation to ensure a more satisfactory programme outcome.

 

Conclusion

Considering an underprepared SA student population and sparse SBLE integration, preparing students for the simulated environment, expectations and feedback/debriefing processes is essential to ensure learning outcome achievement. The exponential growth in simulation-based research and technologies requires integrated SBLEs to adhere to current best practice that is applicable to the context in which the programmes are presented, and to address stakeholder feedback. Considering limited resources at SA tertiary institutions, SBE experts should aim to adjust programmes to ensure that learning outcomes are achieved with the least cost incurred. To ensure its practicality and credibility for the SA context, the conceptual framework was subjected to a validation meeting that will be presented in a separate article.

Declaration. This article is based on research conducted by AvdM as part of a PhD degree in physiotherapy.

Acknowledgements. Dr Daleen Struwig, medical writer/editor, Faculty of Health Sciences, University of the Free State, for technical and editorial preparation of the manuscript.

Author contributions. All authors contributed to the article. AvdM developed the protocol and collected the data for the larger study from which this research emanated. AvdM wrote the first draft of the manuscript; RB and ML contributed to the interpretation of the data and writing of the article; AvdM made the final editorial adjustments to the manuscript. All the authors approved the final version of the article.

Funding. National Research Foundation (grant no. TTK180418322303); Health and Welfare Sector Education and Training Authority.

Conflicts of interest. None.

 

References

1. Jansen J. The future prospects of South African universities. Policy and funding options. Viewpoints No. 1, 2018. https://www.cde.org.za/wp-content/uploads/2018/06/Viewpoints-The-future-prospects-of-South-African-Universities-Jonathan-Jansen.pdf (accessed 25 February 2021).         [ Links ]

2. Lange L. 20 Years of higher education curriculum policy in South Africa. J Educ 2017(68):31-58. https://doi.org/10.17159/2520-9868/i68a01        [ Links ]

3. Spies C, Seale I, Botma Y. Adult learning: What nurse educators need to know about mature students. Curationis 2015;38(2):1494. https://doi.org/10.4102/curationis.v38i2.1494        [ Links ]

4. Nestel D, Gough S. Designing simulation-based learning activities: A systematic approach. In: Nestel DM, Kelly B, Jolly B, Watson M, editors. Healthcare Simulation Education: Evidence, Theory and Practice. Hoboken: Wiley Blackwell, 2018:135-142.         [ Links ]

5. Association for Simulated Practice in Healthcare. Simulation-based education in healthcare: ASPiH standards framework and guidance. ASPiH, 2016. https://aspih.org.uk/standards-framework-for-sbe/ (accessed 25 February 2021).         [ Links ]

6. Jeffries PR, Rogers B, Adamson K. NLN Jeffries Simulation Theory: Brief narrative description. Nurs Educ Perspect 2015;36(5):292-293. https://doi.org/10.5480/1536-5026-36.5.292        [ Links ]

7. Motola I, Devine LA, Chung HS, Sullivan JE, Issenberg SB. Simulation in healthcare education: A best evidence practical guide. AMEE Guide No. 82. Med Teach 2013;35(10):e1511-e1530. https://doi.org/10.3109/0142159X.2013.818632        [ Links ]

8. INACSL Standards Committee. INACSL Standards of best practice: Simulation. Simulation design. Clin Simul Nurs 2016;12(Suppl):S5-S12. https://doi.org/10.1016/j.ecns.2016.09.005        [ Links ]

9. INACSL Standards Committee. INACSL Standards of best practice: Simulation. Debriefing. Clin Simul Nurs 2016;12(Suppl):S21-S25. https://doi.org/10.1016/j.ecns.2016.09.008        [ Links ]

10. Swart R, Duys R, Hauser ND. SASS: South African Simulation Survey - a review of simulation-based education. South Afr J Anaesth Analg 2019;25(4):12-20. http://www.sajaa.co.za/index.php/sajaa/article/view/2191 (accessed 25 February 2021).         [ Links ]

11. Centre for Educational Testing for Access and Placement. The national benchmark tests national report: 2018 intake cycle. CETAP, 2018. https://nbt.ac.za/sites/default/files/NBTPReport_2018.pdf (accessed 25 February 2021).         [ Links ]

12. Welman A, Spies C. High fidelity simulation in nursing education: Considerations for meaningful learning. Trends Nurs 2016;3(1):1. https://doi.org/10.14804/3-1-42        [ Links ]

13. Van der Merwe A, Barnes RY, Labuschagne MJ. How to plan for simulation integration into undergraduate physiotherapy training. Afr J Health Prof Educ 2022;14(2):61-65. https://doi.org/10.7196/AJHPE.2022.v14i2.1446        [ Links ]

14. Avella JR. Delphi panels: Research design, procedures, advantages, and challenges. Int J Doct Stud 2016;11:305-321. https://doi.org/10.28945/3561        [ Links ]

15. Slade SC, Dionne CE, Underwood M, et al. Consensus on Exercise Reporting Template (CERT): Modified Delphi study. Phys Ther 2016;96(10):1514-1524. https://doi.org/10.2522/ptj.20150668        [ Links ]

16. Zevin B, Levy JS, Satava RM, Grantcharov TP. A consensus-based framework for design, validation, and implementation of simulation. J Am Coll Surgeons 2012;215(4):580-586. https://doi.org/10.1016/j.jamcollsurg.2012.05.035        [ Links ]

17. Khamis NN, Satava RM, Alnassar SA, Kern DE. A stepwise model for simulation-based curriculum development for clinical skills, a modification of the six-step approach. Surg Endosc 2016;30(1):279-287. https://doi.org/10.1007/s00464-015-4206-x        [ Links ]

18. Watkins SC, Roberts DA, Boulet JR, McEvoy MD, Weinger MB. Evaluation of a simpler tool to assess nontechnical skills during simulated critical events. Simul Healthc 2017;12(2):69-75. https://doi.org/10.1097/SIH.0000000000000199        [ Links ]

19. Arnett JJ. Emerging adulthood. A theory of development from the late teens through the twenties. Am Psychol 2000;55(5):469-480. https://doi.org/10.1037/0003-066X.55.5.469        [ Links ]

20. Cheng A, Eppich W, Sawyer T, Grant V. Debriefing: The state of the art and science in healthcare simulation. In: Nestel DM, Kelly B, Jolly B, Watson M, editors. Healthcare Simulation Education: Evidence, Theory and Practice. Hoboken: Wiley Blackwell, 2018:158-164.         [ Links ]

21. Hussin AA. Education 4.0 made simple: Ideas for teaching. Int J Educ Lit Stud 2018;6(3):92-98. https://doi.org/10.7575/aiac.ijels.v.6n.3p.92        [ Links ]

22. Dorn RL. How reflection prompts impact critical thinking skills. Doctoral thesis. Tallahassee: Florida State University, 2014. http://diginole.lib.fsu.edu/islandora/object/fsu%3A185235 (accessed 25 February 2021).         [ Links ]

23. Sabus C, Macauley K. Simulation in physical therapy education and practice: Opportunities and evidence-based instruction to achieve meaningful learning outcomes. J Phys Ther Educ 2016;30(1):3-13. https://doi.org/10.1097/00001416-201630010-00002        [ Links ]

24. Unger M, Hanekom SD. Benefits of curriculum renewal: The Stellenbosch University physiotherapy experience. Afr J Health Prof Educ 2014;6(2)(Suppl 1):S222-S226. https://www.ajol.info/index.php/ajhpe/article/view/113494 (accessed 25 February 2021).         [ Links ]

25. Malamed C. Visual language for designers. Principles for creating graphics that people understand. Beverly: Rockport Publishers, 2011.         [ Links ]

 

 

Correspondence:
A van der Merwe
gonzalesa@ufs.ac.za

Accepted 10 August 2021

Creative Commons License All the contents of this journal, except where otherwise noted, are licensed under a Creative Commons Attribution License