
South African Dental Journal

On-line version ISSN 0375-1562
Print version ISSN 0011-8516

S. Afr. dent. j. vol.71 no.3 Johannesburg Apr. 2016

 

RESEARCH

 

Comparison of a custom made electronic record book database with a traditional student record book for recording clinical procedural credits and continuous clinical assessments in Restorative Dentistry

 

 

V Bookhan I; FA de Wet II; PD Brandt III

I BDS, M Dent (Prosthodontics). Department of Odontology, School of Dentistry, University of Pretoria, South Africa
II BChD, BChD (Hons), MDent, DTO, DSc (Odont). Department of Odontology, School of Dentistry, University of Pretoria, South Africa
III BChD, PGDipDent (Aesthetic Dentistry), MSc (Odontology), Adv. Dip (Aesthetic Medicine). Department of Odontology, School of Dentistry, University of Pretoria, South Africa

Correspondence

 

 


ABSTRACT

INTRODUCTION: Comparison of a custom designed electronic record book database with a traditional student record book in Dentistry has not been documented.
AIM: To develop an electronic record book database (ERBD) to record and calculate continuous clinical assessment (CCA) marks of students in Restorative Dentistry and to compare the efficiency of the ERBD system with the traditional student record book (TSRB).
METHODS: Data were obtained from 1276 dental procedures performed by fifty-five consenting final-year students. Clinical supervisors and students were calibrated to record credits and CCA marks on a designated assessment form. In practice, the recorded data were manually transferred to the TSRB on a daily basis. The ERBD was designed as an electronic Excel® spreadsheet which enabled daily automatic calculation and updating of credits and CCA marks for each student. After a month, the times taken to transfer these data from the TSRB and the ERBD to electronic class lists were recorded in minutes and analysed using Student's t-test.
RESULTS: Significant differences (p < 0.0001) between the times were recorded: the administrative procedure was 14 times faster when the ERBD was used.
CONCLUSION: The ERBD was significantly more efficient than the TSRB.


 

 

INTRODUCTION

The benefits and educational impact of a traditional paper-based student record book (TSRB) have been documented in the literature.1-8 In health sciences, it has been used as an educational tool for teaching and learning, and for recording undergraduate clinical procedures and assessments.1,2 Clinical experiences may be documented and the clinical performance of students monitored.3 Whilst there is no gold standard against which a TSRB can be compared, the following qualities have been suggested: it should be feasible, efficient, accurate, inexpensive, accepted by supervisors and students, record valid and reliable data, allow frequent educational interaction between supervisors and students and provide students with relevant feedback regarding their clinical progress and assessments.1,2,5,7-10

This study, applicable to dental education, compares a custom-designed electronic record book database (ERBD) with a paper-based TSRB, to determine which is more efficient for recording, calculating and updating procedural credits and continuous clinical assessment (CCA) marks. There appears to be no similar study reported in the literature.

Restorative Dentistry at the University of Pretoria has made good use of the TSRB. Clinical progress is monitored periodically by calculating and updating the total procedural credits and average CCA marks, usually twice during the academic year and preferably while students are on vacation. Several factors prevent more frequent collection of the TSRB from each student, and the organizational process has proved extremely inefficient and often fails. The academic staff have clinical, administrative, teaching and research commitments during the year and are therefore obliged to manage the considerable task of processing the TSRB data during vacations. Students are reluctant to submit their books for auditing (only 30% comply), preferring the audit to take place while they are on vacation so as to avoid immediate confrontation with their teachers. Moreover, clinical interventions are performed constantly throughout the academic year, and frequent collection of the TSRB would interfere with the favourable practice of recording student performance immediately.

Once the books are assembled, processing the data entails counting the total number of credits recorded on each page of the TSRB and calculating the average CCA marks. The exercise (employed by many academic institutions) is extremely inefficient due to the large number of clinical procedures, procedural credits and CCA marks recorded by each student. An innovative solution to reduce the workload associated with this arduous time-consuming administrative task would be to create and develop a faster and more efficient alternative. Hence the motivation and need for this study.

Therefore the aims of this study were to:

1. create and develop a custom designed ERBD to record and update the clinical procedural credits and to automatically calculate the CCA marks of final year undergraduate students in Restorative Dentistry.

2. compare the efficiency of the administrative process using the ERBD with that of the paper-based, traditional student record book (TSRB) method.

Efficiency

The definition of efficiency for the purpose of this study is the combined time (in minutes) to calculate:

1. the total number of procedural credits for each student,

2. the average CCA mark of each student and

3. the transfer of the total number of procedural credits and average CCA mark of each student to an electronic Class List spreadsheet in Microsoft® Excel.

The sequence will be referred to as the administrative process.
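
For illustration only, the administrative process defined above might be expressed as the following Python sketch; the record structure, field names and weighting values are hypothetical assumptions and do not reflect the actual ERBD implementation:

# Hypothetical sketch of the administrative process for one student.
# Field names, record layout and weightings are illustrative assumptions.
def administrative_process(student_number, daily_records, weightings, class_list):
    """Calculate the total procedural credits and average CCA mark for a
    student, then transfer both values to the electronic class list."""
    total_credits = 0
    cca_marks = []
    for record in daily_records:                    # one entry per clinical procedure
        weight = weightings.get(record["code"], 1)  # procedure-specific weighting
        total_credits += weight * record["credits"]
        cca_marks.append(record["cca_mark"])        # CCA mark as a percentage

    average_cca = sum(cca_marks) / len(cca_marks) if cca_marks else 0

    # Step 3: transfer to the e-class list (represented here as a dictionary)
    class_list[student_number] = {"total_credits": total_credits,
                                  "average_cca": round(average_cca, 1)}
    return total_credits, average_cca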

 

METHODS

This descriptive, cross-sectional study utilized data obtained from 1276 undergraduate dental procedures (recorded on the assessment forms and in the traditional student record books) performed during the month of February of the academic year by fifty-five (n = 55) consenting BChD V students ranging in age from 18 to 24 years. To ensure reliability, clinical supervisors and students were calibrated in the recording of clinical procedures, procedural credits and CCA marks on an assessment form and in the TSRB. This took place at a calibration workshop before the commencement of the academic year. The times taken to create and develop the assessment form, the TSRB and the ERBD were outside the aims and were not considered.

Creation and development of the assessment form

A Restorative Dentistry assessment form for criteria-referenced student self-assessment was created and developed in Microsoft® Word under the guidance of the module coordinator (responsible for coordination of the undergraduate module program and curriculum as well as all formative and summative assessment procedures). Validity and reliability were tested and proven.11,12

Creation and development of the TSRB

The TSRB (Figure 1) was created and developed in Microsoft® Word by the collaborative efforts of clinical supervisors and students over several days. Each page listed a clinical procedure and the associated procedural code as well as a column for recording the procedural credits and the associated CCA mark.

The information (procedures, procedural codes and CCA marks) recorded on the assessment form was transferred to the TSRB (hand-written by a clinical supervisor and witnessed by the student during a clinical session) on a daily basis. After a month, each student was contacted telephonically or via e-mail with the request to submit their TSRB to the module coordinator on a specific day between 07:00 and 17:00.

Books were delivered by individuals, or by small groups of students or were couriered, and full collection took approximately eight hours. The books were then arranged in alphabetical order according to the class list and preparations were made for data processing.

The module coordinator started a digital stopwatch once a TSRB was opened and stopped it after the administrative procedure, as described above, was completed. This was repeated for each TSRB. The results for each student were calculated and transferred to an e-class list (Figure 2). The times in minutes for each procedure were recorded on an e-spreadsheet using Microsoft® Excel software and saved on a personal computer.

Creation and development of the ERBD

The ERBD was created and developed by the module coordinator on a personal computer using Microsoft Excel® software, basing the e-version on the format of the TSRB.

The process involved seven steps:

i) opening a Microsoft Excel® document,

ii) saving the student class list on the electronic spreadsheet,

iii) addition of the codes of the clinical procedures to the relevant cells,

iv) addition of the mathematical formulae to the cells responsible for the appropriate calculations,

v) addition of the academic calendar days,

vi) copying the master electronic spreadsheet to provide a spreadsheet for each student and

vii) the allocation of the student numbers to each electronic spreadsheet.

The process required less than an hour. The 55 spreadsheets were designed using a calendar format.
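
The seven steps could be approximated programmatically as in the sketch below, assuming the openpyxl library is used to generate the Excel® workbook; the sheet layout, placeholder procedure codes and the SUM/AVERAGE formulae are illustrative assumptions rather than the published ERBD design:

# Illustrative sketch of steps (i)-(vii); layout, codes and formulae are assumptions.
from datetime import date, timedelta
from openpyxl import Workbook

def build_erbd(student_numbers, academic_days, procedure_codes, filename="ERBD.xlsx"):
    wb = Workbook()                                   # (i) open a new Excel workbook
    class_list = wb.active
    class_list.title = "Class list"                   # (ii) student class list on its own sheet
    for row, student in enumerate(student_numbers, start=1):
        class_list.cell(row=row, column=1, value=str(student))
    for student in student_numbers:                   # (vi), (vii) one sheet per student number
        ws = wb.create_sheet(title=str(student))
        ws["A1"] = "Date"
        for col, code in enumerate(procedure_codes, start=2):
            ws.cell(row=1, column=col, value=code)    # (iii) procedural codes in the header row
        for r, day in enumerate(academic_days, start=2):
            ws.cell(row=r, column=1, value=day)       # (v) academic calendar days
        last_row = len(academic_days) + 1
        # (iv) formulae for the automatic credit and CCA calculations
        ws.cell(row=last_row + 1, column=2, value=f"=SUM(B2:B{last_row})")      # total credits
        ws.cell(row=last_row + 1, column=3, value=f"=AVERAGE(C2:C{last_row})")  # average CCA
    wb.save(filename)

# Example: February 2016 working days (weekends excluded) for three anonymised student numbers
february = [date(2016, 2, 1) + timedelta(d) for d in range(29)]
working_days = [d for d in february if d.weekday() < 5]
build_erbd(["u0001", "u0002", "u0003"], working_days, ["PROC-01", "PROC-02"], "ERBD_demo.xlsx")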

Average CCA mark

Rows on the spreadsheet recorded the day and date on which clinical procedures had been performed. Columns on the spreadsheet provided cells for the date-related recording of clinical procedures performed, the assessment mark allocated (0 to 5) and for calculations to derive the CCA percentage marks and progressive CCA percentage averages (Figure 3).
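
Assuming the allocated mark $m_d$ on the 0 to 5 scale was converted linearly to a percentage, the date-related CCA percentage and the progressive average after $k$ assessed procedures would take the form:

\[ \mathrm{CCA\%}_d = \frac{m_d}{5}\times 100, \qquad \overline{\mathrm{CCA\%}}_k = \frac{1}{k}\sum_{d=1}^{k}\mathrm{CCA\%}_d \]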

Total number of procedural credits earned

Total credits for each day were recorded in the cells of a dedicated Total column and enabled the automatic calculation of a monthly total, taking into account specific weightings for each procedure.
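
Expressed as a formula (the procedure-specific weightings $w_p$ are referred to in the text but their values are not given, so they appear here only symbolically), the monthly total for a student would be:

\[ C_{\text{month}} = \sum_{d \in \text{days}} \; \sum_{p \in \text{procedures}} w_p \, n_{p,d} \]

where $n_{p,d}$ is the number of credits recorded for procedure $p$ on day $d$.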

Compilation of the ERBD

Spreadsheets were prepared for each month with dates and days appropriately recorded (excluding weekends and holidays). The 55 spreadsheets formed the ERBD for the 55 students; the workbook was saved in a folder on the hard drive of the computer and backed up on a removable flash drive. To maintain anonymity, e-spreadsheets were allocated student numbers and the ERBD workbook was password protected as a read-only document.

The procedural credits and CCA mark of each student were entered onto his/her dedicated e-spreadsheet (Figure 3) by the module coordinator on a daily basis. The total number of procedural credits and the average CCA were automatically calculated and updated daily. During the month of this study, the ERBD was e-mailed to supervisors and students every day for their reactions and comments.

Feedback from supervisors to students was sent by e-mail via the university IT network service provider, and feedback from students to supervisors by e-mail via the IT network service provider preferred by the student. After a month, the total number of procedural credits and the average CCA mark for each student were transferred to an e-class list (similar to that in Figure 2) by the module coordinator, who started the digital stopwatch when the ERBD file was opened and stopped it once the administrative procedure was completed for each e-spreadsheet, recording the time taken in minutes. The times required for the administrative procedures using the TSRB and the ERBD were recorded on a Microsoft® Excel e-spreadsheet and saved on a personal computer. The results were analysed using Student's t-test.
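
The timing comparison could be reproduced along the lines of the sketch below, assuming the per-student times are available as two lists in minutes and that SciPy's implementation of Student's t-test is an acceptable stand-in for the analysis actually performed:

# Minimal sketch of the timing analysis; the use of scipy.stats is an
# assumption and does not represent the authors' actual statistical software.
from scipy import stats

def compare_administrative_times(tsrb_times, erbd_times):
    """Compare the per-student administrative times (minutes) recorded under
    the TSRB and the ERBD using Student's t-test."""
    # A paired test (stats.ttest_rel) could also be argued, since the same
    # 55 students were timed under both systems.
    result = stats.ttest_ind(tsrb_times, erbd_times)
    return result.statistic, result.pvalue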

 

RESULTS

The study assessed the relative efficiencies of two systems of recording student performance by comparing the relative times taken to determine performance levels. The time taken to calculate and update the total class procedural credits and average CCA marks using the ERBD was 27.69 minutes (average = 30.21 seconds per student) compared with the 431.20 minutes (average = 7.48 minutes per student) taken using the TSRB. The ERBD process was 14 times faster than the TSRB.

These calculations were done automatically by the programmed ERBD. The information recorded on the assessment forms and on the ERBD indicated that the total number of procedural credits and the CCA marks recorded in the two systems did not differ significantly, confirming that the information recorded on the ERBD was accurate. The average number of procedural credits for the class was 23.2 and the average CCA mark was 57%.

The results for the comparison of efficiencies are illustrated in Figure 4 and Figure 5 as line charts on a logarithmic axis. The minutes were converted to seconds to emphasize the significant differences (p < 0.0001) between the times required to complete the administrative procedure of the calculation of the procedural credits and the CCA marks for each student. The line graphs clearly illustrate that the use of the ERBD for this purpose was notably more efficient than using the TSRB.

 

DISCUSSION

Properly constructed assessment tools that drive learning, including student record books, are effective educational tools,1,2,8,9 bringing structure and focus to the process of learning.1 Students of the new millennium (Millennials) prefer the use of innovative electronic technology for their assessment and education.13-20 Electronic innovations in dentistry that encourage and motivate the learning process are essential to the progress of dental education,6 especially those that encourage feedback during supervisor-student interaction.21-27 The paper-based TSRB makes it difficult to achieve effective feedback to students, restricting frequent or sufficient educational interaction between supervisors and students, without which remediation cannot be successfully accomplished.

The accessibility of the TSRB did not differ significantly from that of the ERBD. However, the submission protocol followed by the students was extremely inefficient and inconvenient for the module coordinator: it took approximately eight hours to collect every TSRB, whereas the ERBD was accessible on the computer in less than five minutes and involved a more convenient submission protocol. The assessment forms were handed by the supervisors after each clinical session directly to the module coordinator, who entered the procedural credits and CCA mark onto the electronic spreadsheet of the ERBD, which updated the total credits and average CCA automatically. The relative complexity of the TSRB allowed the clinical progress and clinical performance of a student to be monitored only twice a year, delaying the identification of students who were underperforming. The time available for intervention and remediation strategies was therefore insufficient to allow supervisors to help students prepare adequately for future assessments.

Verbal interviews with all supervisors and students indicated that they preferred the ERBD. The system was seen to be feasible and accessible, and it allowed the administrative process of calculating and updating student procedural credits and CCA marks to proceed efficiently. It could be e-mailed as an attachment, and students were able to interact electronically and frequently with supervisors about the procedural credits they had earned and their CCA marks.

The e-communications allowed supervisors to provide students with specific reasons for the assessments which had been given. Blended learning opportunities were available and encouraged communication without students feeling intimidated. The ERBD also allowed the effective monitoring of student attendance at clinical sessions. Students who were absent from a clinical session without a valid explanation or prior notice were allocated a CCA mark of zero. Unacceptable attendance patterns could be identified and intervention strategies (student tutoring and discussion forums) implemented to prevent future problems. The zero CCA mark motivated students to present themselves at all clinical sessions.

Students performing below the class average were also easily identified, and intervention strategies and remediation by supervisors could be implemented early in the academic year. Intervention strategies included clinical guidance, practical exercises and group discussions to promote deeper learning. The ERBD allowed students to monitor their own progress and compare it with that of their colleagues. Perusing the anonymised e-spreadsheets of their colleagues gave students the opportunity for self-reflection and self-realization. These experiences are essential for the promotion and encouragement of deeper learning amongst students and can influence changes in their behaviour.17 Changes that encourage students to drive the learning process may motivate them to achieve their required exit-level outcomes and to graduate as competent clinicians.

Only one supervisor was responsible for entering all the procedural credits and CCA marks onto the ERBD, and this may have contributed to the proven accuracy of the data, an essential requirement.28-30 These records are important in Dentistry because they contribute to the assessment of each student and determine whether they can and/or will achieve clinical competency.

Furthermore, the design of the ERBD is simple, extremely cost-effective and easy to replicate. The Excel® workbook is user-friendly, familiar to supervisors and students, and compatible with any computer using Microsoft Windows as an operating system. It is also uncomplicated and will not compromise the operating efficiency of computers on a local area network (LAN). Further studies are necessary to determine the benefits and educational impact of the ERBD in other disciplines of Dentistry.

 

CONCLUSION

In summary the ERBD was more beneficial than the TSRB because:

1. it reduced the administrative workload of calculating procedural credits and CCA marks.

2. it is an e-version of the TSRB, is accessible via the internet (e-mail) and allows the clinical progress, CCA marks and attendance of students to be more frequently monitored by supervisors on a personal computer.

3. it allows students to monitor their formative (procedural credits earned) and summative clinical progress (CCA marks) and compare their clinical progress with that of their colleagues. This allows self-reflection.

4. it allows frequent and regular feedback (via e-mail) between supervisor and student in relation to their clinical progress (procedural credits) and clinical performance (CCA marks).

5. it helps motivate students by introducing the self-reflection learning experience that results in the self-realization of their competency. This encourages students to strive to improve their clinical progress and performance in order to achieve success and graduate as competent practitioners.

6. it complies with university assessment guidelines, identifies students who are performing below average and allows supervisors enough time to provide them with effective remedial support and learning opportunities so that they can progress to their next academic experience.

7. The ERBD possesses the qualities of an "ideal record book" as suggested in the health education literature.1,2,3

Applications in private practice

The ERBD philosophy could be used in the private sector as an inexpensive method to monitor, calculate and update the total number of procedures performed daily, monthly and annually, as well as to calculate the income and expenditure (including monies due to SARS) of the practice or practitioner. The design of the mathematical formulae and functions may be requested from the corresponding author.

Conflict of interest: None declared

 

Acknowledgements

Prof. HS Schoeman for statistical analysis of results.

 

ACRONYMS

CCA: Continuous Clinical Assessment

e: electronic

ERBD: Electronic Record Book Database

TSRB: Traditional Student Record Book

 

References

1. Torabi K, Bazrafkan L, Sepehri S, Hashemi M. The effect of logbook as a study guide in dentistry training. J Adv Med Educ & Profession 2013; 1(3): 81-4.

2. Khorashadizadeh F, Alavina SM. Students' perception about logbooks: advantages, limitation and recommendation - a qualitative study. J Pak Med Assoc 2012; 62(11): 1184-6.

3. Alireza Y, Shayan S, Mosavi A. Developing a clinical performance logbook for nursing students receiving cardiac care field training. J Educ Health Promot 2012; 1(7): 1-7.

4. Yaghobian M, Fakhri M, Salmeh F, Yaghobi T, Zakizad M, Shahmohammadi S. Assessment of the effect of log book on nursing and midwifery students' clinical skills. Middle-East J Sci Res 2011; 7(6): 896-902.

5. Steiger S, Praschinger A, Kletter K, Kainberger F. Learning objectives in logbooks as indicators of problems in teaching hospitals. J Med and Bio Sc 2009; 3(1): 1-6.

6. Saber M, Saberi Fiouzi M, Azizi F. The logbook effect on clinical learning of interns in internal ward rotation in Shiraz University of Medical Sciences. J Med Educ 2008; 12(3): 62-6.

7. Denton GD, DeMott C, Pangaro LN, Hemmer PA. Narrative review: use of student-generated logbooks in undergraduate medical education. Teach Learn Med 2005; 18(2): 153-64.

8. Dahllöf G, Tsilingaridis G, Hindbeck H. A logbook for continuous self-assessment during one year in paediatric dentistry. Eur J Paed Dent 2004; 3: 163-9.

9. Patil NG, Lee P. Interactive logbooks for medical students: are they useful? Med Educ 2002; 36: 672-7.

10. Chadwick RG, Mason AG. Development, application and effectiveness of a novel logbook checklist assessment scheme in conservative dentistry. Eur J Dent Educ 1997; 1: 176-80.

11. Bookhan V, Becker LH, Oosthuizen MP. A comparison of continuous clinical assessment and summative clinical assessment in restorative dentistry. SADJ 2007; 62(6): 258-62.

12. Bookhan V, Becker LH, Oosthuizen MP. Criteria referenced student self-assessment in restorative dentistry. SADJ 2005; 60(4): 161-6.

13. McCann AL, Schneiderman ED, Hinton RJ. Teaching and learning preferences of dental and dental hygiene students. J Dent Educ 2010; 74: 65-78.

14. Vyas R, Tharion E, Sathishkumar S. Improving the effectiveness of physiology record books as a learning tool for first-year medical students in India. Adv Physiol Educ 2009; 33: 329-34.

15. Mattheos N, Stefanovic N, Apse P, Attstrom R, Buchanan J, et al. Potential of information technology in dental education. Eur J Dent Educ 2008; 12(1): 85-91.

16. John JH, Thomas D, Richards D. Computers in general practice. Br Dent J 2003; 195(10): 585-90.

17. Atkinson JC, Zeller GG, Shah C. Electronic patient records for dental school clinics: more than a paperless system. J Dent Educ 2002; 66(5): 634-42.

18. Strother EA, Brunet DP, Bates ML, Gallo JR. Dental students' attitudes towards digital textbooks. J Dent Educ 2009; 73: 1361-5.

19. Pinto A, Selvaggi S, Sicignano G, Vollono E, Iervolino, Amato F, Molinari A, Grassi R. E-learning tools for education: regulatory aspects, current applications in radiology and future prospects. Radiol Med 2008; 113: 144-57.

20. Oakley M, Spallek H. Social media in dental education: a call for research and action. J Dent Educ 2012; 76(3): 279-87.

21. Taylor C, Grey N, Satterthwaite D. Assessing the clinical skills of dental students: a review of the literature. J Educ Learn 2013; 2(1): 20-31.

22. Subramanian J, Anderson VR, Morgaine KC, Thomson WM. The importance of the student voice in dental education. Eur J Dent Educ 2013; 17(1): 136-41.

23. Vernazza C, Durham J, Ellis J, Teasdale D, Cotterill S, Scott L, Thomason M, Drummond P, Moss J. Introduction of an e-portfolio in clinical dentistry: staff and student views. Eur J Dent Educ 2011; 15: 36-41.

24. Kramer GA, Albino JE, Andrieu SC, Hendricson WD, Henson L, Horn BD, Neumann LM, Young SK. Dental student assessment toolbox. J Dent Educ 2009; 73(1): 12-35.

25. Divaris K, Barlow PJ, Chendea SA, et al. The academic environment: the students' perspective. Eur J Dent Educ 2007; 12(1): 120-30.

26. Tennant M, Kruger E. Early intervention surveillance strategies (EISS) in dental student clinical performance: a mathematical approach. J Dent Educ 2005; 69(12): 1353-8.

27. Stewart CJ, Moloney EJ, Kinirons MJ. Clinical experiences of undergraduate dental students in pediatric dentistry at Cork University Dental School and Hospital, Ireland. J Dent Educ 2010; 74: 325-30.

28. Dosumu EB, Dosumu OO, Lawal FB. Quality of record keeping by undergraduate dental students in Ibadan, Nigeria. Ann Ibd Postgrad Med 2012; 10(1): 13-7.

29. Raghoebar-Krieger HM, Sleijfer D, Bender W, Stewart RE, Popping R. The reliability of logbook data of medical students: an estimation of inter-observer agreement, sensitivity and specificity. Med Educ 2001; 35(7): 624-31.

30. Thomson PJ, Boyle CA. Auditing clinical teaching in oral surgery: the use of a student log book. Dent Update 1996; 23(7): 283-6.

 

 

Correspondence:
Vinesh Bookhan
Department of Odontology, School Of Dentistry
University of Pretoria, South Africa
Tel: +27 12 319 2277, Fax: +27 12 548 2864.
E-mail: Vinesh.Bookhan@up.ac.za