
South African Dental Journal

Online version ISSN 0375-1562
Print version ISSN 0011-8516

S. Afr. dent. j. vol.78 no.10 Johannesburg Nov. 2023

http://dx.doi.org/10.17159/sadj.v78i10.16674 

RESEARCH

 

Dental undergraduate students' perspectives of online assessments conducted during the Covid-19 pandemic - a report from one South African university

 

 

I MoodleyI; S SingII; R MoodleyIII

IBDT, Masters in Dental Public Health (UWC), PhD (UKZN), Senior Lecturer, Acting Academic Leader of Discipline of Dentistry, University of KwaZulu-Natal, Durban, South Africa ORCID: https://orcid.org/0000-0001-5834-9887
IIB.OH (UDW), MSc (UWC), PG Dip (Health Res Ethics) (Stell), PhD (UWC), PhD (Clin & Res Ethics) (Stell), Discipline of Dentistry, University of KwaZulu-Natal, Durban, South Africa ORCID: https://orcid.org/0000-0003-4842-602X
IIIBDT, Masters in Dental Public Health (UWC), PhD (UKZN), Senior Lecturer, Discipline of Dentistry, University of KwaZulu-Natal, Durban, South Africa ORCID: https://orcid.org/0000-0003-2703-9370


 

 


ABSTRACT

INTRODUCTION: Online assessments are commonly used in health sciences curricula worldwide. However, it was unclear how undergraduate dental therapy and oral hygiene students at a South African university responded to the transition from traditional classroom-based to online assessments as a result of the Covid-19 pandemic
AIMS AND OBJECTIVES: This paper reports on students' knowledge and practices of, attitudes towards and preparedness for online assessments
METHOD: This was a descriptive study, using a mixed methods approach to obtain quantitative and qualitative data through an online questionnaire
RESULTS: This study indicated that most students (n=93, 87%) were familiar with online assessments; however, only 63.5% (n=68) were confident about taking these tests. Most students agreed that online assessment helped them grasp all aspects of theory, while less than half of third-year students agreed that online assessments helped them to integrate theory into clinical practice. The main challenges reported were connectivity problems during online assessments and insufficient time to complete online tests
CONCLUSION: This study demonstrated that undergraduate dental students were familiar with online assessments and were confident about taking them. They believed this helped them grasp all aspects of theory despite specific challenges associated with the use of online assessments. This study suggests that online assessments could be a valuable method in measuring student competency of fundamental theoretical aspects of dentistry

Keywords: Online assessments, undergraduate dental students, Covid-19 pandemic


 

 

INTRODUCTION

The Covid-19 pandemic heralded a new era in which people across the world had to adjust quickly to a new way of life while maintaining social distance and avoiding large gatherings.1 In prohibiting large gatherings, higher education institutions too were faced with the unprecedented situation of closing their doors to face-to-face student learning, practical/clinical teaching and written examinations.2 Most universities transitioned to online platforms to continue their teaching and learning programmes during the pandemic, but academics were concerned about student assessments.3 Assessment is a key component of education which determines a student's progression through a tertiary programme by means of rigorous formative and summative assessments. While formative assessment strives to improve teaching and learning, summative assessment aims to quantify the overall performance of a student.4 However, traditional methods of assessment, consisting mainly of classroom-based written tests and exams, were not viable options in light of restrictions on human movement. Furthermore, assessment in health professionals' training is critical in determining how well a student has grasped the fundamental theory and can connect this theory to practice through professional decision making in a clinical setting.5,6

Although South African universities adopted online platforms for teaching and learning and sourced alternatives to on-site classroom-based tests and examinations, clinical assessments remained a huge challenge. In undergraduate dental training, clinical competency is achieved through the practical application of specified repetitive dental clinical procedures to develop technical skills and mastery.7 Due to Covid-19, students were not able to perform these clinical tasks. Consequently, such skills could not be measured through clinical assessments and more reliance was placed on online assessments.

In facilitating online teaching and learning, the university ensured that no student was left behind by providing students with monthly data bundles and introducing a new online learning portal with diverse resources to assist students in the transition.8 However, at the implementation level, online assessments were adopted with uncertainty and confusion by both academics and students. Academics set online tasks according to their own expertise and convenience, assuming students would be able to engage with them.9 In embracing online assessments, academics further grappled with using different assessment methods without sacrificing quality, appropriateness and fairness.10 This also raised questions about students' preparedness for, acceptance of and ability to navigate these assessment tasks. Furthermore, a digital divide exists among university students, where some students are better equipped and more experienced than students who are disadvantaged in terms of computer and technology skills.11 Moreover, some students whose second language is English struggle to understand the scientific terms and nuances of the language within the limited time frame of an online assessment.11

While a blended learning approach was widely supported for health sciences training in South Africa via online learning management platforms such as Moodle, it is unclear how undergraduate dental therapy and oral hygiene students responded to the transition from traditional methods of assessment to online assessments during the Covid-19 pandemic. There is limited published evidence on the contextual influences affecting online assessments in dental undergraduate training during Covid-19 and on students' preparedness for this type of assessment. Such information could be critical in guiding and shaping undergraduate dental curriculum development, specifically when responding to sudden disruptions in the teaching and learning environment. Therefore, this study aims to contribute to curriculum planning and review by determining undergraduate dental students' knowledge and practices of, attitudes towards and preparedness for online assessments during Covid-19. In doing so, it could also be ascertained whether online assessments could have a place in undergraduate dental training post-Covid.

 

METHODS

Research setting and context

The study was conducted among students in a Dentistry discipline in South Africa which offers two three-year undergraduate degree programmes, namely Bachelor of Dental Therapy and Bachelor of Oral Hygiene. The survey was administered at the end of the second semester in 2020, upon completion of all assessments. Ethics approval (Ref. No. HSSREC/00001601/2020) and gatekeeper permission were obtained prior to commencing the study.

Research design

A descriptive, questionnaire-based study using a mixed methods approach was used. The study employed a dominant status design (QUAN/quali) that investigated the knowledge, attitudes, perceptions and practices of undergraduate dental students regarding online assessment conducted during the Covid-19 pandemic.12 The study was designed to obtain quantitative and descriptive qualitative data through an online questionnaire with closed-ended and open-ended questions.

Participants

The study was conducted among full-time students (n=156) registered in the Discipline of Dentistry for the 2020 academic year, including year 1 (n=55; B. Dental Therapy n=38 and B. Oral Hygiene n=17), year 2 (n=54) and year 3 (n=47) students. The Oral Hygiene programme only commenced in 2020, hence only first-year students from this programme participated. Participants were recruited via the social media platform WhatsApp using a snowball sampling technique.13 A message inviting participation in the study was sent to the first student. The message included a link to the informed consent documents and survey questionnaire. Once the student clicked on the link, he/she had to give consent by clicking on the relevant icon. The participant was then given the option to complete the survey and, on completion, the option to forward the survey link to another student, with each participant remaining anonymous. The link remained open for approximately six weeks to allow students to participate.12

Data collection and analysis

Data was collected using an online, self-administered questionnaire to obtain a better understanding of students' perspectives on, and preparedness for, online assessments during the Covid-19 pandemic. The questionnaire comprised 25 questions: questions 1-5 acquired student demographics, questions 6-10 ascertained whether students had the necessary resources for online assessments, questions 11-14 covered knowledge, questions 15-19 covered attitudes and questions 20-23 provided insight into practices regarding online assessments. These were closed questions requiring Likert-scale responses ranging from 1 (strongly agree), 2 (agree), 3 (not sure), 4 (disagree) to 5 (strongly disagree). Questions 24 and 25 were designed to elicit qualitative data through open-ended questions which allowed respondents to report on any other challenges affecting online assessments and to express themselves freely on how online assessments could be improved. The returned questionnaires were coded P1 to P111 to maintain participant anonymity.

The quantitative data (closed-ended questions) obtained from the questionnaire were captured onto an Excel spreadsheet and analysed using SPSS version 27.0 (IBM Corp, USA). Data was analysed using univariate descriptive statistics such as frequency and mean distribution. An inferential statistical technique, the Pearson's chi-squared test, was used to investigate associations between the independent variable (year of study) and the dependent variables (preparedness, knowledge, attitudes and practices). A p-value of 0.05 or less was considered statistically significant.
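The mechanics of the Pearson chi-squared test described above can be illustrated with a short, self-contained sketch. This is not the authors' SPSS analysis: the contingency table below contains hypothetical counts for year of study against a dichotomised Likert item, and 5.991 is the standard tabulated critical value for two degrees of freedom at the 5% significance level.

```python
# Hypothetical 3x2 contingency table (NOT the study's data):
# rows = Year 1-3; columns = "agree/strongly agree" vs "other".
observed = [
    [25, 8],
    [22, 11],
    [28, 13],
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Pearson chi-squared statistic: sum of (O - E)^2 / E over all cells,
# where E is the expected count under independence.
chi2 = 0.0
for i, row in enumerate(observed):
    for j, obs in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand_total
        chi2 += (obs - expected) ** 2 / expected

dof = (len(observed) - 1) * (len(observed[0]) - 1)  # (rows-1)*(cols-1) = 2
CRITICAL_VALUE = 5.991  # tabulated chi-squared value, dof=2, alpha=0.05

print(f"chi2 = {chi2:.3f}, dof = {dof}")
print("significant at the 5% level" if chi2 > CRITICAL_VALUE
      else "not significant at the 5% level")
```

A statistical package such as the SPSS version used by the authors would additionally report an exact p-value; this standard-library-only sketch replaces that with a comparison against the tabulated critical value.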

The qualitative data obtained from the open-ended questions (24 and 25) were analysed using thematic analysis.14 The responses from each student were transcribed verbatim.

Two members of the research team independently coded the data, organised the data set into code groups and examined them further for familiar patterns and emergent themes. Then, both members compared their findings and finalised the main themes and sub-themes together. Confirmability was maintained through direct quotes of students' responses.14

 

RESULTS

Among the 156 students, 111 accessed the link to the survey, yielding a response rate of 71%. Most of the participants were female (n=72, 68%) and registered for the Dental Therapy programme (97%). The response rates from each year were: Year 1 (n=33, 30.8%), Year 2 (n=33, 30.8%) and Year 3 (n=41, 38.3%). More than half of the study sample (n=56, 52%) indicated that they were residing at home, while a quarter (n=27, 25%) were living in campus residences and 21 respondents (20%) lived in private residences, away from home.

Results of quantitative data analysis

Requirements for online assessments

The results showed most students (n=106, 99%) knew how to log onto the Moodle learning platform for assessments (Table 1).

Although the majority of the participants across all three years of study strongly agreed (n=33, 30.8%) or agreed (n=53, 49.5%) that they had data for continued access to online assessments, more than 60% (n=66) agreed or strongly agreed that they had connectivity problems during an assessment. Notably, only 11 (33.3%) of first-year, 9 (27.3%) of second-year and 7 (17.1%) of third-year students strongly agreed that they had a conducive workspace at their place of residence during the pandemic to undertake online assessments.

Knowledge of online assessments

In this study, although almost 87% (n=93) of the study participants agreed or strongly agreed that they were familiar with the online assessment method, some were unsure about physically being on campus for an online assessment.

Attitudes towards online assessments

Only 66.5% of all study participants agreed or strongly agreed that they were confident when undertaking online assessments. The respondents were divided on whether they preferred online assessments to class-based assessments: 24.3% strongly agreed, 24.3% agreed and almost 20% disagreed or strongly disagreed, while 23.4% were unsure. Almost 62% (n=66) of participants agreed or strongly agreed that online assessment helped them grasp all aspects of theory. Linking theory to clinical practice is integral in the final year for critical decision making and holistic patient management, yet only 9 (22%) of third-year participants strongly agreed and 13 (22.4%) agreed that online assessments helped them do this. Interestingly, only 22 first-year participants (66%), 20 second-year participants (60.6%) and 24 final-year participants (58.5%) strongly agreed that they were honest when taking online assessments.

Practices of online assessments

Only 67.3% of all participants agreed or strongly agreed that they were able to navigate easily through an online assessment. Twenty-one participants from first year (63.6%), 14 from second year (42.4%) and 29 from third year (70.7%) agreed that they understood the structure or layout of online assessments. Among the second-year participants, 14 (42.4%) agreed that they were able to grasp the context of the questions posed in English in online tests, 13 (39.4%) strongly agreed, while 5 (15.2%) were unsure. Only 45.8% of all participants agreed that they were able to complete the online assessment in the given time.

Results of qualitative data analysis

Three main themes emerged from the qualitative data obtained in response to the question on the challenges affecting online assessments (Table 2). The main themes were student issues, logistic issues and assessment issues (lecturer-based), each with its own sub-themes. A large percentage of the respondents (33%) reported not having enough time to complete the online tests, making this an important sub-theme. Some (20%) felt strongly that face-to-face learning was the best mode of content delivery and that the home was not a place to study or be assessed in.

 

DISCUSSION

While adapting to this paradigm shift to online assessments, it was important to gauge student readiness to respond to these changes in the learning environment. Most participants in this study (n= 101, 94.3%) agreed they had access to electronic devices in the form of laptops or cellphones for online assessments. However, from a resource perspective, the challenges that participants in this study experienced were the lack of continuously available data for assessments, irregular connectivity and poor network. While the institution made attempts to ensure the availability of data for all students,8 the timeous distribution of the data packages and the amount of data provided was challenging. The issues of inconsistent data availability and intermittent internet connectivity could have a serious impact on online assessments because students could have challenges completing their assessment tasks within a given time. More importantly, this can create a further divide in student learning where the learning environment and access to learning resources are inconsistent. These findings were consistent with similar studies conducted by other researchers15,16 who identified internet access and connectivity as the main challenges students encountered with online learning. Other authors have noted that not having internet access is a significant factor limiting the feasibility of online learning and assessments in a South African context.16,17

In this study, about 40% of participants did not have a conducive learning environment, which could impact negatively on a student's overall academic performance. This finding was similar to those of a literature review conducted by Pokhrel and Chhetri on the impact of the pandemic on teaching and learning, in which issues with physical workspaces conducive to learning were reported.9

Although participants in this study were knowledgeable about the online assessment method, some were not aware that these assessments could be done remotely and did not require physical presence in the campus setting. These findings imply a gap in student awareness of the value, role, structure and processes of online assessments, and are consistent with those of a study by Mpungose (2020), who found that adequate training on the use of online resources was required to keep all students well informed and to avoid confusion.18

This study's findings showed that just over 60% of the study sample expressed confidence in engaging with online assessments, while about 50% agreed that they preferred online assessments to traditional written tests. This finding contrasts with those of Laine et al. (2016),19 who found that students were satisfied with online assessments, and Elmehdi and Ibrahem (2019),20 who showed that students had positive attitudes towards online examinations.

Moreover, only about 60% felt that online assessments helped them understand and grasp all theoretical knowledge, and less than 25% of final-year students believed that these assessments contributed to the integration of theory into clinical practice. This finding reveals a theory-to-practice gap and suggests the need for further interrogation of other assessment methods. To overcome this, Rawlusyk (2018) proposed including more case-based scenarios in assessments to encourage greater depth of learning and the application of understanding in real-world tasks or settings.21

Overall, about 60% of the study participants strongly agreed that they were honest when taking online assessments, which raises an important question of the reliability of this method of assessment. In a recent study by Cerimagic and Hasan (2019), it was observed that 81% of learners cheated or attempted to cheat during online assessments.22

Cheating during online assessment is a major issue and calls for stricter control measures to be implemented. Some researchers, e.g. Bawarith et al. (2017), implemented an e-exam management system which aimed to detect and prevent cheating in online assessments.23 However, Backman (2019) argued that academics could also take steps to reduce the occurrence of cheating.24 These could include adding more demanding questions (that focus on analysis, synthesis and evaluation), selecting random questions and allocating less time for completing the assessment.

Although the majority of the participants could grasp the layout of the questions and understand their context, only 60% agreed that they could easily navigate through the online assessments. Navigating through an online assessment requires cognitive skills as well as technical and computer skills. This finding suggests that not all students have the same level of computer skills and understanding of technology, and is similar to that observed by other authors, who found that students were disadvantaged in terms of computer skills and technology, especially those from rural or remote areas.11,18 This could explain why participants in this study struggled with navigating an online assessment, as most of them were at home for the academic year, with some residing in rural areas.

In this study, less than 50% of study participants could complete the test within the time given. The issue of insufficient time was a major concern for students, even highlighted in the qualitative data collected. This finding is similar to a study by Khalaf et al. (2020) who found that students struggled to complete multiple choice questions in a given time frame.25 Another challenge experienced was the format of certain questions that disadvantaged students due to technical problems and not being able to go back to correct answers. This finding was consistent with that of Sadeghi (2019) who noted interruptions or other system errors appearing during the course of an online assessment.26

A further important aspect that students reported was not being able to query a question which they did not understand, as is the case with sit-down assessments on-site. This finding was consistent with a study by Hsiao and van der Watering (2020), who recommended that online assessments be made clear to students, with procedures and expectations clearly explained and related examples or sample questions given to students prior to the assessment.27 This is supported by Khalaf et al. (2020), whose students believed mock tests were acceptable and helpful.25

Some of the recommendations made by the study participants included extending the time given for online tests, reviewing the types of questions in assessment tasks, appropriate scheduling of tests and setting up mock assessments. All of these factors should be carefully considered to produce efficient online assessments and consequently to gain students' acceptance and satisfaction of online assessments.

In one of the recommendations to improve online assessments, students expressed a preference for only the multiple-choice question (MCQ) format and not the "drag and drop" or "short answer" question types. However, Gülbahar (2017) argues that multiple-choice questions are only useful for acquiring basic information about learning.28 This is further supported by Struyven et al. (2005), who reported that the multiple-choice format discourages students from studying diligently for a test, as they perceive it to be easier to prepare for since the correct answers appear among the options.29 This could paradoxically lead to students adopting a surface approach to learning rather than a deep approach.29 The surface approach to learning is described as memorising facts or reproducing content (rote learning) in an assessment, whereas the deep approach requires meaningful engagement with the content to obtain a better understanding, so that the knowledge gained can be applied in different contexts.30 Surface learning may be detrimental to a dental student, as grasping the fundamental knowledge gained in the first two years of study is required for clinical application in the final year. Therefore, other authentic assessment methods such as online assignments, case-based scenarios and clinical portfolios should be considered for a deeper assessment of learner performance, rather than relying on multiple-choice questions alone.

One suggestion for future teaching and learning practice is that online assessments be integrated into a menu of assessments used to measure student competency. Online assessments have the potential to test the lower-order thinking of remembering, understanding and applying as postulated by Bloom's taxonomy (Bloom 1956)31 and the "knows" and "knows how" levels proposed by Miller's pyramid (Miller 1990).32 This reiterates the value of online assessments, specifically when they are combined with other assessments such as clinical or oral assessments to evaluate overall student competence.

 

STUDY LIMITATIONS

This study has two limitations, the first being that it was conducted at a single training site, thus affecting the generalisability of the findings. Second, it was conducted only among students and not academics. Academics could also have experienced challenges with the sudden transition from traditional methods of assessment to online assessments; therefore, further research is required to determine their perspectives as well.

 

CONCLUSION

This study demonstrated that undergraduate dental students were familiar with online assessments and were confident about taking them. They believed that this helped them grasp all aspects of theory despite specific challenges associated with the use of online assessments. This study suggests online assessments could be a valuable method in measuring student competency of fundamental theoretical aspects of dentistry.

Authors' contributions

Dr Ilana Moodley devised the project and the main conceptual ideas. Dr Rajeshree Moodley compiled and uploaded the survey instrument and collected the data. Dr Ilana Moodley and Prof Shenuka Singh were responsible for data analysis. Dr Ilana Moodley was responsible for drafting and writing of the manuscript. All authors participated in the interpretation of data and revision of the paper. All authors read and approved the final manuscript.

Acknowledgement

None

Declaration of interest

All three authors declare they have no conflict of interest.

 

REFERENCES

1. Krishnakumar B, Rana S. COVID 19 in INDIA: Strategies to combat from combination threat of life and livelihood. Journal of Microbiology, Immunology and Infection. 2020; 53(3): 389-91. doi: 10.1016/j.jmii.2020.03.024

2. Bala L, van der Vleuten C, Freeman A, Torre D, et al. COVID-19 and programmatic assessment. Clinical Teacher. 2020; 17(4): 420-22. doi: 10.1111/tct.13207

3. Guangul FM, Suhail AH, Khalit MI, Khidhir BA. Challenges of remote assessment in higher education in the context of COVID-19: a case study of Middle East College. Educational Assessment, Evaluation and Accountability. 2020; 32(4): 519-35. doi: 10.1007/s11092-020-09340-w

4. Baleni ZG. Online Formative Assessment in Higher Education: Its Pros and Cons. Electronic Journal of E-Learning. 2015; 13(4): 228-36. https://www.learntechlib.org/p/160781/

5. Deogade SC, Naitam D. Reflective learning in community-based dental education. Education for Health (Abingdon). 2016; 29(2): 119-23. doi: 10.4103/1357-6283.188752

6. Gerhard-Szep S, Güntsch A, Pospiech P, et al. Assessment formats in dental medicine: An overview. GMS Journal of Medical Education. 2016; 33(4): Doc65. doi: 10.3205/zma001064

7. Eriksen HM, Bergdahl J, Bergdahl M. A patient-centred approach to teaching and learning in dental student clinical practice. European Journal of Dental Education. 2008; 12(3): 170-5. doi: 10.1111/j.1600-0579.2008.00518.x

8. University of KwaZulu-Natal. Portal. http://utlo.ukzn.ac.za/utop/studentsresources.aspx Accessed 03/04/2021

9. Pokhrel S, Chhetri R. A Literature Review on Impact of COVID-19 Pandemic on Teaching and Learning. Higher Education for the Future. 2021; 8(1): 133-141. doi: 10.1177/2347631120983481

10. Williams JC, Baillie S, Rhind S, et al. A Guide to Assessment in Dental Education. Version 1. November 2015, University of Bristol. Creative Commons Attribution 4.0 International License. http://creativecommons.org/licenses/by/4.0/

11. Oyedemi T, Mogano S. The Digitally Disadvantaged: Access to Digital Communication Technologies among First Year Students at a Rural South African University. Africa Education Review. 2018; 15(1): 175-191. doi: 10.1080/18146627.2016.1264866

12. Moodley R, Singh S, Moodley I. Undergraduate dental students' perspectives on teaching and learning during the COVID-19 pandemic: Results from an online survey conducted at a South African university using a mixed-methods approach. African Journal of Health Professions Education. 2022; 14(1): 26-33. doi: 10.7196/AJHPE.2022.v14i1.1482

13. Bhattacherjee A. Social Science Research: Principles, Methods, and Practices. 2012

14. Braun V, Clarke V. Using thematic analysis in psychology. Qualitative Research in Psychology. 2006; 3(2): 77-101. ISSN 1478-0887. Available from: http://eprints.uwe.ac.uk/11735

15. Dube B. Rural Online Learning in the Context of COVID 19 in South Africa: Evoking an Inclusive Education Approach. Multidisciplinary Journal of Educational Research. 2020; 10(2): 135. doi: 10.17583/remie.2020.5607

16. Rafeek R, Sa B, Harnarayan P. Rapid Transition to Online Teaching During COVID 19: Students' and Teachers' Perceptions in A Pioneer Caribbean Dental School. 2020. doi: 10.21203/rs.3.rs-84862/v1

17. van Deursen AJ, van Dijk JA. The first-level digital divide shifts from inequalities in physical access to inequalities in material access. New Media & Society. 2019; 21(2): 354-75. doi: 10.1177/1461444818797082

18. Mpungose CB. Emergent transition from face-to-face to online learning in a South African University in the context of the Coronavirus pandemic. Humanities and Social Sciences Communications. 2020; 7: 113. doi: 10.1057/s41599-020-00603-x

19. Laine K, Sipila E, Anderson M, et al. Electronic exam in electronics studies. Paper presented at the SEFI Annual Conference 2016: Engineering Education on Top of the World: Industry-University Cooperation. Tampere, Finland. Retrieved from: http://sefibenvwh.cluster023.hosting.ovh.net/wp-content/uploads/2017/09/laine-electronic-exam-in-eletronics-studies-9.pdf

20. Elmehdi HM, Ibrahem AM. Online Summative Assessment and Its Impact on Students' Academic Performance, Perception and Attitude towards Online Exams: University of Sharjah Study Case. Creative Business and Social Innovations for a Sustainable Future. 2019; 211-18. doi: 10.1007/978-3-030-01662-3_24

21. Rawlusyk PE. Assessment in Higher Education and Student Learning. Journal of Instructional Pedagogies. 2018; 1: 1-34

22. Cerimagic S, Hasan MR. Online Exam Vigilantes at Australian Universities: Student Academic Fraudulence and the Role of Universities to Counteract. Universal Journal of Educational Research. 2019; 4: 929-36. doi: 10.13189/ujer.2019.070403

23. Bawarith R, Basuhail A, Fattouh A. E-Exam Cheating Detection System. International Journal of Advanced Computer Science and Applications. 2017; 8(4). doi: 10.14569/ijacsa.2017.080425

24. Backman J. Students' experiences of cheating in the online exam environment (Bachelor's thesis, Rectors' Conference of Finnish Universities of Applied Sciences). 2019. Retrieved from https://www.theseus.fi/bitstream/handle/10024/167963/Thesis.pdf?sequence=2&isAllowed=y

25. Khalaf K, El-Kishawi M, Moufti MA, et al. Introducing a Comprehensive High-Stake Online Exam to Final-Year Dental Students During the COVID-19 Pandemic and Evaluation of Its Effectiveness. Medical Education Online. 2020; 25(1): 1826861. doi: 10.1080/10872981.2020.1826861

26. Sadeghi M. A Shift from Classroom to Distance Learning: Advantages and Limitations. International Journal of Research in English Education. 2019; 4(1): 80-88. doi: 10.29252/ijree.4.1.80

27. Hsiao YP, van der Watering GA. Guide for choosing a suitable method for remote assessment: considerations and options. University of Twente. 2020

28. Bahadir G. The Relationship between Work Engagement and Organizational Trust: A Study of Elementary School Teachers in Turkey. Journal of Education and Training Studies. 2020; 5(2): 149. doi: 10.11114/jets.v5i2.2052

29. Struyven K, Dochy F, Janssens S. Students' Perceptions about Evaluation and Assessment in Higher Education: A Review. Assessment & Evaluation in Higher Education. 2005; 30(4): 325-341. doi: 10.1080/02602930500099102

30. Donnison S, Penn-Edwards S. Focusing on First Year Assessment: Surface or Deep Approaches to Learning? The International Journal of the First Year in Higher Education. 2012; 3(2): 9-20. doi: 10.5204/intjfyhe.v3i2.127

31. Bloom BS. Taxonomy of Educational Objectives. Handbook I: Cognitive Domain. New York: David McKay, 1956

32. Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine. 1990; 65(9 Suppl): S63-7. doi: 10.1097/00001888-199009000-00045

 

 

Correspondence:
Dr Ilana Moodley
Tel: 084 920 1088
Email: moodleyil@ukzn.ac.za

 

 

Authors' contributions
1. Dr Ilana Moodley: study conceptualisation, data analysis, manuscript preparation, writing - 60%
2. Prof Shenuka Singh: data analysis and final editing - 20%
3. Dr Rajeshree Moodley: data collection, data analysis, editing - 20%

Creative Commons License: All content of this journal, except where otherwise identified, is licensed under a Creative Commons License