South African Journal of Education

On-line version ISSN 2076-3433
Print version ISSN 0256-0100

S. Afr. j. educ. vol.30 no.1 Pretoria Jan. 2010

 

Evaluation of the effectiveness of the 360-credit National Professional Diploma in Education (NPDE) programme

 

 

David Ngidi*; Patrick Sibaya; Duduzile Sibaya; Herbert Khuzwayo; Mncedisi Maphalala; Nkosinathi Ngwenya

 

 


ABSTRACT

We investigated the effectiveness of the 360-credit National Professional Diploma in Education (NPDE) as a programme aimed at upgrading currently serving unqualified and under-qualified educators, with a view to improving the quality of teaching and learning in schools and Further Education and Training colleges. To this end, the National Professional Diploma in Education Effectiveness Scale (NPDEES) and the Classroom Observation and Assessment Form (COAF) were used. The findings indicated that educators differed in the extent to which they regarded the 360-credit NPDE programme as effective. They also indicated that component 3 (competences relating to teaching and learning processes), component 1 (competences relating to fundamental learning) and component 4 (competences relating to the profession, the school and the community) were the best predictors of the effectiveness of the programme. Educators also differed in the extent to which they performed during the classroom-based evaluation. Suggestions are made for measures to improve educators' performance in the classroom.

Keywords: classroom-based evaluation; competences; components; exit level outcomes; National Professional Diploma in Education


 

 

Introduction

A large number of currently serving educators are still in possession of old certificates and diplomas. Such qualifications include, among others, Higher Primary Teachers' Certificates (HPTC) and two-year Secondary Teachers' Diplomas (STD). In terms of the Employment of Educators Act 76 of 1998 (Republic of South Africa, 1998), educators with such qualifications are no longer fully qualified to teach because they do not possess the Relative Education Qualification Value (REQV) 13, which is the current minimum professional qualification for educators. Many other educators hold Grade 12 certificates as their highest qualification but, owing to the shortage of educators, they have for a number of years been allowed to teach without a professional teaching qualification.

Currently serving under-qualified educators who hold certificates or diplomas at REQV 11 or 12 could not improve their qualifications via the new framework for professional qualifications for educators in schools, as outlined in the Norms and Standards for Educators (Republic of South Africa, 2000:26) and amended in the Higher Education Qualifications Framework (HEQF) (Department of Education, 2007). The Department of Education therefore introduced a 240-credit NPDE programme, which provides such educators with an alternative access route into the new qualification framework by enabling them to upgrade to REQV 13. The 360-credit NPDE was later introduced so that currently serving unqualified educators with an REQV 10 qualification could also be upgraded to REQV 13.

 

Conceptualisation of 360-credit NPDE

The original 240-credit NPDE, which was registered by the South African Qualifications Authority (SAQA) board in October 2000, provides access only to currently serving educators in schooling in the General Education and Training (GET) band classified as at REQV 11 and 12. There are, however, additional currently serving educators who need to be provided with such an access route. Consequently, the original NPDE has been revised. The new 360-credit NPDE provides access to the following categories of currently serving professionally unqualified or under-qualified educators:

• Professionally under-qualified Grades R-12 educators classified as at REQV 11 or 12.

• Some professionally unqualified Grades R-12 educators classified as at REQV 10.

• Professionally unqualified or under-qualified Grade R educators or Grade R educators with professional qualifications not recognized by the Department of Education.

• Professionally unqualified or under-qualified educators of technical subjects in Further Education and Training (FET) Grades 10-12 (General), and FET college educators (SAQA, 2004:2).

The original NPDE was a 240-credit qualification at National Qualifications Framework (NQF) level 5. The current NPDE is a 360-credit qualification, also at NQF level 5. The entrance requirements for the 360-credit NPDE are at least five years of teaching experience, at least an NQF level 4 qualification, and current employment as an educator at a school or technical college (SAQA, 2004:4). In-service educators, who are the main focus of this study, and who are classified as REQV 10, are credited with 120 credit points through assessment for recognition of prior learning (RPL), as prescribed in the NPDE policy document (SAQA, 2004:12). The remaining 240 credits are earned by attending formal lectures over a period of three years, on a part-time basis.

The 360-credit NPDE is aimed at providing unqualified or under-qualified educators with the opportunity to become fully qualified professionals (REQV 13). This is currently the 'qualified educator' status in terms of the Employment of Educators Act 76 of 1998. It is assumed that the NPDE will be practice-based, have a strong classroom focus and equip educators with the foundational, practical and reflective competencies required for further study at NQF level 6.

The 360-credit NPDE has exit level outcomes that are grouped into four components, and which together reflect the work of a professional educator as regulated in the Policy on Norms and Standards for Educators (Republic of South Africa, 2000). These components are outlined in the NPDE policy document (SAQA, 2004) as follows:

• Component 1: Competences relating to fundamental learning.

• Component 2: Competences relating to the subject and content of teaching.

• Component 3: Competences relating to teaching and learning processes.

• Component 4: Competences relating to the profession, the school and the community.

The exit level outcomes for component 1 are:

• Exit level outcome 1.1 (Literacy): Candidates demonstrate competence in reading, writing and speaking the language/s of instruction in ways that facilitate their own academic learning and their ability to facilitate learning in their classrooms.

• Exit level outcome 1.2 (Numeracy): Candidates demonstrate competence in interpreting and using numerical and elementary statistical information to facilitate their own academic learning and their ability to administer teaching, learning and assessment.

The exit level outcomes for component 2 are:

• Exit level outcome 2.1: Candidates demonstrate competence with regard to the knowledge base underpinning the learning areas or subjects they will be teaching.

• Exit level outcome 2.2: In their area(s) of specialisation (phase and subject/learning area), candidates demonstrate competence in planning, designing and reflecting on learning programmes as prescribed in the current national policy, which is appropriate for their learners and learning context.

The exit level outcomes for component 3 are:

• Exit level outcome 3.1: In their area of specialisation, candidates demonstrate competence in selecting, using and adjusting teaching and learning strategies in ways that meet the needs of the learners and the context.

• Exit level outcome 3.2: Candidates demonstrate competence in managing and administering their learning environments and learners in ways that are sensitive, stimulating, inclusive, democratic and well organised.

• Exit level outcome 3.3: Candidates demonstrate competence in monitoring and assessing learner progress and achievement in their specialisation.

The exit level outcomes for component 4 are:

• Exit level outcome 4.1: Candidates demonstrate a respect for and commitment to the educator profession.

• Exit level outcome 4.2: Candidates demonstrate knowledge of the core values of South African education, and competence in, and commitment to, dealing with the effects of the HIV and AIDS pandemic on and in the education system.

• Exit level outcome 4.3: Candidates demonstrate a capacity to function responsibly within the education system, a learning institution, and the community in which the institution is located.

Evidence of competences demonstrated by candidates in each exit level outcome is outlined in the NPDE policy document (SAQA, 2004).

 

Conceptualisation of classroom-based evaluation

The NPDE is practice-based and has a strong classroom focus to equip educators with the foundational, practical and reflective competencies required for further study at NQF level 6 (SAQA, 2004:5). The NPDE therefore requires evidence that educators demonstrate the applied competences of the exit level outcomes in the classroom setting. Classroom-based evaluation is conducted during school visits. Its purpose is to observe NPDE educators while they present their lessons, in order to assess their application of the expected applied competences derived from both formal lectures and RPL sessions. It also involves providing feedback and classroom support where needed.

 

Problem statement

The Department of Education in South Africa spends a substantial amount of money on the upgrading of unqualified and under-qualified educators, yet very few studies have been conducted on the effectiveness of the 360-credit NPDE as a programme for upgrading educators. Studies conducted on the NPDE programme in South Africa have dealt with learner assessment (Mothata, Van Niekerk & Mays, 2003), recognition of prior learning (Moll & Welsh, 2003), and the effectiveness of the competences of the 240-credit NPDE programme (Ngidi, 2005). Very few, if any, studies have investigated the effectiveness of the revised 360-credit NPDE programme, particularly in helping educators to achieve the expected competences. This study addresses that gap. More specifically, we attempt to answer the following research questions:

• To what extent do educators regard the 360-credit NPDE programme to be effective?

• Do educators' biographical variables have an influence on their evaluation of the effectiveness of the 360-credit NPDE programme?

• Which component(s), if any, contribute(s) most to the effectiveness of the 360-credit NPDE programme?

• To what extent do educators perform during the classroom-based evaluation in the 360-credit NPDE programme?

 

Method

Aims of research

The study was aimed at achieving the following objectives:

• To ascertain the extent to which educators regard the 360-credit NPDE programme to be effective.

• To determine whether educators' biographical variables have any influence on their evaluation of the effectiveness of the 360-credit NPDE programme.

• To determine which component(s) contribute(s) most to the effectiveness of the 360-credit NPDE programme.

• To ascertain the extent to which educators differ in performance during their classroom-based evaluation.

Hypotheses

The following theoretical hypotheses were formulated:

• Educators do not differ in the extent to which they regard the 360-credit NPDE programme to be effective.

• Educators' biographical variables (gender, teaching experience and teaching phase) have no influence on their evaluation of the effectiveness of the 360-credit NPDE programme.

• No component contributes more than the others to the effectiveness of the 360-credit NPDE programme.

• Educators do not differ in the extent to which they perform during the classroom-based evaluation.

Participants

Participants for this study were drawn from REQV 10 educators who were registered part-time for the 360-credit NPDE with the University of Zululand and who were in their second year of the three-year programme. Participation was voluntary, in keeping with research ethics, since it is unethical to force people to be involved in research against their will (Struwig & Stead, 2001:66). Sampling followed an accidental non-probability design (Table 1).

 

 

Table 1 shows the distribution of participants according to their biographical variables. Of the 158 REQV 10 educators who were involved in the 360-credit NPDE programme, 108 completed the questionnaire, while 140 were evaluated during the classroom-based evaluation; the remainder, for various reasons, were not evaluated. The difference between the 108 educators who completed the questionnaire and the 140 who were evaluated in the classroom did not affect the results of the study, because the two data sets were derived from different research instruments and were analysed and reported separately and independently of each other.

Measures

Data collection was done in two stages, using two different research instruments. In the first stage, data on the NPDE students were collected towards the end of their second year, using the questionnaire as a research instrument. In the second stage, data were collected during school visits, using the classroom observation and assessment form as a research instrument. The questionnaire was appropriate for rating educators' responses regarding the extent to which they felt that the NPDE programme had helped them to achieve the listed competences. The classroom observation and assessment form was appropriate for evaluating educators' demonstration of applied competences in the classroom. Both instruments were amenable to quantitative data analysis.

The questionnaire consisted of two sections covering the aims of the study. The first section covered the educators' biographical information, namely, gender, teaching experience, and teaching phase. The second section consisted of the National Professional Diploma in Education Effectiveness Scale (NPDEES).

National Professional Diploma in Education Effectiveness Scale (NPDEES)

Informed by the competences that educators are expected to demonstrate in the NPDE programme, the researchers developed the National Professional Diploma in Education Effectiveness Scale (NPDEES) by phrasing the competences as statements. The NPDEES is a five-point scale on which respondents were asked to indicate the effectiveness of the NPDE programme in helping them achieve the expected competences listed. The ratings were: not effective (1), slightly effective (2), effective (3), very effective (4), and extremely effective (5). The researchers also established the reliability of the NPDEES themselves. The overall internal-consistency reliability of the NPDEES in this study, using Cronbach's alpha coefficient, was 0.98, which indicates high reliability (Pieterson & Maree, 2007).
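The paper does not show how the alpha coefficient was computed (the analysis was presumably run in SPSS). As a rough illustration only, the following Python sketch computes Cronbach's alpha from a respondents-by-items matrix of ratings; the data-loading step and file name are hypothetical.

```python
import numpy as np

def cronbach_alpha(item_scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of item ratings."""
    k = item_scores.shape[1]                         # number of items (78 for the NPDEES)
    item_vars = item_scores.var(axis=0, ddof=1)      # variance of each item
    total_var = item_scores.sum(axis=1).var(ddof=1)  # variance of the summated scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical usage: `ratings` would be the 108 x 78 matrix of 1-5 NPDEES responses.
# ratings = np.loadtxt("npdees_ratings.csv", delimiter=",")   # placeholder file name
# print(round(cronbach_alpha(ratings), 2))                    # the reported value was 0.98
```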

The NPDEES consists of 78 items. The lowest possible score on this scale is 78 × 1 = 78 and the highest possible score is 78 × 5 = 390. In order to establish "low effective level category", "moderate effective level category" and "high effective level category", this continuum of 78-390 was arbitrarily divided into three categories, namely: 78-182, indicating a low effective level (LEL); 183-286, showing a moderate effective level (MEL); and 287-390, showing a high effective level (HEL). Each respondent's summated score was, therefore, classified accordingly into one of these categories. This procedure yielded data to fulfil the first aim. Data obtained through this procedure were also used together with those of the educators' biographical data in order to meet the second aim of the study.
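A minimal sketch of the banding procedure described above follows; the cut-offs and category labels (LEL, MEL, HEL) come from the text, while the function name is ours.

```python
def npdees_band(summated_score: int) -> str:
    """Classify an NPDEES summated score (78-390) into an effectiveness level."""
    if not 78 <= summated_score <= 390:
        raise ValueError("an NPDEES total must lie between 78 and 390")
    if summated_score <= 182:
        return "LEL"   # low effective level
    if summated_score <= 286:
        return "MEL"   # moderate effective level
    return "HEL"       # high effective level

# A respondent who rated every item 'effective' (3) sums to 234, which falls in the MEL band.
assert npdees_band(78 * 3) == "MEL"
```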

Total scores of the respondents on the NPDEES and on each of the four components were used to meet the third aim of the study. The internal consistency reliability for components 1, 2, 3, and 4 in this study, using Cronbach's alpha coefficient, was 0.90, 0.91, 0.96, and 0.93, respectively. An instrument with a coefficient alpha reliability estimate of 0.70 or higher is regarded as internally consistent and satisfactory (Nunnaly & Bernstein, 1994; Muijs, 2004). The internal consistency reliability coefficients obtained were therefore high (Pieterson & Maree, 2007).

Classroom Observation and Assessment Form (COAF)

The COAF was developed by the researchers and its validity was established by means of both content and face validity, through consultation with experienced lecturers involved in the teaching of the NPDE programme. The COAF consists of applied competences that educators could be expected to demonstrate in the classroom. In this study, the rating on each competence ranged from poor to excellent, and an overall mark was awarded for each individual educator's performance. The study relied on a conventional scale for categorising educators according to their performance, namely: 0-49, indicating a failure (F); 50-64, a pass (P); 65-74, a first-class pass (FC); and 75-100, a pass with distinction (D). The respondents' summated scores were classified according to these categories. This procedure yielded data to fulfil the fourth aim.
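A similar sketch for the COAF mark categories follows; only the cut-offs come from the text, and the helper function is hypothetical.

```python
def coaf_category(overall_mark: float) -> str:
    """Map an overall COAF mark (0-100) onto the conventional result categories."""
    if not 0 <= overall_mark <= 100:
        raise ValueError("a COAF mark must lie between 0 and 100")
    if overall_mark < 50:
        return "F"    # fail
    if overall_mark < 65:
        return "P"    # pass
    if overall_mark < 75:
        return "FC"   # first-class pass
    return "D"        # pass with distinction
```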

Procedures

After piloting the questionnaires with twenty educators who would not form part of the main study, the researchers personally distributed the questionnaires to the participants in schools. An explanation of the nature of the questionnaire and the purpose of the investigation preceded the completion of the questionnaires. The questionnaires were collected after two weeks. Each educator's performance in the classroom was observed by the researchers and evaluated by means of the assessment form during the school visits.

In order to achieve the aims of this study, various statistical procedures were followed. The chi-square one-sample test (Behr, 1988) was used to ascertain the extent to which educators generally regarded the NPDE programme to be effective (aim one), as well as to determine the extent to which educators differed in their performance during the classroom-based evaluation (aim four). The chi-square test of independence (Harris, 1995) was used to determine whether educators' biographical variables (gender, teaching experience, and teaching phase) had any influence on their evaluation of the effectiveness of the NPDE programme (aim two). The chi-square test is considered appropriate for categorical data (Orlich, 1978; Borg & Gall, 1983; Behr, 1988; Bless & Kathuria, 1993; Harris, 1995; Babbie & Mouton, 2001; Goddard & Melville, 2001).
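The software commands used for these tests are not reported; as an illustrative sketch only, the same two chi-square procedures can be run in Python with SciPy. The frequency tables below are not the study's raw data: the LEL/MEL/HEL counts are reconstructed from the percentages reported in the Results (2%, 48% and 50% of the 108 respondents), while the gender-by-level table is purely hypothetical and shows only the call signature.

```python
from scipy.stats import chisquare, chi2_contingency

# One-sample (goodness-of-fit) test: do the effectiveness-level frequencies
# depart from an equal split across LEL, MEL and HEL?
observed_levels = [2, 52, 54]          # counts reconstructed from the reported 2%/48%/50% of 108
stat, p = chisquare(observed_levels)   # expected frequencies default to an equal split
print(f"one-sample test: chi-square = {stat:.3f}, p = {p:.4f}")  # reproduces the reported 48.222

# Test of independence: gender (rows) against effectiveness level (columns).
# These cell counts are invented purely for illustration.
gender_by_level = [[1, 20, 22],
                   [1, 32, 32]]
stat, p, df, expected = chi2_contingency(gender_by_level)
print(f"independence test: chi-square = {stat:.3f}, df = {df}, p = {p:.4f}")
```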

The third aim of this study was to determine which component(s) contribute(s) most to the effectiveness of the NPDE programme. To this end, a stepwise regression analysis was used. Stepwise regression analysis is typically used to determine which independent variables are useful in predicting the dependent variable: the computer program (SPSS 16.0 for Windows) searches for the order in which the best predictor (independent) variables are to be entered into the regression equation, so that several predictor variables stand on one side of the equation and a single dependent variable on the other (Borg & Gall, 1983; Tabachnick & Fidell, 1989; Pieterson & Maree, 2007). In this study, the total score on the NPDEES was used as the dependent variable, whilst components 1, 2, 3, and 4 were used as predictor (independent) variables.
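The stepwise analysis was run in SPSS 16.0. The sketch below is a simplified forward-selection analogue in Python with statsmodels (entry by lowest p-value only, without SPSS's removal step), included purely to illustrate the idea; the column names comp1-comp4, total and the file name are hypothetical.

```python
import pandas as pd
import statsmodels.api as sm

def forward_selection(X: pd.DataFrame, y: pd.Series, alpha_enter: float = 0.05) -> list:
    """Greedily add the predictor with the smallest entry p-value while it stays below alpha_enter."""
    selected, remaining = [], list(X.columns)
    while remaining:
        pvalues = {}
        for candidate in remaining:
            model = sm.OLS(y, sm.add_constant(X[selected + [candidate]])).fit()
            pvalues[candidate] = model.pvalues[candidate]
        best = min(pvalues, key=pvalues.get)
        if pvalues[best] >= alpha_enter:
            break
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical usage with component subtotals as predictors and the NPDEES total as criterion:
# data = pd.read_csv("npdees_scores.csv")   # columns: comp1, comp2, comp3, comp4, total
# order = forward_selection(data[["comp1", "comp2", "comp3", "comp4"]], data["total"])
# final = sm.OLS(data["total"], sm.add_constant(data[order])).fit()
# print(order, round(final.rsquared, 2))    # the paper reports components 3, 1 and 4 entering
```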

 

Results

The chi-square test (χ2 = 48.222; df = 2; p < 0.05) indicated that a significant difference existed among the low effective level (LEL), moderate effective level (MEL) and high effective level (HEL) groups (Table 2). This finding showed that educators differed in the extent to which they regarded the 360-credit NPDE programme to be effective in helping them achieve the expected competences; the differences among the three groups were not due to chance factors. Therefore, the null hypothesis was rejected.

 

 

Table 3 shows that no significant difference exists between male and female educators' views on the effectiveness levels of the 360-credit NPDE programme. This finding showed that gender had no influence on educators' evaluation of the effectiveness of the programme: any apparent difference between the groups was not statistically significant. Therefore, the null hypothesis was not rejected.

 

 

Table 4 indicates no significant differences among the 5-9, 10-14, 15-19, and 20-and-above years of teaching experience categories with regard to the effectiveness of the 360-credit NPDE programme. This finding showed that teaching experience had no influence on educators' evaluation of the effectiveness of the programme: any experience-related differences across the three effectiveness levels were due to chance factors. Therefore, the null hypothesis was not rejected.

 

 

Table 5 indicates no significant difference among the Foundation Phase, Intermediate Phase, and Senior/FET Phase groups of educators. This finding showed that teaching phase had no influence on educators' evaluation of the effectiveness of the 360-credit NPDE programme. Any teaching phase-related differences pertaining to the three effectiveness levels were due to chance factors. Therefore, the null hypothesis was not rejected.

 

 

Table 6 shows that component 3 (competences relating to teaching and learning processes), component 1 (competences relating to fundamental learning) and component 4 (competences relating to the profession, the school and the community) emerged as significant predictors of the effectiveness of the 360-credit NPDE programme (F = 2078.173, F = 3321.437 and F = 8205, respectively; all p < 0.001). Component 3 alone explained the largest proportion of the variance, namely 95%; components 3 and 1 together explained 98%; and components 3, 1 and 4 together explained 99% of the variance. Component 2 (competences relating to the subject and content of teaching) did not emerge as a significant predictor of the effectiveness of the 360-credit NPDE programme. Therefore, the null hypothesis that no component contributes more than the others to the effectiveness of the 360-credit NPDE programme was rejected.

 

 

The chi-square test (χ2 = 89.543; df = 3; p < 0.05) indicated that a significant difference was found among the four groups of educators, namely, the fail (F), pass (P), first class (FC) and distinction (D) groups (Table 7). This finding showed that the educators differed in the extent to which they performed during the classroom-based evaluation, and that the differences among the four groups were not due to chance. Therefore, the null hypothesis was rejected.

 

 

Discussion

The findings revealed that educators differed in the extent to which they regarded the 360-credit NPDE programme to be effective. A higher percentage of educators (50%) reported a high level of effectiveness compared to those who reported a moderate level (48%) and those who reported a low level (2%) (Table 2). These findings are broadly in accordance with those of Ngidi (2005), although in that study of the 240-credit NPDE programme 70% and 30% of the educators reported high and moderate effectiveness levels, respectively, and none reported a low effectiveness level.

The difference in the reported effectiveness levels for the 360-credit and the 240-credit NPDE programmes may be explained as follows: firstly, the present study was conducted in the second year of the three-year 360-credit programme, before all the modules had been taught, whereas the 240-credit study was conducted in the final year, when all modules had been completed; secondly, the 240-credit group already held a professional qualification whereas the 360-credit group did not.

The findings also indicated that educators' biographical variables, namely, gender, teaching experience, and teaching phase, had no influence on their evaluation of the effectiveness of the 360-credit NPDE programme. This implies that educators' general evaluation of the effectiveness of the programme applied to all categories of educators, regardless of biographical variables. These findings are in accordance with those reported in a similar study by Ngidi (2005), which concluded that gender, teaching phase, and qualifications had no influence on the respondents' evaluation of the effectiveness of the 240-credit NPDE programme.

The findings furthermore indicated that component 3 (competences relating to teaching and learning processes), component 1 (competences relating to fundamental learning) and component 4 (competences relating to the profession, the school and the community) were the best predictors of the effectiveness of the 360-credit NPDE programme. The implication is that these components can be used to predict the effectiveness of the 360-credit NPDE programme. These findings are contrary to those reported in a similar study (Ngidi, 2005), which revealed that component 2 (competences relating to the subject and content of teaching) and component 1 were the only significant predictors of the effectiveness of the 240-credit NPDE programme. In that study, Ngidi (2005) found that component 3 and component 4 could not predict the effectiveness of the 240-credit NPDE programme.

Lastly, the findings revealed that educators differed in the extent to which they performed during the classroom-based evaluation. A higher percentage of educators (53.6%) obtained an ordinary pass compared to those who passed with first class (32.9%), those who passed with distinction (12.1%) and those who failed (1.4%). These findings concur with those of Ngidi (2005) in showing that a far higher percentage of educators tend to pass the classroom-based evaluation than fail it. However, the fact that the majority passed does not mean that they experienced no deficiencies in the classroom evaluation: only about a third gained a first-class pass, and an even smaller proportion obtained distinctions.

 

Conclusion

The findings indicate that a relatively high percentage of educators reported a high level of programme effectiveness and also passed the classroom-based evaluation. It may therefore be deduced that educators were satisfied with the assistance that the 360-credit NPDE programme offered them and that they had, to a large extent, been equipped with the competences they needed for teaching as a career. However, despite the relatively high number of passes in the classroom evaluation, the overall outcome of the study suggests that the educators need further assistance. This conclusion is supported by the low percentage of first-class passes, the even lower proportion of distinctions, and the fact that a small number failed the evaluation. Effective supervision, together with continuous support and guidance from trained mentors at the educators' own schools, may therefore play a major role in this regard, as may ongoing workshops focusing on the demonstration of applied competences in the classroom. It is proposed that similar research be conducted at the end of the programme so that more light can be shed on these findings.

 

Acknowledgements

We acknowledge the financial support from Finland, as part of the South Africa-Finland Co-operative Programme, and from the Higher Education Quality Committee (HEQC).

 

References

Babbie E & Mouton J 2001. The practice of social research. Cape Town: Oxford University Press.

Behr AL 1988. Empirical research methods for the human sciences. Durban: Butterworth.

Bless C & Kathuria R 1993. Fundamentals of social statistics: An African perspective. Cape Town: Juta & Co. Ltd.

Borg WR & Gall MD 1983. Educational research. New York: Longman.

Department of Education 2007. The Higher Education Qualifications Framework. Government Notice No. 30353. Pretoria: Government Printer.

Goddard W & Melville S 2001. Research methodology: An introduction, 2nd edn. Lansdowne: Juta.

Harris MB 1995. Basic statistics for behavioural science research. Boston: Allyn & Bacon.

Moll I & Welsh T 2003. RPL in teacher education: Lessons being learned from the National Professional Diploma in Education. Paper presented at the 2nd National RPL Conference, 28-30 July, Pretoria.

Mothata S, Van Niekerk L & Mays T 2003. Learner assessment in practice: Lessons from the NPDE. Perspectives in Education, 21:81-98.

Muijs D 2004. Doing quantitative research in education with SPSS. London: Sage Publications.

Ngidi DP 2005. Evaluation of the effectiveness of the competences of the NPDE programme. South African Journal of Education, 25:34-37.

Nunnaly JC & Bernstein IH 1994. Psychometric theory, 3rd edn. New York: McGraw-Hill.

Orlich DC 1978. Designing sensible surveys. New York: Redgrave Publishing Company.

Pieterson J & Maree K 2007. Standardisation of a questionnaire. In: D Maree (ed.). First steps in research. Pretoria: Van Schaik Publishers.

Republic of South Africa 2000. Norms and Standards for Educators. Government Gazette, Vol. 415, No. 20844. Pretoria: Government Printer.

Republic of South Africa 1998. Employment of Educators Act No. 76 of 1998. Government Gazette, Vol. 400, No. 19320. Cape Town: Government Printer.

South African Qualifications Authority (SAQA) 2004. The National Professional Diploma in Education. Pretoria: Government Printer.

Struwig FW & Stead GB 2001. Planning, designing and reporting research. Cape Town: Maskew Miller Longman.

Tabachnick BG & Fidell LS 1989. Using multivariate statistics. New York: Harper & Row.

University of Zululand 2007. Strategic plan for research development. KwaDlangezwa: University of Zululand.

 

 

Authors
David Ngidi is Associate Professor in Educational Psychology, Head of the Department of Curriculum and Instructional Studies, and Vice-Dean (Research and Community Engagement) at the University of Zululand. His research areas include psychology of teaching and learning, attitudes, personality, curriculum, and teacher education.
Patrick Sibaya is Deputy Vice-Chancellor and former Professor and Head of the Department of Educational Psychology at the University of Zululand. His research interests focus on attitudes, learning disabilities, learning processes and problem solving. He is a registered psychologist.
Duduzile Sibaya is Acting Head of the Department of Mathematics, Science and Technology at the University of Zululand. Her research focuses on mathematics, teaching and curricular development.
Herbert Khuzwayo is Lecturer in the Department of Mathematics, Science and Technology at the University of Zululand and has taught mathematics for the past 15 years. His research interest is mathematics education.
Mncedisi Maphalala is Lecturer in the Department of Curriculum and Instructional Studies at the University of Zululand. His research interests are curriculum, assessment, and environmental education.
Nkosinathi Ngwenya is Lecturer in the Department of Mathematics, Science and Technology at the University of Zululand. He has been a co-ordinator for the National Professional Diploma in Education (NPDE) since 2002 and has taught science for the past 14 years.

 

 

*dngidi@pan.uzulu.ac.za
