Journal of the South African Institution of Civil Engineering

On-line version ISSN 2309-8775
Print version ISSN 1021-2019

J. S. Afr. Inst. Civ. Eng. vol.61 no.1 Midrand Mar. 2019

http://dx.doi.org/10.17159/2309-8775/2019/v61n1a4 

TECHNICAL PAPER

 

Eastern Cape employers' views on the strengths and weaknesses of civil engineering diplomates entering the workplace

 

 

F Gerber; K J Mammen


 

 


ABSTRACT

Employer perceptions in the Eastern Cape Province of South Africa were reviewed to determine whether universities within the province produce civil engineering diplomates who meet industry's expectations. A questionnaire containing 30 questions (closed and open-ended) was designed to allow employers to assess the strengths and weaknesses of the students' academic training. It was distributed to a purposive sample of 546 employers using a web-based survey approach. Sixty-five percent of the recipients who opened the invitation completed the questionnaire, and the survey yielded an overall response rate of 28% (based on invitations sent). The data gathered was statistically analysed. The results indicate that, when employers are considered as a single, comprehensive cohort, they deem the competencies of diplomates to be "neither high nor low". Within employer-representative subgroups, however, statistically significant differences were observed. Of the competencies assessed, "individual and teamwork ability" received the highest and "engineering design ability" the lowest rating. Overall, the results indicate that employers within the Eastern Cape Province are reasonably satisfied with the competencies of newly qualified diplomates, but point to the need to strengthen the "engineering design ability", "professional and technical communication ability" and "problem-solving ability" competencies within the diploma programme curriculum.

Keywords: employer survey, engineering competencies, graduate attributes, engineering education, civil engineering diploma


 

 

INTRODUCTION AND BACKGROUND

The role of technology in the economy has resulted in engineering education becoming an important area of research. Both universities and employers have a desire to develop engineering graduates and diplomates possessing the competencies required by employers for a competitive and technology-driven global environment (Klein-Gardner & Walker 2011; Singer et al 2012; Besterfield-Sacre et al 2014). Government values the role of engineering professionals who contribute to the social, economic and human upliftment of South Africa (The Presidency of South Africa 2010; Taylor 2015). A study by the Engineering Council of South Africa (ECSA 2010) indicates that there is a severe shortage of engineering professionals (engineers, technologists and technicians) per capita in South Africa, in comparison with developed countries (Lawless 2007). There are approximately 1.4 technicians for every engineer in South Africa (Du Toit & Roodt 2008; Wolff 2017). Du Toit and Roodt (2008) go on to state that, for developed countries, the ideal ratio would be four technicians to two technologists for every one engineer, and that both ECSA and the Engineering Association of South Africa (ECSA & EASA 1995) deem this ratio to be four technicians per one technologist and one engineer. A lack of civil engineering graduates and diplomates has been identified as a contributing factor in ongoing poor service delivery at municipal level (Lawless 2007; Du Toit & Roodt 2008; Lawless 2011; Watermeyer & Pillay 2012). It is, therefore, not surprising to find that the civil engineering technician is amongst the occupations highest in demand in South Africa (Department of Higher Education and Training 2016).

Increasing the output of civil engineering graduates and diplomates who possess the contemporary sets of knowledge, skills and attitudes to respond to the modern-day workplace and a competitive economic environment is, therefore, an important educational objective in South Africa (Lawless 2007; Lawless 2011; Watermeyer & Pillay 2012; Taylor 2015; Naicker 2017). Engineering faculties in South Africa, while striving to increase their output, are being placed under increased scrutiny from industry to determine whether their graduates and diplomates have the range and level of competencies required in the workplace. Studies conducted by the Council for the Built Environment (CBE 2014) and the online survey study by the South African Institution of Civil Engineering (SAICE 2016) reflect the desire to better understand the perceived gap between graduate and diplomate competencies and the expectations of industry. The findings of the CBE study were that graduates were not "work ready" and that they lacked the skills and attributes required by employers. The report also recommended that additional research be conducted to establish the "competence situation" within the specific context of the industry (CBE 2014). The SAICE survey, in turn, sought the views of industry regarding the quality of the programmes offered by the various engineering departments at both universities and universities of technology, as well as the extent of employment offered to graduates and diplomates from the respective institutions. SAICE's 2017 president, Sundran Naicker, deduced that the survey findings indicated that tertiary level graduates and diplomates from certain institutions were considered to be better qualified than their counterparts from other institutions (Naicker 2017).

South African regulatory framework and its implications for engineering education

The Higher Education Act of 1997 assigns overall responsibility for quality assurance in higher education to the Council on Higher Education (CHE), through its permanent sub-committee, the Higher Education Quality Committee (HEQC). Criteria 18 and 19 of the HEQC's Handbook on Programme Accreditation Criteria specify, amongst others, that surveys, reviews and impact studies on the effectiveness of all programmes need to be conducted, and that the results need to be used to improve the programmes' design and delivery (CHE 2012). In 2006, the CHE and HEQC entered into a Memorandum of Understanding (MoU) with ECSA, whereby ECSA was delegated to undertake the quality assurance functions of the HEQC with regard to the undergraduate engineering programmes. ECSA, therefore, has the statutory responsibility of accrediting the engineering programmes offered at undergraduate level and conducts regular accreditation evaluation visits to institutions that offer engineering programmes. One of the reasons for this, as cited in the ECSA policy on accreditation inspections, is to establish "... whether the Diplomates and Graduates from the respective programmes are ready for employment ..." (ECSA 2013: 5).

Accreditation criteria consider the structure, learning outcomes, educational process, resourcing and sustainability of the programme to determine if the quality of the programme is adequate. This means, inter alia, that both the programme and the work performed by the students must meet the specified ECSA Exit Level Outcomes (ELOs) applicable for that programme. ECSA has developed quality assurance documents that contain standards, criteria, policies and procedures that the education provider must comply with in order to secure accreditation (ECSA 2013). These quality assurance documents include a Higher Education Qualifications Sub-Framework (HEQSF)-compliant Diploma qualification in engineering (ECSA 2016). It is, therefore, clear that the HEQC and ECSA, in fulfilling their functions as accreditation bodies, require the curriculum and outcomes of engineering undergraduate programmes to be suitably aligned to the competency requirements and expectations of employers. The importance of accreditation, as an independent quality assurance process, cannot be overstated. It is, after all, there to assure the public of the quality of these programmes (ECSA 2013: 5). Some engineering programmes, however, only achieve full accreditation after multiple ECSA visits, as a result of deficiencies not being adequately addressed. In some instances programme deficiencies are only resolved once the professional body notifies the institution of its intention to withdraw programme accreditation. It is, therefore, questionable whether the awarding of accreditation for a programme is, indeed, sufficient for employers to be assured that graduates and diplomates are both ready for the workplace and sufficiently equipped to continue with their learning within the profession.
Ballim et al (2014) point out that both the curriculum and the teaching and learning process at a university could be compromised if meeting the requirements of an external instrument, such as accreditation, becomes the primary focus of teaching. ECSA's accreditation policy states that universities have an obligation to "assess the impact of the programme and to show how the results are used to improve the programme" (ECSA 2014: 8). There is, however, no stipulation in the ECSA document as to how the impact of the programme must be assessed. Whilst employer representatives give direct input on the programme during the accreditation process, universities often fail to provide evidence on the impact of their programmes, despite this being ECSA and HEQC requirements (CHE 2012; ECSA 2014). Providing information on the impact of these programmes is particularly important to historically disadvantaged institutions, as it is mostly programmes from these institutions that are criticised for the poor quality of tuition (Cape Higher Education Consortium (CHEC 2013)), resulting in the perception that graduates and diplomates from these institutions are the least preferred candidates when seeking employment in industry (Naicker 2017).

The aim of this study was to gather and assess the perceptions of civil engineering employer representatives on the competencies of civil engineering diplomates entering the workplace from the two universities in the Eastern Cape Province which offer the Diploma in Civil Engineering. Employer studies are considered useful instruments that can contribute to understanding the links between learning, competencies and job requirements. This paper makes a contribution to the body of knowledge on engineering education, given the limited research that has been conducted on employers' perceptions regarding the readiness of civil engineering diplomates entering the workplace. The empirical data collected and the results will assist universities and employers to develop diplomates who possess the competencies required for a competitive and technologically driven global environment. These results will enhance universities' awareness of public expectations for higher education diplomates to be more directly prepared for the workplace.

 

LITERATURE REVIEW

Employer surveys have been widely used by universities as part of the quality assurance processes to identify and address possible deficits within their educational programmes (Allahverdi & Aldowaisan 2015; Elrod et al 2015; Gwyn & Gupta 2015). Other international studies that explored how well graduate competencies and industry needs were aligned include those of Nielsen (2000) in Australia, Besterfield-Sacre et al (2014) in the United States, Ridgman and Liu (2014) in the United Kingdom and China, Peng et al (2016) in China, and Vadivu et al (2016) in India.

Walther and Radcliffe (2007: 42) state that "... competence can be conceptualised as an iceberg where the skill and knowledge domain form the tip, visible above the waterline, and traits, self-conception and motives make up the base." They argue that universities concentrate on the parts of the competency iceberg that are above the waterline, whilst companies tend to focus on the parts of the competency iceberg below the waterline. Panels of experts tend to define graduate attributes for engineering programmes as broad aspirational goals that only point in a general direction, whereas industry, in contrast, has a more detailed competency profile for each particular position in the organisation. This study draws from Holland's theory of vocational behaviour, which suggests that each environment, whether it be a work environment or an academic discipline, has a distinctive pattern of competencies, values, attitudes, interests, and self-perceptions. The importance of understanding what the competencies are that are valued in the workplace is highlighted in the studies undertaken by Riordan and Goodman (2007), Passow (2012), and Passow and Passow (2017).

Additional research conducted in South Africa, and which overlaps with this study, includes that of Griesel and Parker (2009) who undertook a baseline study of South African graduates from the perspective of the employer. Their findings indicated, amongst others, that there was a real need to address the disparity between employer expectations and higher education outcomes. Ngetich and Moll (2013) focused on the relationship between industry and newly graduated engineers, and examined the effectiveness and efficiency of graduates in the workplace. Hauschildt and Brown (2013), in their study, described a competence diagnostics project that focused on assessing the competence of students who had completed their vocational engineering qualifications. Kraak and Du Toit (see CHEC 2013) conducted a study on behalf of the Cape Higher Education Consortium in order to determine the levels of "graduate employment and unemployment" as well as identify the different pathways from higher education into the world of work (CHEC 2013).

All of the above studies point to an increased public expectation for higher education graduates and diplomates to be better prepared for the world of work. Employer studies are, therefore, useful instruments which can contribute to understanding the links between learning, competencies and job requirements (Teichler & Höhle 2013). This study was initiated in an attempt to explore these links. An overview of these studies indicates that employer surveys have been widely used as part of the quality assurance processes at universities in order to identify and address possible deficits within their educational programmes, and to determine how well graduate competencies and the needs of industry are aligned. In terms of competencies, universities concentrate on the skill and knowledge domain, while companies tend to focus on traits, self-conception and motives. South African studies on employers' perspectives indicated, amongst others, that there is a real need to understand and address the gaps between employer expectations and higher education outcomes, so that graduates and diplomates are more effective and efficient when they enter the workplace.

 

METHODOLOGY

This section discusses the methods and techniques used to collect and analyse the data for the study, i.e. the sample population, data collection procedures and data analysis instruments.

Sampling

Purposive sampling was adopted to elicit responses from members of the three Eastern Cape branches of SAICE, as well as members from the two branches of the Institute of Municipal Engineering of Southern Africa (IMESA) in the Eastern Cape Province. The selection was based on the experience and professional profile information as contained in the branch member databases of SAICE and IMESA, respectively. A total of 546 employer representatives were selected for the sample.

Instrument

A descriptive and evaluative questionnaire was developed as the data collection instrument. The questionnaire was created using Survey Monkey (http://www.surveymonkey.com). The instrument had a total of 30 questions, consisting of multiple choice, Likert scale and open-ended questions. It included questions on the regional/geographical location of the respondent, the type and size of organisation, academic qualifications, professional status, field(s) of specialisation, job designation, gender, age, years of experience and race. The invitation, which was sent to the participants, included both an information page and a link to the online survey. The respondents were asked to rate the competencies of civil engineering diplomates, who had qualified from Eastern Cape tertiary institutions, when entering the workplace.

Piloting

A pilot study was conducted with five experienced employer representatives. Following the completion of the questionnaires, a face-to-face discussion was conducted with the pilot study participants to obtain verbal feedback. The data collected was used to enhance the face and construct validity of the questionnaire.

Data collection

The questionnaire was distributed to the sample (n = 546) through email messages, with a web-based link to the questionnaire. The email message also contained a detailed information sheet that explained the purpose of the survey, ethical considerations and a request for consent to willingly participate in the survey. The respondents were provided with the opportunity to opt out of the survey, either at the start or during the survey.

Response rate

A total of 151 responses were received, which resulted in a response rate of 27.66%. A follow-up reminder was sent to everyone who had not yet responded within the stipulated return date, and this was followed by a final reminder thereafter. The results indicated that 64.53% of those who opened the mail message went on to complete the questionnaire (fully or partially), although some opted out. The high number of unopened emails can possibly be attributed to the firewalls and filters that many organisations have, given that such firewalls can identify messages sent in bulk as spam mail. As pointed out by a participant in the sample group, such spam mail is often not seen by recipients, as they do not always open their spam mail.

Data analysis

The data was analysed for descriptive statistics using the software Statistical Package for the Social Sciences (SPSS) Version 20. The results of the closed-ended questions were presented using descriptive statistics such as frequency tables and charts which were plotted using SPSS and Microsoft Excel. Comparisons of Likert scale variables were cross-tabulated and the categorical data sets were analysed in terms of the Shapiro-Wilk normality test, followed by Mann-Whitney, Kruskal-Wallis statistical tests and the Dunn-Bonferroni post hoc test of pairwise comparisons. Open-ended questions were reported in descriptive statements. A summary of selected results is given below.

 

RESULTS

Table 1 shows the distribution of both the invitations and the responses received. Of the 151 responses received, 124 were complete and 27 partly completed.

The overall response rate of 27.66% is deemed satisfactory, given that a response rate of between 10% and 20% of the population in descriptive research is considered to be reasonable (Gay et al 2011). Archer (2008) pointed out that, if the primary goal of the survey was to measure quality, then low response rates may yield just as meaningful results if a reasonable breadth and range of the population sample was reached, as was the case in this study.
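The response-rate figures reported above can be reproduced with simple arithmetic. The sketch below uses only the numbers stated in the text; the count of opened invitations is an estimate implied by the 64.53% completion figure, not a number reported in the paper.

```python
# Response-rate arithmetic using the figures reported in the text
invitations = 546
responses = 151

response_rate = responses / invitations * 100
print(f"Overall response rate: {response_rate:.2f}%")  # 27.66%

# 64.53% of those who opened the e-mail completed the questionnaire,
# which implies roughly this many opened invitations (an estimate):
opened = round(responses / 0.6453)
print(f"Estimated opened invitations: {opened}")  # 234
```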

Employment of civil engineering diplomates

It was deemed useful to establish, firstly, how many civil engineering diplomates from each of the two universities in the Eastern Cape had been employed by the respondents in the last five years. The results of the number and distribution of employment provided by the respondents are shown in Figures 1 and 2.

The average number of diplomates employed by the respondents from universities 1 and 2 was 4.98 and 5.01, respectively. This indicates that, during the five-year period (2012-2016), the respondents recruited diplomates at an almost equal rate from the two Eastern Cape universities that offer civil engineering diplomas.

Rating of graduate competencies

The question that was posed to participants for this section was: "On average, how would you rate the competency of diplomates from universities within the Eastern Cape (i.e. diplomates from University 1 and University 2) in the following areas?" The respondents were required to rate each of the following competencies:

Ability to apply scientific and engineering knowledge

Ability to conduct/investigate well-defined problems

Ability to use appropriate engineering methods, skills, tools and Information Technology

Engineering design ability

Independent learning ability

Individual and teamwork ability

Problem-solving ability

Professional and technical communication ability

Show an understanding of the impact that engineering activities can have on society

Understanding of and alignment with engineering professionalism

Understanding of workplace practices.

The above competencies are similar to those contained in the ECSA Qualification Standard (ECSA 2016). However, for practical reasons, abbreviated wording was used, given that the full ECSA descriptions are quite lengthy. The respondents had to make a choice on the competency of diplomates based on the following rating scale: "very low", "low", "neither high nor low", "high" and "very high". In order to assess the importance of the competencies (on the basis of the responses), the ratings were treated as a five-point ordinal Likert scale, with a linear allocation of values (1, 2, 3, 4, 5) given to the rating increments from "very low" to "very high". It is acknowledged that it cannot be stated for certain that the interval between "very low" and "low" is equivalent to the interval between "low" and "neither high nor low". The practice of treating ordinal data as though it were interval-level data, and conducting statistical tests that are appropriate for interval-level data, is not uncommon in research in the social sciences, and may be permissible if the statistical analytical procedure that one is considering, as well as the substantive meaning and the interpretability of the statistics one is computing, is based on informed, sound judgement (Virginia Tech n.d.; Griesel & Parker 2009; Nic & Rarr 2013).
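The linear coding of the rating scale described above can be sketched as follows. The sample responses are illustrative only, not the study's data:

```python
# Mapping the five-point rating labels used in the survey to the
# ordinal values 1..5, then computing a mean rating per competency.
import statistics

SCALE = {
    "very low": 1,
    "low": 2,
    "neither high nor low": 3,
    "high": 4,
    "very high": 5,
}

# Illustrative responses for one competency (not the study's data)
ratings = ["high", "neither high nor low", "low", "high", "very high"]
coded = [SCALE[r] for r in ratings]
print(statistics.mean(coded))  # 3.6
```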

The results were analysed using a two-prong approach consisting of an analysis based on the overall cohort response, as well as an analysis based on subgroup responses. The respective results obtained are indicated below.

Graduate competency ratings based on overall response

The mean ratings of respondents on the competency of civil engineering diplomates are presented in Figure 3.

The overall mean of eleven ratings was 3.07. The "individual and teamwork ability" competency had the highest rating at 3.36. The competencies of "show understanding of the impact that engineering activities can have on society" and "ability to use appropriate engineering methods, skills, tools and Information Technology" were the competencies with the second and third highest ratings. The "engineering design ability" competency received the lowest rating at 2.87, followed by the competencies of "professional and technical communication ability" and "problem-solving ability". The top and bottom ratings broadly concur with the findings of Nielsen (2000), and Griesel and Parker (2009), who also reported low competency ratings for newly qualified graduates and diplomates in the areas of communication, understanding workplace practices and problem-solving abilities.

Statistical testing was performed to establish whether the distribution of the data met the statistical requirements of normality. Establishing whether the data meets the criteria for normality is necessary to ensure that the appropriate statistical testing method is used when analysing the data further. Table 2 shows the results of skewness, kurtosis and the Shapiro-Wilk normality test, which are three statistical methods to assess normality. Skewness relates to the symmetry of the distribution (or lack thereof), while kurtosis indicates the pointiness (degree of peakedness) of the distribution. In a perfectly normal distribution the values for skewness and kurtosis would be zero. If the statistic for skewness or kurtosis is not zero, a z-score can be calculated as follows:

z = S / SE_S (for skewness) and z = K / SE_K (for kurtosis)

where S and K are the skewness and kurtosis statistics, and SE_S and SE_K are their respective standard errors.

A z-score greater than 1.96 or less than -1.96 is significant at p < 0.05, which means that the distribution of such data does not meet the criteria of normality. The z-scores calculated indicate that some, but not all, of the data sets considered met the criteria of normality.
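This z-score check can be sketched as follows. The data is randomly generated for illustration, and the large-sample standard-error formulas used here are the ones commonly applied to skewness and kurtosis; they are an assumption, since the paper does not state which forms SPSS was configured to report.

```python
# Z-scores for skewness and kurtosis (illustrative data, not the study's)
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(loc=3.0, scale=0.8, size=120)  # ratings-like sample

n = len(x)
S = stats.skew(x)
K = stats.kurtosis(x)  # excess kurtosis; 0 for a normal distribution

# Commonly used large-sample standard errors (an assumption here)
se_skew = np.sqrt(6 * n * (n - 1) / ((n - 2) * (n + 1) * (n + 3)))
se_kurt = 2 * se_skew * np.sqrt((n * n - 1) / ((n - 3) * (n + 5)))

z_skew = S / se_skew
z_kurt = K / se_kurt

# |z| > 1.96 would indicate a significant departure from normality (p < 0.05)
print(z_skew, z_kurt)
```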

In addition to skewness and kurtosis, the data sets were subjected to the Shapiro-Wilk normality test. This test compares the scores in the sample to a normally distributed set of scores with the same mean and standard deviation. The null hypothesis for this test is that the "sample distribution is normal" with the test significant if p < 0.05. The Shapiro-Wilk test results, as shown in Table 2, have p values below 0.05 for all the data sets. This signifies a rejection of the null hypothesis, which implies that none of the data sets should be treated as normally distributed and non-parametric statistical testing should be applied in the analysis of the data.
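The Shapiro-Wilk decision rule described above can be sketched as follows, on illustrative data. Discrete, bounded five-point Likert responses typically fail the test, consistent with the paper's finding that all data sets had p < 0.05:

```python
# Shapiro-Wilk normality test on illustrative Likert-style data
import numpy as np
from scipy import stats

likert = np.array([3, 4, 3, 2, 5, 3, 4, 3, 3, 2, 4, 3, 1, 3, 4, 3, 2, 4, 3, 3])

w_stat, p_value = stats.shapiro(likert)
if p_value < 0.05:
    print("Reject H0: treat as non-normal -> use non-parametric tests")
else:
    print("Cannot reject H0: normality assumption holds")
```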

Figure 4 shows the distribution of ratings for the "individual and teamwork ability" competency, which is the competency with the highest mean.

Figure 5 shows the distribution of ratings for the "engineering design ability" competency, which is the competency with the lowest mean.

Diplomate competency ratings based on subgroup responses

The respondents were divided into sub-groups, consisting of various categories which were drawn from the demographics collected in the questionnaire, in order to establish whether there were any significant differences in the ratings within a particular group. The compilation of these subgroups was based on regional location, organisation type, organisation size, academic qualifications, professional status, field(s) of specialisation, job designation, gender, age, race and years of experience.

Non-parametric statistics were used for the analysis of the data sets. The results were further analysed to establish whether there were significant differences in the responses provided in terms of the respective subgroups, which included regional location of the respondent, the type of organisation, the size of their organisation, their academic qualifications, professional status, field(s) of specialisation, job designation, gender, age, years of experience, and race.

Non-parametric statistics, which do not assume normality in the distribution of the data, were used to compare the ordinal variables (competencies) with the categorical variables (subgroups). Where the categorical variable consisted of two independent groups, e.g. "gender" (female and male) and "race" (black and white), the Mann-Whitney test was used, and instances where the categorical variable consisted of three or more independent groups, the Kruskal-Wallis test was used. Both tests make use of ranked data. A significance level of 5% was used in all hypothesis tests, which meant that if p < 0.05, the null hypothesis that "there are no differences between the distributions of competencies among the subgroup", was rejected.
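The choice of test described above (Mann-Whitney U for two groups, Kruskal-Wallis for three or more) can be sketched as follows. The group data is illustrative, not the study's:

```python
# Non-parametric group comparisons on illustrative rating data
from scipy import stats

# Two independent groups (e.g. a two-category subgroup): Mann-Whitney U
group_a = [3, 4, 2, 3, 5, 3, 4]
group_b = [2, 3, 2, 1, 3, 2, 3]
u_stat, p_two = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

# Three or more independent groups (e.g. organisation type): Kruskal-Wallis
group_c = [4, 5, 4, 3, 5, 4]
h_stat, p_three = stats.kruskal(group_a, group_b, group_c)

# Reject the null hypothesis of equal distributions where p < 0.05
print(f"Mann-Whitney p = {p_two:.3f}, Kruskal-Wallis p = {p_three:.3f}")
```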

Figure 6 shows the results for the Mann-Whitney test, together with the distribution of the data for the competency "problem-solving ability" within the subgroup "race". The p value (asymptotic significance) for the example shown is less than 0.05, which means that the null hypothesis is rejected.

Figure 7 shows the results for the Kruskal-Wallis test for the competency "understanding of workplace practices" within the subgroup "organisation type". The test summary results are supplemented with a box-and-whisker diagram, with the median displayed as the solid horizontal line, the box depicting the spread and skewness of the data, the lines extending vertically from the boxes indicating the variability outside the upper and lower quartiles, and the outliers shown as individual points (the small circles). In the example shown in Figure 7, the p value (asymptotic significance) is 0.005, which is less than the significance level of p < 0.05 set for the hypothesis test, which means that the null hypothesis that "there are no differences between the distributions of competencies among the subgroup", is rejected.

Significance in the Kruskal-Wallis omnibus test only indicates that at least two of the tested parameters are statistically different, but does not specify exactly where these differences are. Multiple comparison tests were performed to assess whether the distribution of competency ratings differed with statistical significance within the subgroups.

Table 3 shows the post hoc test results from analysing the data described in Figure 7. This indicates that the statistical difference within the specific subgroup was located in the Consulting-Government/semi-government pairing, as a result of the p value being below 0.05.
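The study used SPSS's Dunn-Bonferroni post hoc procedure, which SciPy does not provide directly. A commonly used stand-in, sketched below on illustrative data, is pairwise Mann-Whitney comparisons with a Bonferroni correction; the group labels echo the subgroup categories discussed above but the values are invented:

```python
# Pairwise comparisons with Bonferroni correction (a stand-in for
# SPSS's Dunn-Bonferroni post hoc test; illustrative data only)
from itertools import combinations
from scipy import stats

groups = {
    "Consulting": [2, 3, 2, 3, 2, 3, 2],
    "Contracting": [3, 3, 4, 3, 2, 3, 4],
    "Government": [4, 4, 3, 5, 4, 4, 3],
}

pairs = list(combinations(groups, 2))
m = len(pairs)  # number of comparisons for the Bonferroni correction

for a, b in pairs:
    _, p = stats.mannwhitneyu(groups[a], groups[b], alternative="two-sided")
    p_adj = min(p * m, 1.0)  # Bonferroni-adjusted p value
    flag = "significant" if p_adj < 0.05 else "not significant"
    print(f"{a} vs {b}: adjusted p = {p_adj:.3f} ({flag})")
```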

The darkly shaded cells in Table 4 indicate the categories within a subgroup for which the distribution of competency ratings differ with statistical significance. Of the eleven competency descriptions that were tested, the highest number of significant differences were found to be within the subgroups "organisation type" and "race". There were no significant differences in the distribution of competency ratings for the subgroups "regions (geographical location of respondents)" and "gender".

In terms of individual competencies, respondent ratings for the competency "ability to use appropriate engineering methods, skills, tools and IT" showed no significant differences across any of the seven subgroups. The competencies "engineering design ability", "ability to conduct/ investigate well-defined problems" and "understanding of workplace practices" had the highest number of significant differences across the subgroups.

The data was also analysed in terms of the ranked mean competency ratings for each subgroup. Table 5 shows the ranked mean values for all subgroup categories across the range of competencies. Colour coding, with dark red on the lower end and dark green on the upper end, was added to the table to accentuate the differences within subgroups and across subgroup categories.

Table 6 shows the same data as in Table 5 but with the data re-arranged and ranked in terms of the mean competency ratings across all the subgroup categories.

The overall mean ratings range from 2.60 to 3.53. Ratings from the subgroup category "respondents with 40+ years of experience" yielded the lowest competency rating score, while the subgroup category "Government/semi-government" rated diplomate competencies the highest. Lower ratings were given by older respondents, those holding senior positions and those who belonged to the white population group. Small differences were observed in the ratings within the respondent subgroups for "gender" and "regions".

The results from Tables 5 and 6 are more comprehensible if the data is visually represented in the form of radar and bar charts. Figure 8 represents a bar chart to illustrate the distribution of ratings across the gender subgroup, while Figure 9 is a spider chart showing the ratings for the subgroup based on the respondents' experience.

Qualitative comments from respondents on competency ratings

The respondents were invited to add comments while answering the questions. A number of respondents indicated that it was difficult to generalise, as the competencies of diplomates varied on an individual basis. Three of the respondents described this phenomenon as follows:

Difficult to answer as this depends highly on the individual.

It is difficult to put people in a box and they are very different.

Take note that company culture plays a big role on the expressiveness of the young graduates and gender.

Preparing a student for what is expected in the workplace before he/she actually graduates appears to have a positive impact on respondent ratings. As one respondent stated: "The incumbents are in most cases students who have been in the employ of the company and have thus been trained and coaxed in these attributes, hence the good scores."

Some respondents commented on the inability of diplomates to solve basic problems and the need to re-train them. The following, somewhat lengthy comment from one respondent summarises it:

"I think the areas covered are fine; however, the level of skill in these areas are generally very low. All the graduates we employed had to be trained from "ground level". They generally say things like, "I think we did something like that at school," but they have very poor knowledge and design knowledge, even with basic calculations like channel flow, simply supported beams, how to specify a pump, etc, which. they cannot do. We first train them to use AutoCAD and use them for drawings until they understand how things fit together. We then slowly start training them in basic design and report writing. We found that we cannot give them a basic problem and leave them to design it."

 

DISCUSSION

A survey questionnaire was structured to gain insight into employer perceptions of the competencies of civil engineering diplomates who qualified from universities within the Eastern Cape. The questionnaire focused on the views of employer representatives from the Eastern Cape Province, and considered the profile of individual respondents and the type of organisations they represented. Ratings were collected for eleven competency descriptors, and respondent results were evaluated both collectively and in terms of subgroup cohorts.

For the respondents as a collective, the quantitative results of this study indicate a mean overall diplomate competency rating of 3.07 (on a scale from 1.00 to 5.00), with the eleven individual competency ratings ranging between 2.87 and 3.36. The narrow range within which these ratings fall indicates that the competencies deemed to be strengths are not very strong, and the competencies viewed as weaknesses are not very weak. The three highest-ranked competencies were "individual and teamwork ability", "show understanding of impact that engineering activities can have on society" and "ability to use appropriate engineering methods, skills, tools and IT". The "individual and teamwork ability" competency (M = 3.36) was rated well above the other assessed competencies, and its consistently high rating across all subgroups points to a general satisfaction amongst all types of employers with the ability of diplomates to function meaningfully, both as individuals and in team activities, when making their transition from university to the workplace. All the competencies, except the highest-ranked and the two bottom-ranked ones, yielded rating scores marginally above the scale midpoint of 3.00, which represents a competency level of "neither high nor low". It can therefore be deduced that, from an overall sector perspective, diplomates only just meet the minimum expectation levels of employers for the majority of the competency descriptors.

The employers, as a collective, rated "engineering design ability" (M = 2.87), "professional and technical communication ability" (M = 2.93) and "problem-solving ability" (M = 3.03) as the weakest diplomate competencies. Within the consulting sector, a category within the subgroup "organisation type", the ratings for the same three competencies were even lower (M = 2.79, M = 2.81 and M = 2.85 respectively). The qualitative comments from the respondents affirmed the findings of the quantitative part of the survey on the diplomates' competency weaknesses. Comments from respondents also highlighted the diplomates' inability to apply their theoretical knowledge when solving real-world problems, poor technical writing ability, and a lack of technical skill in approaching and undertaking design tasks.

An analysis was also done to establish how sectoral views on mean competency ratings differed across categorical subgroups. Categories within the subgroup "gender" showed the least difference (ΔM = 0.07), while the largest difference occurred within the subgroup "experience" (ΔM = 0.61). Substantial differences were also observed for categories within the subgroups "position" (ΔM = 0.47), "age" (ΔM = 0.51), "race" (ΔM = 0.52) and "organisation type" (ΔM = 0.58). The ratings appear to indicate that older, more experienced respondents from the white population group are less satisfied with diplomate competencies than both younger and black respondents. The results also show that respondents from the consulting sector rate diplomate competencies substantially lower than respondents from government, semi-government and municipalities. The diversity in employer responses across the subgroup "organisation type" confirms Holland's theory on vocational behaviour, namely that each work environment has distinct self-perceptions and patterns regarding which competencies are valued and needed.

The findings of this study also point to the need for providers offering the Civil Engineering Diploma programme in the Eastern Cape to structure the curriculum towards improving the development of diplomate competencies in engineering design, professional and technical communication, and problem-solving.

 

CONCLUSION

Higher education institutions need to be receptive to the views of industry and should devise measures to engage, both actively and constructively, with employers on the quality of their graduates and diplomates. Employer studies and surveys are instruments that can be used to strengthen the interface between higher education and the world of work. The collection of quantitative and qualitative information gives education providers offering vocationally orientated programmes, such as the Civil Engineering Diploma qualification, better insight into industry's views. This information is helpful when identifying the nature and extent of competency gaps that may exist, or may be perceived to exist. It is suggested that engineering departments collaborate with employers across the sector in order to establish, articulate and prioritise competencies, and that sufficient attention be given to regional and provincial competency needs. Whilst some employers may have unrealistic expectations of what diplomates can do immediately after completion of their studies, understanding the forms of "knowing" that industry values is useful, if not essential, for curriculum review. Such information is also necessary to ensure that pedagogic approaches and assessment practices are aligned with the outcomes of the programme. Prioritising what is taught and how it is taught is critical, as time limitations make it impossible for the curriculum to expose students to, and prepare them for, the full range of variables in the real world of work. It is suggested that the gap between the world of work and the classroom can best be bridged if teaching simulates engineering practice, a view corroborated by the findings of Passow and Passow (2017) in their worldwide investigation of the intersection of the concepts of competence, engineering, practice and importance.

This study indicates that problem-solving remains at the core of engineering practice, and that employers value diplomates who have a thorough understanding of engineering fundamentals, conceptual principles and procedural approaches. Teaching methods need to develop students' ability to grasp the 'what' and 'how' of problems. It is suggested that problem-based and project-based teaching strategies be used for selected subjects in the curriculum as a means to bridge the knowledge-practice-environment gap, a view similar to that expressed by Ballim et al (2014).

This study also highlights the importance of soft skills, and in particular communication skills, in the workplace. This points to the need for a competency such as communication to be integrated within multiple technical subjects. Assessing this competency at multiple levels in the curriculum will ensure that it is developed beyond the single, low-level learning area where it is currently taught.

Lastly, studies such as this may be useful to determine what the competency emphasis is amongst employers at both a regional and provincial level, in addition to what is prescribed at a national level. Assessing the responsiveness of a programme by seeking the views of employers remains an important part of the quality assurance process. The baseline information, as was collected through this study amongst civil engineering employers in the Eastern Cape, can be useful when reviewing academic programmes, such as the Civil Engineering Diploma programme.

 

REFERENCES

Allahverdi, A & Aldowaisan, T 2015. Assessment of student outcomes of the Industrial and Management Systems Engineering programme at Kuwait University. Global Journal of Engineering Education, 17(3): 103-112.

Archer, T M 2008. Response rates to expect from web-based surveys and what to do about it. Journal of Extension, 46(3): 3RIB3.

Ballim, Y, Mabizela, S & Mubangizi, J C 2014. Professional bodies and quality assurance of higher education programmes in South Africa: Towards an appropriate framework. South African Journal of Higher Education, 28(4): 1140-1155.

Besterfield-Sacre, M, Cox, M F, Borrego, M, Beddoes, K & Zhu, J 2014. Changing engineering education: Views of US faculty, chairs, and deans. Journal of Engineering Education, 103(2): 193-219.

CBE (Council for the Built Environment) 2014. Is there a gap between graduate competencies and expectations? Available at: http://www.cbe.org.za/pages/publications/articles [accessed on 16 August 2015].

CHE (Council on Higher Education) 2012. Criteria for programme accreditation. Higher Education Quality Committee (revised). Pretoria: CHE.

CHEC (Cape Higher Education Consortium) 2013. Pathways from university to work: A graduate destination survey of the 2010 cohort of graduates from the Western Cape Universities. Cape Town: CHEC.

Department of Higher Education and Training 2016. List of occupations in high demand. Government Gazette 39604, Notice 34, 19 January 2016. Pretoria: Government Printer.

Du Toit, R & Roodt, J 2008. Engineering professionals: Crucial key to development and growth in South Africa. Report commissioned by the Department of Labour for the Scarce and Critical Skills Research Project. Pretoria: HSRC.

ECSA (Engineering Council of South Africa) & EASA (Engineering Association of South Africa) 1995. Joint submission to the National Commission on Higher Education by the Engineering Council of South Africa and the Engineering Association of South Africa. Johannesburg: ECSA & EASA.

ECSA (Engineering Council of South Africa) 2010. The Engineering Skills Pipeline. Available at: http://www.ecsa.co.za/documents/News/241110_Engineering_Skills_Pipeline.pdf [accessed on 12 August 2015].

ECSA (Engineering Council of South Africa) 2013. Policy for the Accreditation of Engineering Technology Programmes. Section 1. Available at: https://www.ecsa.co.za/ECSADocuments/ECSA%20Documents/Documents/Accreditation_Universities_Technology_Section_1.pdf [accessed on 1 August 2015].

ECSA (Engineering Council of South Africa) 2014. E-12-P: Revision 2: Programme Self-study Documentation Requirements for Accreditation Visits. Available at: https://www.ecsa.co.za/education/EducationDocs/E-12-P.pdf [accessed on 1 August 2015].

ECSA (Engineering Council of South Africa) 2016. Qualification Standard for Diploma in Engineering: NQF Level 6. Document E-02-PN, Rev 3. Available at: https://www.ecsa.co.za/education/EducationDocs/E-02-PN.pdf [accessed on 30 January 2016].

Elrod, C C, Stanley, S M, Flachsbart, B B, Hilgers, M G & Cudney, E A 2015. Assessing professional behavior perceptions between computing, engineering, and business students and hiring employers. Issues in Information Systems, 16(1): 69-79.

Gay, L R, Mills, G E & Airasian, P W 2011. Educational research: Competencies for analysis and applications. Boston, MA: Pearson.

Griesel, H & Parker, B 2009. Graduate attributes: A baseline study on South African graduates from the perspective of employers. Pretoria: Higher Education South Africa & SAQA.

Gwyn, M L & Gupta, R 2015. Co-op employer evaluation of the graduate attributes: A comparison of two approaches. Proceedings, Canadian Engineering Education Association Conference (CEEA15), 31 May to 3 June, Ontario, Canada, Paper 034, pp 1-5.

Hauschildt, U, Brown, H & Zungu, Z 2013. Competence measurement and development in TVET: Result of the first COMET test in South Africa. Proceedings, Apprenticeship in a Globalised World: Premises, Promises and Pitfalls. Munich: LIT, pp 177-180.

Klein-Gardner, S S & Walker, A 2011. Defining global competence for engineering students. Proceedings, ASEE Annual Conference & Exposition, Vancouver, Canada, pp 22.420.1-22.420.17.

Lawless, A 2007. Numbers and needs in local government. Midrand: SAICE.

Lawless, A 2011. Training/development: Demand/supply of technical engineers within the public sector. Proceedings, CESA 2011 Conference and Exhibition, East London.

Naicker, S 2017. Developing a Code of Best Practice for civil engineering education at tertiary institutions. Civil Engineering, 25(1): 9-12.

Ngetich, W K & Moll, C M 2013. An investigation of industry expectations of industrial engineering graduates: A case study of graduate development programmes in South African universities. South African Journal of Industrial Engineering, 24(3): 125-138.

Nielsen, A C 2000. Employer satisfaction with graduate skills. Canberra, Australia: Department of Employment, Education, Training and Youth Affairs.

Nic, D & Rarr, V 2013. Oh ordinal data, what do we do with you? Learn and Teach Statistics and Operations Research. Available at: https://learnandteachstatistics.wordpress.com/2013/07/08/ordinal [accessed on 27 January 2017].

Passow, H J 2012. Which ABET competencies do engineering graduates find most important in their work? Journal of Engineering Education, 101(1): 95-118.

Passow, H J & Passow, C H 2017. What competencies should undergraduate engineering programs emphasize? A systematic review. Journal of Engineering Education, 106(3): 475-526.

Peng, L, Zhang, S & Gu, J 2016. Evaluating the competency mismatch between Master of Engineering graduates and industry needs in China. Studies in Higher Education, 41(3): 445-461.

Ridgman, T W & Liu, M 2014. Engineering graduate employability - An international comparison. Proceedings, SEFI Annual Conference, Birmingham, UK, Paper ID 102.

Riordan, S & Goodman, S 2007. Managing reality shock: Expectations versus experiences of graduate engineers. South African Journal of Industrial Psychology, 33(1): 67-73.

SAICE (South African Institution of Civil Engineering) 2016. (a) SAICE Industry Survey per SAICE Newsletter, 18 May 2016 (http://saice.org.za/saice-industry-survey), (b) SAICE University Survey per SAICE CEO e-mail dated 11 October 2016.

Singer, S R, Nielsen, N R & Schweingruber, H A 2012. Discipline-based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering. Washington, DC: National Academies Press.

Taylor, N 2015. The production of high level science, engineering and technology skills: Role of the Dinaledi Project. Pretoria: HSRC.

Teichler, U & Höhle, E A 2013. The Work Situation of the Academic Profession in Europe: Findings of a Survey in Twelve Countries. Dordrecht, Netherlands: Springer.

The Presidency of South Africa 2010. Joint Initiative on Priority Skills Acquisition (JIPSA): Growing priority skills in South Africa. Final report on JIPSA. Pretoria: The Presidency.

Vadivu, M C, Bala, M L & Sumathi, N 2016. Expectation vs performance - A skills gap analysis among engineering graduates. Indian Journal of Applied Research, 5(10).

Virginia Tech n.d. Measurement scales in social science research. Available at: http://simon.cs.vt.edu/SoSci/converted/Measurement [accessed on 17 January 2017].

Walther, J & Radcliffe, D F 2007. The competence dilemma in engineering education: Moving beyond simple graduate attribute mapping. Australasian Journal of Engineering Education, 13(1): 41-51.

Watermeyer, R & Pillay, M 2012. Strategies to address the skills shortage in the delivery and maintenance of infrastructure in South Africa: A civil engineering perspective. Civil Engineering, 20(5): 46-56.

Wolff, K 2017. Throwing the baby out with the bathwater? The role of fundamentals in 21st century engineering education. Proceedings, Global Engineering Education Conference (EDUCON), 2017 IEEE, pp 111-118.

 

 

Correspondence:
F Gerber
Faculty of Science, Engineering and Technology, Buffalo City Campus, Walter Sisulu University
PO Box 1421, East London 5200, South Africa
T: +27 43 709 4753 E: fgerber@wsu.ac.za

K J Mammen
School of General and Continuing Education, University of Fort Hare
Private Bag X9083, East London 5201, South Africa
T: +27 43 704 7178 E: kmammen@ufh.ac.za

 

 

 

DR FERDIE GERBER Pr Eng (MSAICE) is Project Manager for Academic Programme Development within the Faculty of Science, Engineering and Technology at Walter Sisulu University (WSU), South Africa. He has 38 years' engineering experience and has worked across various industry sectors, including construction, local and provincial government, parastatals, consulting and academia. He has been with WSU for the last 18 years, during which time he has held various senior management positions. His research interests include engineering education and engineering management.

 

 

PROF KUTTICKATTU JOHN MAMMEN is Professor at the University of Fort Hare. He holds BSc, LLB, BEd, MSc, MEd and DEd degrees, and has more than three decades of experience in teaching, research and postgraduate supervision. He has successfully supervised 14 Master's dissertations and Doctoral theses, and has published 42 research papers. His research interests include Maths, Science and Technology Education, Comparative Education and Higher Education Management.

Creative Commons License: All the content of this journal, except where otherwise noted, is licensed under a Creative Commons License.