
South African Journal of Economic and Management Sciences

On-line version ISSN 2222-3436
Print version ISSN 1015-8812

S. Afr. j. econ. manag. sci. vol.16 n.2 Pretoria Jan. 2013

 

ARTICLE

 

Managing postgraduate research service quality: developing and assessing a conceptual model

 

 

Krishna Govender (I); Shaun Ramroop (II)

(I) Department of Management Studies, University of KwaZulu-Natal
(II) Department of Statistics, University of KwaZulu-Natal

 

 


ABSTRACT

This paper reports on the conceptual development and empirical evaluation of a postgraduate (PG) research service quality management model, through conducting an electronic survey among a cohort of master's and doctorate graduates at one of the top three research universities in South Africa, using specifically developed and validated research instruments.
By fitting the data from a sample of 117 graduates to a conceptual model using structural equation modelling, it became evident that the PG research students' perception of the Organisational Climate for Research (OCR) is associated with their perception of the PG Research Service Quality (PGSQUAL), the PG Service Experience (SERVEXP) and their perception of their Role (RC). However, no association was found between the students' perception of the research climate (OCR) and their satisfaction (SERVSAT) with the research service; the service experience (SERVEXP) and postgraduate research service quality (PGSQUAL); service satisfaction (SERVSAT) and postgraduate service quality (PGSQUAL).
The aforementioned findings imply that higher education institutions need to create a research climate which is supportive of service quality, and better manage the research climate, so that the PG students are clear about their role, which will eventually translate to a better PG service experience and improved perception of PG service quality.

Key words: organizational climate, service satisfaction, service experience, service quality, postgraduate


 

 

1 Introduction

Every time a customer interacts with a service organisation (by phone, mail, technology or in person), a service encounter occurs. This interaction, according to Bettencourt and Gwinner (1996:3), becomes the focal point in consumers' evaluation of the entire service organisation, since service experiences develop out of these encounters. Each encounter provides an opportunity not only for the service organisation to impress or disappoint the customer, but also for the customer to form perceptions of the service experience and quality. Given that education is a service, and that the PG research education environment has become increasingly competitive (Angell, Heffernan & Megicks, 2008), research focusing on the PG service encounter has been surprisingly negligible. This void is even more concerning since there is intense competition for PG students, who not only generate greater income but also improve a university's ranking.

To address this void, and drawing heavily on the services marketing (quality) literature, a conceptual model of the PG research service encounter is developed, and relationships between the PG research students' perceptions of their role and the research climate, and their service experience, service satisfaction and service quality, are postulated and assessed. The aforementioned objective was addressed by:

suggesting a conceptual model of the PG research service encounter and proposing relationships among the PG research students' perception of their role, the research climate, the service experience, service satisfaction, and service quality;

developing and validating research instruments to assess the PG research students' perception of their role, the research climate, their service experience, service satisfaction, and service quality; and

evaluating the proposed conceptual model using structural equation modelling.

The study is organised as follows: following a brief discussion of the literature on the PG service encounter, more specifically the PG students' participation (role), the research service climate, the research service experience and research service quality, several relationships are proposed among the aforementioned variables. Thereafter, the research methodology used to explore the proposed relationships is described, together with an explanation of the development of the various research instruments. The empirical findings are then presented, including an explanation of the validation of the research instruments, the results of the data analysis, and a discussion thereof. The paper concludes by identifying some limitations of this exploratory study and proposing scope for future research.

 

2 The PG service encounter, the PG students' role, service experience and service quality

Although, in PG research supervision, face to face interaction or personalisation between the student and research supervisor is vital to the outcome of the service experience, there are many opportunities for things to go wrong when the student and supervisor interact, since both parties experience and respond to each other's mannerisms, attitude, competence, mood, language and so on, resulting in incidents which may be trivial or important to a successful encounter. The important or 'critical' incidents according to Bitner, Booms and Mohr (1994) are interactions that are either satisfying or dissatisfying which provide an opportunity for the customers, through their service experience, to form an opinion of the service quality.

When customers participate in the service production and delivery process, a customer who assumes partial work responsibilities may not perform their portion of the work if they do not understand the service offering or their role in the service encounter. This reasoning can be extended to the weak PG student who, for example, may have a poor command of the English language and thus depend on the supervisor to correct grammar and other aspects of writing style. According to Hsieh and Yen (2005:892), this could result in job stress for the service provider which, by deduction, may be transferred to the service performance of the study leader or service employee, and result in a poor service experience for the student or customer.

Although some researchers (Levitt, 1972; Chase, 1978) argue that less direct contact between the customer and the service production system can contribute to efficiency, others (Mills, Chase & Margulies, 1983) contend that services can be delivered most efficiently if customers are viewed as 'partial' employees and if their participative roles are designed to maximise their contributions to the service delivery process. According to McCulloch (2009:171) a more appropriate metaphor to characterise the relationship between the student and the higher education service provider is one of 'co-production', since students, lecturers and others who support the learning are viewed as being engaged in a cooperative enterprise, focused on production, dissemination and application of knowledge.

Dann (2008:333) cautions that PG research supervision is a complex service encounter drawing on the pedagogical structures of higher education and the interpersonal dynamics of a highly customised service delivery. Within this relationship, there is a duality of responsibility (and roles) for the successful completion of the research project between the PG student and supervisor. Thus, the PG experience cannot be attributed entirely to the student, the supervisor or the institution, since service experiences are the outcomes of interactions between organisations, related systems and processes, service employees and customers (Bitner, Faranda, Hubbert & Zeithaml, 1997:193).

Some researchers (Singh, Verbeke & Rhoads, 1996) contend that, when an individual has gained an appropriate understanding of his/her role and the roles of other members of the role set, it is possible to formulate accurate role expectations. Others (Boshoff & Mels, 1995) concur that role ambiguity affects the service employee's performance, which spills over into the perception of the service. Considering that the service customer is a quasi-employee, the aforementioned may be equally true of the PG research student, who is the 'service customer' in this study.

In order to explore the PG research students' role further, and more especially its association with the PG service experience, it is proposed that:

P1: There is a positive relationship between the PG students' perception of their Role (RC) and their perception of the PG service experience (SERVEXP).

P2: There is a positive relationship between PG students' perception of their Role (RC) and their overall satisfaction (SERVSAT) with their PG service.

According to Schneider and Bowen (1995), since the interaction which takes place between the service organisation, its employees and customers during the service encounter can often not be clearly specified beforehand, the climate of the organisation offers an ad hoc means of specifying the activities which should be carried out. Given the aforementioned, especially the implied importance of the organisational climate, the next section will briefly expand on the concept organisational climate so as to properly locate its relevance in the PG research service encounter, and explore its association with the PG students' perception of the PG research service quality, the PG research service experience, and PG service satisfaction.

The postgraduate research climate, the service experience and service quality

Over the years, several explanations have emerged about the dimensions that constitute the organisational climate construct, and Tyagi (1982) identifies several organisational climate variables which were found to be antecedents of attitude and performance. The antecedents are: job challenge and variety, job importance, task conflict, role overload, leadership consideration, organisational identification, and management concerns and awareness. Attitude and performance, Tyagi (1982) argues, may influence the service in terms of delivery (performance) and quality.

Bock, Zmud, Kim, and Lee (2005:90) argue that 'to move from a culture for hoarding knowledge in order to gain power, towards one that rewards the sharing of knowledge (with PG students), we need to create a climate for service that fosters long-lived, trusting relationships'. The sharing of knowledge is a necessary trait for fostering a positive PG research climate, where both PG students and PG research supervisors could prosper.

When a product is not 'immediately' available (such as a PG degree), service firms must rely on managing the tangibles, such as the setting and contact personnel, to create a positive image for their intangible offering. Since PG service encounters do not take place in a vacuum but in a specific milieu, in order to manage PG service quality it is therefore important to understand what happens during the PG service encounter, more especially how the research service climate influences the perceptions of service quality, the service experience and service satisfaction.

Le Blanc and Nguyen (1997) ascertain that service quality is derived mainly from reputation, a factor which they propose is tied closely to management's capacity to foster an organisational climate directed at serving the needs of its customers, and to the image of the business school. Schneider, White and Paul (1998) stress that the service climate focuses the service employees' efforts and competencies in service delivery which, in turn, yields positive experiences for customers as well as positive customer perceptions of service quality. Dietz, Pugh and Wiley (2004) also assert that when excellent service is an important theme in an organisation, a positive service climate exists.

For the purpose of this study, the research climate is defined as the research student's perceptions of organisational policies, practices and procedures which promote a climate which recognises and rewards service to PG research students. This definition by implication suggests 'customer orientation' as an important facet of the research climate, and that much rests on the perceptions of the individual research supervisor, which influence his/her service delivery behaviour. Thus, the research climate which manifests itself through the organisational climate will depend on the fundamental support provided by HEIs through resources, training, managerial practices and assistance required to perform effectively (Schneider et al., 1998).

When employees form climate perceptions about the organisation, and about its subunits (school/department), they consider different elements of their work environment, forming distinct perceptions of the organisation-targeted and unit-targeted service climates. Consistent with the service climate theory, according to which a sub-unit's positive service climate facilitates delivery of excellent service and improves customer perceptions and reactions, this paper assumes that with specific reference to the PG research environment, the climate for research service (OCR) at the school/department level is developed from the university's research service climate. Since PG students interact more with the sub-unit (discipline/department/school) employee (supervisor), they are likely to develop their perception of the research climate through these interactions.

The organisational climate as it pertains to the higher education environment, and more especially the PG students' perception thereof and its impact on the supervisor's service and on the PG students' research service experience, has not received much attention from researchers. In order to explore this further, it is postulated that:

P3: There is a positive relationship between the postgraduate research students' (PGS') perception of the PG research climate (OCR) and their perception of their Role (RC).

P4: The PGs' perception of the OCR is associated with their perception of the overall research service quality (PGSQUAL).

P5: The PGs' perception of the OCR is associated with their perception of their overall research service experience (SERVEXP).

P6: The PGs' perception of the OCR is associated with their perception of their overall research service satisfaction (SERVSAT).

Service quality, service experience and service satisfaction

Parasuraman, Zeithaml and Berry (1988) argue that customer satisfaction is a precursor of service quality, and conclude that perceived quality is a form of attitude related, but not equivalent to customer satisfaction. However, according to Alridge and Rowley (1998), perceived quality is derived from the consumers' overall evaluation of the service, and quality can be distinguished from satisfaction, in that quality is a general attitude, whereas satisfaction is linked to specific transactions, and perhaps satisfaction with a series of transactions leads to perceptions of good quality.

The SERVQUAL-SERVSAT-SERVPERF relationship has also been the topic of several studies, inter alia Cronin and Taylor (1992). Some researchers (Bolton & Drew, 1991) argue that customer satisfaction is an antecedent of service quality, whilst others (Hoisington & Naumann, 2003; Carrilat, Jaramillo & Mulki, 2009) assert that service quality leads to customer satisfaction and behavioural intentions. Cronin and Taylor (1992) argue that the distinction between satisfaction and quality is important because service providers need to know whether their objective should be to deliver satisfied customers, who will then develop a perception of high service quality, or whether they should aim for high service quality that leads to customer satisfaction.

In order to further explore the relationships among service quality, service experience and service satisfaction, with specific reference to the PG research service encounter, it is postulated that:

P7: The PGs' perception of the PGSQUAL is associated with their SERVEXP.

P8: The PGs' perception of the PGSQUAL is associated with their SERVSAT.

P9: The PGs' perception of their SERVEXP is associated with their SERVSAT.

The postulated relationships (P1-P9) which are depicted as a conceptual model in Figure 1, were empirically evaluated following the methodology described next.

 

 

3 Methodology

In order to empirically evaluate the postulated relationships (P1-P9), an electronic survey was conducted among a cohort of master's and doctorate graduates of a large research university over a month (April-May 2011) using QuestionPro (www.QuestionPro.com). A list of graduates' names and e-mail contact details was obtained from the university's graduation office, and the electronic version of the questionnaire was sent via e-mail to all 816 graduates, who were asked to follow a link to the survey website. This was supported by hard copies of the questionnaire, accompanied by a letter explaining the objectives of the survey, distributed at the graduation venues in special envelopes together with the degree certificates. Although the graduates were asked to either return the completed questionnaire or complete the electronic survey within a month from the date of the graduation, the majority chose to complete the electronic survey.

Research instruments

In order to address the research objectives, existing validated questionnaires (Parasuraman et al., 1988) used in similar previous studies in the business environment were adapted for use in the postgraduate research service encounter.

 

4 PG Service quality: PGSQUAL

Although there is substantial research evidence in the literature to support the use of the SERVQUAL (Parasuraman et al., 1988) instrument in measuring education service quality, this instrument has not been without criticism (Cronin & Taylor, 1992; Alridge & Rowley, 1998:200). Some of the criticisms (Alridge & Rowley, 1998:200) include the need to ask the same questions twice, and the fact that the instrument captures a snapshot of perceptions at one point in time. To overcome some of the criticisms, Alridge and Rowley (1998) opted to survey perceptions only and exclude expectations in their survey of student satisfaction. According to Hair (2006:11), the work carried out so far using SERVQUAL in a higher education context would seem to suggest that the instrument can be used successfully, as long as the modifications are kept to a minimum. However, the aforementioned author further states that there is little or no research specifically using SERVQUAL on PhD students or on PG research supervisors.

In their quest to develop better research instruments which are also more appropriate to the nature of the service, some researchers (Drennan, 2008) report on the PREQ (Postgraduate Research Questionnaire) which was introduced in Australia in 2002 against a background of increased attention on quality and accountability in the Australian higher education sector. The PREQ is a multidimensional measure of graduate students' experience of research and research supervision, and is based on the principle that the student's perception of research supervision, infrastructural and other support, intellectual climate, goals and expectations will influence their evaluations of the outcomes achieved as a consequence of their research experience (Drennan, 2008:490).

Other researchers such as Ginns, Marsh, Behnia, Cheng and Scalas (2009) further adapted the PREQ to develop the SREQ (Student Research Experience Questionnaire) to investigate PhD students' evaluations, in which the focus was on the overall postgraduate experience at the broad level of university and disciplines (faculties and departments) within a university, rather than at the effectiveness of the individual supervisor.

For the purpose of this study, the PGSQUAL (postgraduate research service quality) instrument was developed primarily by adapting the SERVQUAL instrument which encapsulates the perceptions-expectations gap covering all five service quality dimensions, and incorporating certain elements from the PREQ and SREQ questionnaires, as was done in previous studies (Stodnick & Rogers, 2008; Dann, 2008; Drennan, 2008). The final PGSQUAL instrument comprised 26 items resulting from adaptations which entailed making minor changes to the SERVQUAL statements to fit the context, combining expectations and perceptions, and incorporating certain elements of the PREQ and SREQ. With respect to the 26 items pertaining to specific aspects of the PG research service quality, the respondents were requested to indicate, on a 5-point Likert scale, whether the service quality was 'better' or 'worse' than expected, where 1 = Worse than expected and 5 = Better than expected.

Postgraduate research climate: OCLIMAR

To ascertain the PG research students' perception of the research climate (the organisational climate for research) the OCLIMAR instrument was developed by adapting Kelley's (1987) scale, originally developed from the exploratory research of Parasuraman et al. (1988) and further adapted by Govender (1998). A further development of the 22-item scale entailed incorporating certain relevant aspects of the PREQ (Ginns et al., 2009), which resulted in a 24-item OCLIMAR questionnaire.

The respondents were required, in terms of their perceptions of the importance the university placed upon various characteristics of the research service it provided, to indicate their agreement or disagreement with each of the 24 statements, expressed on a 5-point Likert scale, where 1 = Strongly Disagree; 2 = Disagree; 3 = Neither Agree nor Disagree; 4 = Agree and 5 = Strongly Agree.

PG research students' role perception

By adapting the Role Ambiguity/Role Clarity scale of Chonko, Howell and Bellenger (1986), a 6-item Role Clarity measurement instrument (Appendix A) was developed. Respondents were requested to indicate, with reference to their role as PG research students, how certain they were about each statement on a Likert scale where 1=Completely Uncertain; 2=Uncertain; 3=Neither Certain nor Uncertain; 4=Certain; 5=Completely Certain.

PG research students' overall research service experience: SERVEXP

According to Alridge and Rowley (1998:198), work on approaches to the evaluation of the student experience can be divided into two loosely bound categories, namely, methods that focus on assessing teaching and learning, and methods that assess the total student experience. Since the PG research students' experience is also a useful perspective to adopt for an approximation of student satisfaction in marketing terms and, based on the principles underlying the SREQ (Ginns et al., 2009) instrument, a 6-item SERVEXP questionnaire (Appendix B) was developed for specific use in this study.

The respondents were requested, with reference to their PG experience, to indicate their level of agreement with each of the 6 statements on a 5-point Likert scale, where 1=Strongly Disagree; 2=Disagree; 3=Neutral; 4=Agree and 5=Strongly Agree.

PG research students' overall service satisfaction: SERVSAT

Considering that the intention was to get an overall (summary) measure of the level of satisfaction with the PG research service, a single item Likert type question, with the following 5-point response categories: Strongly Disagree; Disagree; Neutral; Agree; and Strongly Agree, was used. The question read as follows: Overall, I was satisfied with the quality of my PG research experience.

 

5 Empirical findings

5.1 Response rate

The survey was conducted over a month (April-May 2011), during which period, weekly e-mail reminders were sent, encouraging the graduates to participate by completing the questionnaire.

Of the 816 graduates surveyed, 220 (26.96 per cent) respondents viewed the questionnaire, 120 of them (54.55 per cent) attempted it, and only 117 of these (53 per cent) completed the survey. It became evident from the data extracted via the electronic survey instrument (QuestionPro.com) that the average time taken to complete the questionnaire was 17 minutes.
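Because the reported percentages use different bases (the 816 invited graduates for the viewing rate, and the 220 viewers for the attempt and completion rates), a short calculation makes the overall completion rate explicit. This is a minimal sketch in Python and is not part of the original analysis.

import textwrap

invited, viewed, attempted, completed = 816, 220, 120, 117

print(textwrap.dedent(f"""
    Viewed (of invited):    {viewed / invited:.2%}
    Attempted (of viewers): {attempted / viewed:.2%}
    Completed (of viewers): {completed / viewed:.2%}
    Completed (of invited): {completed / invited:.2%}
"""))
# Prints roughly 26.96%, 54.55%, 53.18% and an overall completion rate of about 14.3%.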

The sample comprised 58 per cent black graduates, 23.2 per cent white graduates and 16.1 per cent Indian graduates, and the rest were unclassified. The majority of the graduates had completed master's degrees (35.1 per cent coursework master's and 37.7 per cent full research master's), and the remaining respondents (27.2 per cent) had completed doctorates.

5.2 Reliability of the research instruments

Reliability has to do with the precision of the measurement procedure. Coakes and Steed (2003:140) state that although there are a number of different reliability coefficients, one of the most commonly used is Cronbach's alpha, and a value of 0.7 or higher is regarded as good, in that similar results would be obtained if the survey were conducted among a larger sample of respondents. The Cronbach's alpha values for the various research instruments, reflected in Table 1, reveal good internal consistency or homogeneity of the items within the research instruments. A Cronbach's alpha value could not be calculated for the overall PG service satisfaction (SERVSAT) since it comprised a single item only, and a minimum of two items is required (Coakes & Steed, 2003).
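The authors report Cronbach's alpha values computed in SPSS (Table 1). For readers who wish to reproduce the coefficient, the sketch below shows the standard formula in Python; the item-level data file and column names are hypothetical, as the study's data are not published.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents, columns = items)."""
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical usage: 26 PGSQUAL items scored 1-5, one row per respondent
# pgsqual = pd.read_csv("pgsqual_items.csv")   # columns SQ1 ... SQ26
# print(round(cronbach_alpha(pgsqual), 3))     # 0.7 or higher is regarded as good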

 

 

5.3 Validity of the research instruments

Although, within the ambit of research instrument validity assessment, there are several aspects of validity that are pertinent, considering that the research instruments were adapted from previous studies in which they were validated, only face validity was considered in this study.

Face validity simply means that researchers are taking the validity at face value by looking at whether an instrument (questionnaire) appears to measure the target variable (Cronbach & Meehl, 1955). For example, the PGSQUAL instrument is intended to measure the PG service quality and comprises 26 questions that pertain to PG service quality. At face value, these questions measure the service quality as perceived by PG research students and, since they were also adapted from previous studies that measured service quality, the face validity of the PGSQUAL (and the other research instruments) used was assured.

5.4 Dimensionality

The concept of dimensionality is inseparable from the study of measurement scales. Stevens (1946:677) states that scales of measurement, in the broadest sense, are the assignment of numerals to objects or events according to rules. The fact that numerals can be assigned under different rules leads to different kinds of scales and different kinds of measurement. The nature of the phenomenon captured by a scale determines its dimensionality: scales can have a number of dimensions, whilst a dimension can be thought of as a number line. The Likert-type scales used in the research instruments in the current setting are classified as uni-dimensional scales measuring the perceptions of PG research students. For example, the PG service quality instrument (PGSQUAL) is made up of 26 items, each using the same 5-point Likert scale on which, as described earlier, respondents indicated whether the service quality was 'better' or 'worse' than expected (1 = Worse than expected, 5 = Better than expected). This type of scale was used in previous studies by researchers such as Stodnick and Rogers (2008), Dann (2008) and Drennan (2008).

Factor analysis was conducted using the Principal Components method with Varimax rotation to determine the validity of the items comprising the questionnaires developed to measure the variables whose relationship in the conceptual model is being investigated (Kline 1994; Johnson & Wichern, 2007).

Table 2 reflects the results of the factor analysis of the PGSQUAL research instrument. Although the literature (Kline, 1994) suggests that a factor loading of 0.3 or greater can be considered significant, given the large number of items in the PGSQUAL instrument, factor loadings of 0.4 or higher were considered significant; otherwise the number of items in the data set would not have been reduced, and the key reason for conducting a factor analysis, namely to reduce the number of items to a smaller set of common factors, would have been defeated.
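The factor analysis was run in SPSS; purely as an illustration, a roughly equivalent extraction could be scripted with the open-source factor_analyzer package in Python (an assumption, not the authors' tool), using principal-components extraction, Varimax rotation and the 0.4 loading cut-off described above. File and column names are hypothetical.

import pandas as pd
from factor_analyzer import FactorAnalyzer  # pip install factor_analyzer

# Hypothetical item-level data: columns SQ1 ... SQ26, one row per respondent
pgsqual = pd.read_csv("pgsqual_items.csv")

# Principal-components extraction with Varimax rotation, as described in the text
fa = FactorAnalyzer(n_factors=2, method="principal", rotation="varimax")
fa.fit(pgsqual)

loadings = pd.DataFrame(fa.loadings_, index=pgsqual.columns,
                        columns=["Factor1", "Factor2"])

# Keep only loadings of 0.4 or higher, the cut-off used in the study
print(loadings.where(loadings.abs() >= 0.4).round(3))

# Factors with eigenvalues greater than 1 would be retained
print("Eigenvalues:", fa.get_eigenvalues()[0].round(2))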

It is evident from Table 2 that all 26 items loaded onto two factors with loadings exceeding 0.4. FACTOR 1, which was labelled 'Research Supervisor', comprised the following PGSQUAL items: SQ1-SQ16, SQ18 and SQ23. FACTOR 2, which was labelled 'Institutional Support', comprised items SQ17, SQ19-SQ25 and SQ26. The aforementioned two factors produced acceptable Cronbach's alpha values (0.978 and 0.910 respectively), which implies that the two-factor PGSQUAL instrument revealed good internal consistency (Nunnaly & Bernstein, 1994).

The outcome of the factor analysis for the OCLIMAR instrument reflected in Table 3 reveals that three factors explain 67.48 per cent of the cumulative variance. With reference to the eigenvalues, three factors were extracted since their eigenvalues were larger than 1.

It is also evident from Table 3 that the 24 items of the OCLIMAR instrument loaded onto three factors. These factors were named 'Postgraduate Service Orientation' comprising items OCR1, OCR8, OCR11-OCR17, OCR19, OCR21 and OCR24; 'Postgraduate Research Support' comprising items OCR2, OCR7, OCR9, OCR18, OCR20 and OCR22-OCR23; and 'Postgraduate Information' comprising items OCR3-OCR6. These three factors produced Cronbach's alpha values of 0.954; 0.894 and 0.884 respectively, which implies that they are reliable (Nunnaly & Bernstein, 1994).

With reference to the Role Clarity, Service Experience and Service Satisfaction questionnaires, the researchers did not undertake a factor analysis since all three instruments had fewer than 10 items each (Coakes & Steed, 2003), and they were therefore treated as uni-dimensional constructs.

 

6 Discussion of empirical findings

Before any structural equation modelling (SEM) was done, the data had to be manipulated, since the variables comprised several items; for example, postgraduate service quality (PGSQUAL) comprised 26 items. The average of the 26 items was taken to create a uni-dimensional variable for PGSQUAL. The averages of the items comprising the RC, OCR, SERVEXP and SERVSAT instruments were also calculated, to create uni-dimensional variables which are referred to as formative measures in the structural equation modelling (SEM) literature (Hardin & Marcoulides, 2011). These variables were then used in the structural equation and multiple regression modelling processes.
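A minimal sketch of this item-averaging step follows, assuming a respondent-level data set; the file name and item-name prefixes are chosen here purely for illustration and are not taken from the study.

import pandas as pd

df = pd.read_csv("pg_survey.csv")  # hypothetical respondent-level item data

# Average the items of each instrument to form single composite (formative) scores
composites = pd.DataFrame({
    "PGSQUAL": df.filter(regex=r"^SQ\d+$").mean(axis=1),    # 26 service-quality items
    "OCR":     df.filter(regex=r"^OCR\d+$").mean(axis=1),   # 24 research-climate items
    "RC":      df.filter(regex=r"^RC\d+$").mean(axis=1),    # 6 role-clarity items
    "SERVEXP": df.filter(regex=r"^SE\d+$").mean(axis=1),    # 6 service-experience items
    "SERVSAT": df["SERVSAT"],                               # single satisfaction item
})
composites.to_csv("pg_composites.csv", index=False)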

The conceptualised model in Figure 1 was fitted to the sample data using AMOS. Byrne (2010:75-80) reports several goodness-of-fit statistics that are standard AMOS outputs, and some of these, namely the chi-square, the Relative Fit Index (RFI), the Root Mean Square Error of Approximation (RMSEA) and the Incremental Fit Index (IFI), were considered, since most of them are interrelated and lead to the same conclusion regarding the fit of the model to the sample data.

The RFI should be close to 0.95 if the model fits the data well, and the RMSEA should be less than 0.05 and have a p-value greater than 0.05, based on a narrow confidence interval around the RMSEA (called PCLOSE in the AMOS output), indicating superior fit of the model (Byrne, 2010:80). The IFI, which addresses issues of parsimony and sample size of the model relative to the data and which, according to Byrne (2010:79), is also a measure of goodness of fit, should be close to or more than 0.95 if a very adequate model fit is to be achieved.
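The authors fitted the model in AMOS. For readers without AMOS, a rough open-source equivalent can be sketched with the semopy package in Python; the path specification below is one plausible rendering of propositions P1-P9, and semopy reports a related (not identical) set of fit indices such as the chi-square, RMSEA and CFI/TLI. This is an assumption-laden illustration, not the authors' procedure, and the data file and column names are hypothetical.

import pandas as pd
import semopy  # pip install semopy; an open-source alternative to AMOS

# Composite scores built in the previous step (hypothetical file and column names)
data = pd.read_csv("pg_composites.csv")  # columns: OCR, RC, SERVEXP, PGSQUAL, SERVSAT

# One plausible rendering of the paths implied by P1-P9 (lavaan-style syntax)
desc = """
RC      ~ OCR
SERVEXP ~ RC + OCR
PGSQUAL ~ OCR + SERVEXP
SERVSAT ~ RC + OCR + SERVEXP + PGSQUAL
"""

model = semopy.Model(desc)
model.fit(data)

print(model.inspect())              # path estimates, standard errors and p-values
print(semopy.calc_stats(model).T)   # chi-square, RMSEA, CFI/TLI and related indices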

It became evident that the chi-square test statistic was 0.022 with 1 degree of freedom, and a p-value = 0.883, which is non-significant at the 5 per cent level, thus implying that the conceptual model fitted to the research data was indeed a good one (Byrne, 2010:76; Bollen, 1989:263). Furthermore, the RFI was 0.967, the RMSEA was 0.009 with a p-value (PCLOSE) of 0.899, and the IFI was reported as 0.969, all further confirming an excellent fit of the model. Table 4 shows that the organisational climate for research (OCR) significantly influences the PG students' role clarity (RC) (p-value = 0.000), thus supporting P3. There is also a significant association between the OCR and the PG students' overall research service experience (SERVEXP) (p-value = 0.000), supporting P5.

The aforementioned findings confirm that, irrespective of the nature of the service or service organisation, the perception of the organisational climate is important in clarifying the role of the service customer, and impacting on the service experience. More specifically with regard to PG research education, some researchers (Bock, Zmud, Kim & Lee, 2005) argued that it is necessary to create a climate for service that fosters the sharing of knowledge which is a necessary trait for fostering a positive research climate, where both PG research students and PG research supervisors could prosper.

The other significant relationships that exist at the 5 per cent level are OCR and PGSQUAL (p-value=0.000), SERVEXP and PGSQUAL (p-value=0.003), and PGSQUAL and SERVSAT (p-value=0.032).

In line with previous research, inter alia Davidson (2003), which states that even when organisations achieve their desired level of service quality they must still strive to achieve customer satisfaction, in this study there also seems to be a direct association between SERVSAT and PGSQUAL.

The model (Figure 2) also revealed that there are a number of insignificant relationships at the 5 per cent level, and these include Role Clarity and Service Experience (RC and SERVEXP), Role Clarity (RC) and Overall Service Satisfaction (SERVSAT), Service experience (SERVEXP) and Overall Service Satisfaction (SERVSAT), and Organisational Climate for Research and Overall Service Satisfaction (OCR and SERVSAT). The aforementioned findings confirm that the following propositions were not supported in this study: P1, P2, P6 and P9. The insignificance of the postulated relationships can be attributed to a host of scientific reasons which may include sample size, multi-collinearity or suppressor variables (Maassen & Bakker, 2001).

 

 

Several researchers, inter alia Draper and Smith (1998) and Chatterjee and Price (1991), state that multiple regression is also a good model-building tool to explore relationships between a dependent or response (y) variable and a group or set of independent or explanatory (x) variables. In view of the aforementioned, the data was also fitted to three different multiple regression models, based on the proposed path analysis model in Figure 1. The relationships and dependencies between the respective variables allowed for the identification of a response/dependent variable as well as independent/explanatory variables. The three models are stated below:

Model 1:

PG SERVICE EXPERIENCE = β0 + β1*ROLE CLARITY + β2*PG RESEARCH CLIMATE + ε

Model 2:

PG SERVICE QUALITY = β0 + β1*PG RESEARCH CLIMATE + β2*PG RESEARCH EXPERIENCE + ε

Model 3:

PG SERVICE SATISFACTION = β0 + β1*ROLE CLARITY + β2*PG RESEARCH CLIMATE + β3*PG RESEARCH EXPERIENCE + β4*PG SERVICE QUALITY + ε

Table 5 reflects the summary results of Model 1, which was stated as: PG SERVICE EXPERIENCE = β0 + β1*ROLE CLARITY + β2*PG RESEARCH CLIMATE + ε.

 

 

It is evident from Table 5 that the adjusted R2 is about 18.6 per cent, implying that the Role Clarity (RC) and the PG Research Climate (OCR) account for 18.6 per cent of the variation in the PG Service Experience (SERVEXP) of PG research students.

Table 6 reflects the ANOVA results for the following hypothesis:

H0: β0 = β1 = β2 = 0, against

H1: at least one of the βi is not zero.

From Table 6, it is evident that, at the 5 per cent significance level, H0 is rejected, and it can thus be concluded that at least one of the multiple regression coefficients is not zero.

 

 

Table 7 reflects the estimated regression coefficients for Model 1. The fitted model is thus: PG SERVICE EXPERIENCE = 2.686 + 0.345*RESEARCH CLIMATE + 0.056*ROLE CLARITY.

It is evident from Table 7 that for a unit change in the Role Clarity (RC), the PG Service Experience increases by 0.056 units, as long as the other variables remain constant. Likewise, for a unit change in the PG Research Climate (OCR), the PG Service Experience (SERVEXP) increases by 0.345 units, as long as the other variables remain constant. This implies that as the Role Clarity (RC) and the PG Research Climate (OCR) increase, so too does the PG Service Experience. It is also evident that the PG Research Climate (OCR) is significant at the 5 per cent level and hence is considered to influence the PG service experience. Furthermore, it is apparent that Role Clarity (RC) is not significant at the 5 per cent level in its influence on the PG Service Experience (SERVEXP). The variance inflation factors (VIF) for the independent variables are both less than 10, thus implying that there is no multi-collinearity present in the model.
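As an illustration of how Model 1 and its diagnostics could be reproduced outside SPSS, the sketch below fits the same regression with statsmodels and computes the variance inflation factors; the data file is hypothetical, and the commented prediction simply plugs values into the fitted equation reported in Table 7.

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.tools.tools import add_constant
from statsmodels.stats.outliers_influence import variance_inflation_factor

data = pd.read_csv("pg_composites.csv")  # hypothetical composite scores (SERVEXP, RC, OCR, ...)

# Model 1: SERVEXP = b0 + b1*RC + b2*OCR + error
m1 = smf.ols("SERVEXP ~ RC + OCR", data=data).fit()
print(m1.summary())  # adjusted R-squared, ANOVA F-test and coefficient estimates

# Variance inflation factors for the predictors (values below 10 suggest no multi-collinearity)
X = add_constant(data[["RC", "OCR"]])
for i, name in enumerate(X.columns):
    if name != "const":
        print(name, round(variance_inflation_factor(X.values, i), 2))

# Worked example using the fitted equation from Table 7:
# SERVEXP_hat = 2.686 + 0.345*OCR + 0.056*RC; with OCR = 4 and RC = 3 this gives about 4.23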

It is evident from Table 8 that the adjusted R2 is about 26.3 per cent, implying that the PG Research Climate (OCR) and the PG Research Experience (SERVEXP) account for 26.3 per cent of the variation in the PG Service Quality (PGSQUAL).

 

 

Table 9 reflects the ANOVA results for the following hypothesis:

H0: β0 = β1 = β2 = 0; against

H1: at least one of the βi is not zero.

From Table 9, it can be concluded that, at the 5 per cent level, H0 is rejected, and thus that at least one of the βi is not zero.

It is evident from the coefficient estimates in Table 10 that the fitted model could be stated as follows: PG SERVICE QUALITY = 0.866 + 0.346*RESEARCH CLIMATE + 0.318*PG SERVICE EXPERIENCE, which implies that a unit increase in the Research Climate (OCR) results in an increase of 0.346 units in the PG Service Quality, as long as all other variables in the model remain constant. Similarly, a unit increase in the PG Service Experience (SERVEXP) results in an increase of 0.318 units in the PG Service Quality (PGSQUAL), holding all other variables in the model constant. It is also apparent that the Research Climate (OCR) and the PG Service Experience (SERVEXP) are both significant at the 5 per cent level, implying that they influence the PG Service Quality (PGSQUAL). The variance inflation factors (VIF) for the independent variables are both less than 10, implying that there is no multi-collinearity present in the model.

The final model that was fitted to the data was stated as: PG SERVICE SATISFACTION = β0 + β1*ROLE CLARITY + β2*PG RESEARCH CLIMATE + β3*PG RESEARCH EXPERIENCE + β4*PG SERVICE QUALITY + ε. The aforementioned model produced a poor adjusted R2 and a non-significant ANOVA, implying that the data did not support this model, and the results are therefore not reported. Attempts were made to transform the dependent variable and refit the model, but to no avail. Although other models, such as a generalised linear model with a Poisson link, might be supported by the data, they were not considered, since this would have resulted in an unnecessary increase in the length of this paper.

The findings of multiple regression models 1 and 2 are consistent with certain of the SEM findings, in that the PG Research Climate (OCR) is significant in its influence on the PG Service Experience (SERVEXP) and the PG Service Quality (PGSQUAL). Regression diagnostics were also carried out for models 1 and 2 by examining the residuals, and the results revealed that none of the assumptions of the regression were violated. Firstly, a histogram of the residuals was inspected to check whether the normality assumption holds and, secondly, a P-P plot was examined. The residuals followed an approximately normal curve, and in the P-P plot a flattened S-shaped curve was noted for both models.
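The diagnostics described above could be reproduced along the following lines; this sketch refits Model 1 with statsmodels (hypothetical data file) and draws a residual histogram together with a normal probability plot, which serves the same purpose as the P-P plot examined by the authors.

import pandas as pd
import matplotlib.pyplot as plt
import scipy.stats as stats
import statsmodels.formula.api as smf

data = pd.read_csv("pg_composites.csv")              # hypothetical composite scores
m1 = smf.ols("SERVEXP ~ RC + OCR", data=data).fit()  # Model 1, as sketched earlier
resid = m1.resid

fig, axes = plt.subplots(1, 2, figsize=(9, 4))

# Histogram of residuals: roughly bell-shaped if the normality assumption holds
axes[0].hist(resid, bins=15, edgecolor="black")
axes[0].set_title("Histogram of residuals")

# Normal probability plot: points close to the reference line support normality
stats.probplot(resid, dist="norm", plot=axes[1])
axes[1].set_title("Normal probability plot of residuals")

plt.tight_layout()
plt.show()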

 

7 Concluding remarks, limitations of the study and recommendations for future research

As attention to service quality in higher education heightens, there needs to be a corresponding increase in the use of its assessment tools. This study is another step towards improving the management of a specialised and very important sector, namely postgraduate research. Attempting to improve the quality of a service significantly is a major undertaking for any organisation, and this is particularly true of PG research education. Given that Higher Education Institutions are charged with the responsibility of graduating students who are well grounded and can contribute to societal development and advancement, and that (in most countries) government research funding is contingent on 'on-time' PG student completion, this paper attempted to enhance our understanding of the PG service experience by developing and evaluating a conceptual PG research service encounter model. However, as with all research, there are a few limitations to this study.

One of the challenges associated with electronic surveys is respondents' lack of motivation to complete them. Alternative methods of reaching the respondents should be considered and, given that the response rate was not very satisfactory, the findings should be interpreted with caution and regarded as merely indicative and exploratory.

It must be stressed that the sample size was a limitation of the study. Future research should ideally be based on a larger and a representative sample that can adequately span the population of postgraduate research students and facilitate greater generalisation.

The study is based on the respondents' recollections of past events, and it may be possible that these are not accurate. A suggestion could be for future research to be conducted whilst the PG students are still active, and nearing the completion of their research.

 

References

ALRIDGE, S. & ROWLEY, J. 1998. Measuring customer satisfaction in higher education. Quality Assurance in Education, 6(4):197-204.

ANGELL, R.J., HEFFERNAN, T.W. & MEGICKS, P. 2008. Service quality in postgraduate education. Quality Assurance in Higher Education, 16(3):236-254.

BETTENCOURT, L.A. & GWINNER, K. 1996. Customization of the service experience: the role of the frontline employee. International Journal of Service Industry Management, 7(2):3-20.

BITNER, M.J., FARANDA, W.T., HUBBERT, A.R. & ZEITHAML, V.A. 1997. Customer contributions and roles in service delivery. International Journal of Service Industry Management, 8(3):193-205.

BITNER, M.J., BOOMS, B.M. & MOHR, L.A. 1994. Critical service encounters: the employee's viewpoint. Journal of Marketing, 58:95-106.

BOCK, G.W., ZMUD, R.W., KIM, Y.G. & LEE, J.N. 2005. Behavioural intention formation in knowledge sharing: examining the roles of extrinsic motivators, social-psychological forces, and organizational climate. MIS Quarterly, 29(1):87-111.

BOLLEN, K.A. 1989. Structural equations with latent variables. New York: John Wiley.

BOLTON, R.N. & DREW, J.H. 1991. A multi-stage model of customers' assessment of service quality and value. Journal of Consumer Research, 17(4):375-384.

BOSHOFF, H.C. & MELS, G. 1995. Role stress and job satisfaction: the supervisory antecedents and their influence on organizational commitment. Journal of Industrial Psychology, 21(1):25-32.

BYRNE, B.M. 2010. Structural equation modelling with AMOS: basic concepts, applications and programming. New York: Routledge/Taylor and Francis Group.

CARRILAT, F.A., JARAMILLO, F. & MULKI, J.P. 2009. Examining the impact of service quality: a meta-analysis of empirical evidence. Journal of Marketing Theory and Practice, 17(2):95-110.

CHASE, R.B. 1978. Where does the customer fit in the service operation? Harvard Business Review, 56(6):137-142.

CHATTERJEE, S. & PRICE, B. 1991. Regression analysis by example. New York: Wiley.

CHONKO, L.B., HOWELL, R.D. & BELLENGER, D.N. 1986. Congruence in sales force evaluations: relations to sales force perceptions of conflict and ambiguity. Journal of Personal Selling and Sales Management, 6(1):35-49.

COAKES, S. & STEED, L. 2003. SPSS analysis without anguish: version 11.0 for Windows. New York: Wiley.

CRONBACH, L.J. & MEEHL, P.E. 1955. Construct validity in psychological tests. Psychological Bulletin, 52:281-302.

CRONIN, J.J. & TAYLOR, S.A. 1992. Measuring service quality: a re-examination and extension. Journal of Marketing, 15(3):53-68.

DANN, S. 2008. Applying services marketing principles to postgraduate supervision. Quality Assurance in Higher Education, 16(4):333-346.

DAVIDSON, M.C.G. 2003. Does organizational climate add to service quality in hotels? International Journal of Contemporary Hospitality Management, 15(4):206-215.

DIETZ, J., PUGH, S.D. & WILEY, J.W. 2004. Service climate effects on consumer attitudes: an examination of boundary conditions. Academy of Management Journal, 47(1):81-92.

DRAPER, N.R. & SMITH, H. 1998. Applied regression analysis. New York: Wiley.

DRENNAN, J. 2008. Postgraduate research experience questionnaires: reliability and factor structure with master's in nursing students. JAN Research Methodology, 487-498.

GINNS, P., MARSH, H.W., BEHNIA, M., CHENG, J.H. & SCALAS, L.F. 2009. Using postgraduate students' evaluations of research experience to benchmark departments and faculties: issues and challenges. British Journal of Educational Psychology, 79:577-598.

GOVENDER, K.K. 1998. Managing service quality by managing the service encounter: the effects of organizational socialization strategies. Unpublished PhD thesis, University of Cape Town.

HAIR, M. 2006. Superqual: a tool to explore the initial expectations of PhD students and supervisors. Active Learning in Higher Education, 7(1):9-23.

HARDIN, A. & MARCOULIDES, G.A. 2011. A commentary on the use of formative measurement. Educational and Psychological Measurement, 71(5):753-764.

HOISINGTON, S. & NAUMANN, E. 2003. The loyalty elephant. Quality Progress, February:33-41.

HSIEH, A.T. & YEN, C.H. 2005. The effect of customer participation on service providers' job stress. The Service Industries Journal, 25(7):891-905.

JOHNSON, R.A. & WICHERN, D.W. 2007. Applied multivariate statistical analysis. USA: Prentice Hall.

KELLEY, S.W. 1987. Managing service quality: the organizational socialization of the service employee and the service customer. Unpublished DBA dissertation, Lexington, Kentucky: University of Kentucky.

KLINE, T. 1994. Psychological testing. London: SAGE Publications.

LE BLANC, G. & NGUYEN, N. 1997. Search for excellence in business education: an exploratory study of customers' impressions of service quality. International Journal of Educational Management, 11(2):72-79.

LEVITT, T. 1972. Production-line approach to service. Harvard Business Review, 50:41-49.

MAASSEN, G.H. & BAKKER, A.B. 2001. Suppressor variables in path models: definitions and interpretations. Sociological Methods and Research, 30(2):241-270.

MCCULLOCH, A. 2009. The student as co-producer: learning from public administration about the student-university relationship. Studies in Higher Education, 34(2):171-183.

MILLS, P.K., CHASE, R.B. & MARGULIES, N. 1983. Motivating the client/employee system as a service production strategy. Academy of Management Review, 8(2):726-735.

NUNNALY, J.C. & BERNSTEIN, I.H. 1994. Psychometric theory (3rd ed.) New York: McGraw-Hill.

PARASURAMAN, A., ZEITHAML, V.A. & BERRY, L.L. 1988. SERVQUAL: a multiple item scale for measuring customer perceptions of service quality. Journal of Retailing, 64(1):12-40.

QUESTIONPRO. [Online] Available at: http://www.QuestionPro.com.

SCHNEIDER, B. & BOWEN, D. 1995. Winning the service game. Boston: Harvard Business School Press.

SCHNEIDER, B., WHITE, S.S. & PAUL, M.C. 1998. Linking service climate and customer perceptions of service quality: test of a causal model. Journal of Applied Psychology, 83(2):150-163.

SINGH, J., VERBEKE, W. & RHOADS, G.K. 1996. Do organizational practices matter in role stress processes? A study of direct and moderating effects for marketing-oriented boundary spanners. Journal of Marketing, 60:69-96.

STEVENS, S.S. 1946. On the theory of scales of measurement. Science, 103:677-680.

STODNICK, M. & ROGERS, P. 2008. Using SERVQUAL to measure the quality of the classroom experience. Decision Sciences Journal of Innovative Education, 6(1):115-133.

TYAGI, P.K. 1982. Perceived organizational climate and the process of salesperson motivation. Journal of Marketing Research, 19(2):240-254.

 

 

Accepted: November 2012

 

 


Appendix A: Role Clarity measurement instrument (image available in the original publication; not reproduced here)

 

 


Appendix B: SERVEXP questionnaire (image available in the original publication; not reproduced here)
