
South African Journal of Higher Education

On-line version ISSN 1753-5913

S. Afr. J. High. Educ. vol.34 n.2 Stellenbosch  2020

http://dx.doi.org/10.20853/34-2-3662 

GENERAL ARTICLES

 

Institutional barriers to learning in the South African open distance learning context

 

 

E. O. MashileI; A. FynnII; M. MatoaneIII

ITuition Support and Facilitation of Learning, University of South Africa, Pretoria, South Africa. e-mail: mashieo@unisa.ac.za / http://orcid.org/0000-0001-7234-9031
IIDepartment of Institutional Research and Business Intelligence, University of South Africa, Pretoria, South Africa. e-mail: fynna@unisa.ac.za / http://orcid.org/0000-0002-0480-8926
IIIDirectorate: Instructional Support and Services, University of South Africa, Pretoria, South Africa. e-mail: matoamc@unisa.ac.za

 

 


ABSTRACT

The discourses surrounding student success and risk have shifted emphasis from the concept of risk as residing with student attributes, aptitudes and socio-economic history to the concept of contextualized risk, which is a result of the interaction between student, institution and the broader higher education context. This shift in discourse has led to a change in understanding the role that institutions play in "creating" at-risk students through institutional culture, procedures, policies and assumptions about the nature of teaching and learning. Institutional barriers to learning have received relatively little attention in the scholarship of teaching and learning when compared to the wider body of research on student-bound risk factors. Institutional barriers to learning refer to those institutional characteristics that, when combined with the attributes of the student body, create inadvertent barriers to successful study completion. This article makes use of frameworks derived from prior research into the barriers to learning and established models of student support to identify what students perceive as institutionally embedded barriers to teaching and learning in an institution in the South African open, distance and e-learning context.

Keywords: institutional barriers, ODL, barriers to learning, student support, distance education, factor analysis


 

 

INTRODUCTION

Traditional higher education practices and environments are often not based on the needs of present-day students, which may lead to insufficient commitment and student withdrawal (Machika 2013). Machika (2013) takes the stance that institutions must develop an understanding of students' individual and collective needs to create an environment conducive to achieving academic success. While a simple principle in the abstract, the diverse needs of the student population in higher education pose a challenge for the alignment of institutional and student needs (Machika 2013). The preceding statement does not problematise student diversity; rather, it attempts to draw attention to how universalist assumptions about student needs lead to the development of single systems that exclude or marginalise students who do not fit the underlying assumptions.

Numerous authors such as Tinto (1975; 1982; 2010), Swail (1995; 2007), Swail, Redd and Perna (2003), Astin (1975; 1984; 1997), Sweet (1986), Bean (1980) and Subotzky and Prinsloo (2009; 2011) acknowledge student success as a mutually constitutive process between student and institution. Student success is therefore seen as lying at the nexus of interaction between the student and the institution, each with their own characteristics, history, culture and practices (Subotzky and Prinsloo 2011). Institutional factors therefore play an integral role in student success and, by implication, in student risk. While this statement appears tautologous at first glance, embedded within it is the implication that institutions must examine their practices, policies, procedures and culture to differentiate between areas where the risk to student success rests with the institution and those where it rests with the student (Subotzky and Prinsloo 2011).

Given the foregoing, this article investigates students' perceptions of what they regard as barriers to their higher education studies in an ODL institution. The article forms part of a larger project comprising barometer studies of barriers to teaching and learning in ODL contexts and concomitant student support interventions to mediate the negative impacts of those barriers. It reports on the first phase of the project and focuses on measuring distance education students' perceptions of the primary barriers to teaching and learning in a large ODL institution in South Africa. The article reports on the development and testing of a Distance Education Barriers to Learning Instrument (DEBLI) and on the key teaching and learning barriers as perceived by students in a large South African ODL institution.

 

BACKGROUND

The literature on barriers to teaching and learning in distance education is dominated by research conducted in dual-mode distance education institutions (Guri-Rosenblit 2012). There is a dearth of studies focusing on single-mode distance education institutions. Furthermore, barriers to teaching and learning in distance education have been studied mostly from the perspective of teaching staff, administrators and managers (Muilenburg and Berge 2005). These studies are therefore biased towards the views of faculty or administrators. There is thus a need to obtain perceptions of barriers to teaching and learning in distance education from a student perspective (Berge, Muilenburg and Haneghan 2002). This article therefore focuses on understanding barriers to teaching and learning from the perspective of students studying at a dedicated distance education institution in South Africa.

South Africa is a developing country that needs a skilled populace for economic development, and post-school education is thus of importance. Universities are the main producers of a skilled labour force because the other post-school sectors are under-developed. Furthermore, there is only a limited number of public universities in the country (26). Public higher education is funded by government, and the amount of money made available limits the number of places and universities that can be provided. To compound the dire state of higher education, the bulk of students who can attend higher education come from lower socioeconomic backgrounds and need government bursaries and loans. To this end distance education is regarded as a mechanism to increase participation rates in higher education (Brown et al. 2013) without a concomitant increase in costs. Baijnath (2018) cites a 2017 Department of Higher Education and Training report indicating that, of all students enrolled in higher education, 379 732 (38.5%) were studying at a distance. Funding for distance education stands at 50 per cent of that for a full-time equivalent student, with moves to lower this even further.

Studies on distance education have shown that several factors are needed for the successful provision of education at a distance (Sayed and Baker 2014a), and the role played by technology in this regard has been widely acknowledged (Anderson and Dron 2011). In the current dispensation, ICT infrastructure is key for distance education (Elkaseh, Wong and Fung 2015; Guri-Rosenblit 2012; Tait 2018; Tait 2014). Institutional technological capabilities (infrastructure, bandwidth, technical support) are necessary to support electronically mediated distance education learning programmes (Naveed et al. 2017; Sayed and Baker 2014b). The network infrastructure in South Africa, however, is not equitably developed and access to broadband is very expensive. Fibre is available mostly in selected urban areas, with most broadband coverage coming from cell phone providers. The distance education institution on which the current study is based has partnered with major cell phone providers so that students incur no data costs when visiting its educational sites. The institution has also partnered with a range of Digital Access Centres (computer centres within communities) where students can use computer and internet facilities at no cost. Distance education in such an environment is thus constrained, and institutions need to weigh several factors to determine the mode of delivery and the kind of student support provided.

The foregoing context illustrates the need to understand barriers to teaching and learning in distance education from the perspective of students. Students enrol in distance education because there is limited space at other higher education institutions in the country. Although distance education caters for the more mature and self-directed learner (Bol and Garner 2011), new entrants to the institution are increasingly fresh from school (the 18-24-year cohort) and from a schooling sector that is still reeling from the effects of apartheid. The conceptual model used in this study thus incorporates these contextual factors but excludes factors that would not be critical in such circumstances.

The conceptual framework for this article is based on a three-dimensional model of student support that was developed by Swail (2007). This model is premised on a triadic relationship between cognitive (academia), social (personal) and institutional (administrative process) factors that are found both within the student and at the institution.

The three-dimensional model views student support as central to student success and was developed as an interrelated set of interactions between the institution and the student. In this article we use the three-dimensional model to inform the underlying links between the individual institutional processes that may emerge as barriers. From the range of barriers reported in previous studies, we selected only variables that flow from the conceptual model of this study and that have a bearing on students. The resultant instrument, called the Distance Education Barriers to Learning Instrument (DEBLI), was made up of 53 Likert-scale items measuring institutional culture and pedagogy; course and qualification relevance; communication with the institution; student support; and technological access.

 

RESEARCH METHOD

This article employs a quantitative design to provide a broad description of the presence, prevalence and students' perceptions of various barriers to teaching and learning. Exploratory factor analysis methods were used to determine institutional latent variables that are perceived by students as barriers to their learning. Exploratory factor analysis enabled us to provide broad descriptions of institutional barriers experienced by students during their studies in a distance education context. Descriptive approaches are most appropriate for studying phenomena where there is little pre-existing knowledge or evidence explaining the nature and mechanisms of the phenomenon. While there have been international studies on barriers to learning in ODL institutions (Berge et al. 2002; Cho and Berge 2002; Muilenburg and Berge 2005), relatively little work has been done in the ODL context in South Africa. Therefore, a descriptive approach provides the most appropriate basis to expand this area of inquiry in the South African context.

Sampling involved targeting approximately 10-20 per cent of the student population for inclusion in the first phase of the barriers project. A different set of students would be targeted for other phases of the project to avoid survey fatigue among students across the university. Data was collected in two separate academic periods during the course of 2018. In total, 17 322 emails were sent to students in the first semester inviting participation. The initial invitation was followed up by weekly reminders for the duration of the project, excluding examination periods. Of the 17 322 invitations, 653 surveys were started and 365 (56%) were recorded as complete or partially complete. In the second semester, 45 352 students were sent the invitation to participate, with weekly follow-ups to improve response rates. For the second semester, 1 014 surveys were started and 835 (63%) responses were recorded as complete or partially complete. The responses from both semesters were combined into a single dataset to create a response set of 1 338 from a sample of 60 864 invitations sent, a 2.2 per cent response rate. While the total responses are sufficient for statistical analyses of the pilot, future iterations of the project will need to improve recruitment techniques to ensure the response set is more representative.

 

DATA ANALYSIS

The data from the DEBLI was analysed with the SPSS computer program. The data obtained from the 53 Likert items collected on perceptions of barriers did not satisfy multivariate normality. Based on the characteristics of the collected dataset, we used the factor analysis strategy advocated by Osborne (2014), namely a Principal Axis Factoring extraction method with oblique rotation (direct oblimin). The scree plot was then used to determine the number of factors to retain. Multiple runs, including the number obtained from the scree plot (7), the a priori dimensions (5) and numbers below and above these, provided information on the number of meaningful factors in the dataset. The loading of items on factors was obtained from the pattern matrix after scrutinizing the structure matrix for possible correlations. Thereafter the internal consistency of the various scales was determined.
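As an illustration only (the analysis itself was run in SPSS), a minimal Python sketch of the same extraction strategy using the factor_analyzer package follows. The DataFrame name debli, the file name and the retention of seven factors are assumptions, and factor_analyzer's "principal" method is used here as the closest available analogue of Principal Axis Factoring.

import pandas as pd
from factor_analyzer import FactorAnalyzer

# Hypothetical file holding the 53 Likert items, one column per DEBLI item.
debli = pd.read_csv("debli_responses.csv")

# Principal-factor extraction with oblique (direct oblimin) rotation,
# retaining the seven factors suggested by the scree plot.
fa = FactorAnalyzer(n_factors=7, method="principal", rotation="oblimin")
fa.fit(debli)

# Eigenvalues of the item correlation matrix, used to draw the scree plot.
eigenvalues, _ = fa.get_eigenvalues()
print(eigenvalues[:10])

# Pattern matrix: loadings of each item on each rotated factor.
pattern = pd.DataFrame(fa.loadings_, index=debli.columns)
print(pattern.round(3))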

 

RESULTS

Factors underlying student perceptions of barriers

To determine the factorability of the DEBLI dataset we used the Kaiser-Meyer-Olkin measure of sampling adequacy (MSA) and Bartlett's test of sphericity (ToS). The MSA measures the extent to which a variable belongs to a set of variables and takes a value between 0 and 1. The MSA for the DEBLI dataset, at 0.924, exceeded the meritorious level of 0.8 (Costello and Osborne 2005), while the ToS was statistically significant at p < 0.001. These indicators provide evidence that the DEBLI dataset is suitable for factor analysis (Dziuban and Shirkey 1974).
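Continuing the illustrative Python sketch above (again an assumption about tooling, not the authors' SPSS workflow), the two factorability checks can be computed with factor_analyzer as follows.

from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

# Kaiser-Meyer-Olkin measure: per-item values and the overall MSA (reported as 0.924).
kmo_per_item, kmo_overall = calculate_kmo(debli)

# Bartlett's test of sphericity (reported as significant at p < 0.001).
chi_square, p_value = calculate_bartlett_sphericity(debli)

print(f"KMO = {kmo_overall:.3f}, Bartlett chi-square = {chi_square:.1f}, p = {p_value:.4f}")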

The data from the respondents yielded seven factors, all with initial eigenvalues greater than 1 (see Table 1).

Variables with loadings smaller than 0.32 were deleted. A loading of 0.32 corresponds to approximately 10 per cent overlapping variance between an item and the other items in that factor; this value therefore constitutes a good rule of thumb for the minimum acceptable loading (Osborne 2014). The deleted items were: structure of modules; combined module workload during the semester; teaching approach at the institution (ODL learning); language level used in study material; clarity on the final-year concession process; the qualification clearly links to career opportunities; clear procedures for module practicals; easy access to sites for module practicals; and scheduling for examinations.
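Within the same hypothetical sketch, the trimming rule could be applied to the pattern matrix as follows (pattern and debli are the assumed objects defined earlier, not the authors' variables).

# Drop every item whose largest absolute loading across the retained factors
# falls below the 0.32 rule of thumb (Osborne 2014).
weak_items = pattern.index[pattern.abs().max(axis=1) < 0.32]
debli_trimmed = debli.drop(columns=weak_items)
print(list(weak_items))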

Factor 1 - clarity around examinations

Factor 1 consists of five items with an alpha reliability of 0.776. The items loading on this factor are: the examination type relative to content (0.609); time allocations for the examination (0.446); assignments link clearly to examinations (0.490); the examination scope is clearly stated (0.373); and clarity of module outcomes (expectations) (0.325). These items deal with issues of clarity around examinations.
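The alpha reliabilities reported per factor are Cronbach's alpha values. A small sketch of the computation is shown below, continuing the hypothetical DataFrame from the earlier sketches; the five column names for the Factor 1 items are assumptions for illustration only.

import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scale)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical item names; the reported reliability for Factor 1 is 0.776.
factor1_items = debli_trimmed[["exam_type_vs_content", "exam_time_allocation",
                               "assignment_exam_link", "exam_scope_clarity",
                               "module_outcome_clarity"]]
print(round(cronbach_alpha(factor1_items), 3))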

The clarity around examinations factor accounted for the most variance in the DEBLI dataset. Distance education students are assumed to be geographically dispersed from the institution and to connect with the institution through at least two-way communication technologies (Borokhovski et al. 2014). Depending on how DE in a specific context is designed, students may have limited real-time opportunities to engage with instructors and peers to obtain clarity on important aspects of their courses (Burns, Cunningham and Foran-Mulcahy 2014). Therefore, these aspects must be explicitly communicated in the study material and embedded in student support activities. In other words, the role traditionally played by teacher-student interaction needs to be diffused into other interaction modes.

Interaction plays a critical role in supporting and even defining education (Anderson 2003; Xiao 2017). In formal distance education, students interact with course content, instructors and peers. The mix of interactions varies from one DE provider to another, and may also differ from one learning programme to another within the same DE provider. Getting the mix right is therefore an important consideration in course design. When all three interaction modes are present, they provide a more satisfying educational experience (Anderson 2003). In contexts where student-student interaction is not a key feature of the design, the other two modes of interaction assume a prominent role. Andrews et al. (2011) report that most teaching staff prefer to use institutional technologies for delivering information rather than harnessing interactive aspects that may support student-student interaction. Consequently, optimisation of communication with students through study material and instructors assumes an elevated role (Irani, Scherler, Harrington and Telg 2001).

Clarity on examination procedures, content and processes is paramount to success for DE students, since they often have to study in the midst of life challenges such as family, financial and social pressures. In situations where examinations at the end of a course largely determine overall success, the activity becomes high-stakes and of much concern to students. Selwyn (2011, 93) calls this "the competitive aspects of studying and producing assessed work". Students adopting this competitive orientation usually approach studying in relation to pass rates and final grades and are less concerned about other aspects of a course. A lack of clarity around examinations in such contexts may become a major barrier to learning for students. The results of this study provide support for the importance of adequate communication between the instructor and students on matters related to a course; in this instance, on matters relating to examinations. Respondents in this study therefore viewed inadequate communication and interaction as a barrier to their learning and success at the institution.

Factor 2 - instructor support with course content

Factor 2, named instructor support with course content, consists of seven items with an alpha reliability of 0.845. Items loading on this factor are: tutor availability for content clarification (0.536); lecturer availability for content clarification (0.594); quality of responses from tutors on module queries (0.641); quality of responses from lecturers on module queries (0.690); clarity of assignment feedback (0.545); timely assignment feedback (0.457); and quality of feedback given on assignments (0.627).

In DE, students spend considerable amounts of time interacting with content, and where areas of misunderstanding arise they need instructors to help resolve their specific issues.

Instruction in DE is conducted differently by various institutions. At the Open University in the United Kingdom, academics develop the study materials and do not interact with students, while tutors are responsible for facilitating students' learning (Goold, Coldwell and Craig 2010; Mills 2008). The system is similar at Athabasca University in Canada (Annand, Michalczuk and Thiessen 2009; Ives and Pringle 2013). In the South African institution under study, academics develop the study material and lead teaching and learning activities. E-tutors supplement the activities of academics by facilitating learning in smaller groups of students (Aluko and Hendrikz 2012; Matoane and Mashile 2013). The availability of instructors, whether tutors or lecturers, to deal with specific aspects of a course is critical since it provides a mechanism to mediate learning. The lack of instructor availability for interaction with DE students could thus be a barrier to their learning.

The adequacy of instructor support with course content goes beyond mere availability to the quality of the support provided. Since most DE provision adopts an asynchronous delivery mode (Tait 2014), real-time clarification of inquiries is limited; therefore, the initial response to an inquiry has an elevated status from the student's perspective. If students perceive the quality of responses as inferior, they may try to find other forums to resolve their inquiries or end up with unresolved inquiries that may impact negatively on their studies. In a meta-analysis study conducted by Means et al. (2013) it is reported that DE students who were involved in collaborative interactive learning and instructor-directed expository instructional conditions performed better than those engaged only with active self-study. The results of this study underscore the finding that DE students who do not have adequate support from instructors may experience this as a barrier in their learning.

DE courses may have a defined (limited) teaching period, specifically where semester models are involved. Communication between students and instructors thus needs to occur in a timely fashion. Since assignments are often used as formative assessment in DE and contribute to a student's progress in the course, the speed of feedback is important (Lee et al. 2011). Several assignments must be completed in one semester, and feedback on student performance in previous assessments feeds into subsequent work. If assignment feedback is not received in time, assessment for learning becomes compromised, and this may be a barrier to students' learning.

Factor 3 - Educational Technology (Learning Management System)

Factor 3 consists of five items with an alpha reliability of 0.804. Items loading on this factor are: retrieving qualification information from institutional webpage (0.726); retrieving support information from institutional webpage (0.677); usability (ease of use) of students' portal (0.648); support provided in students' portal (resetting passwords, technical) (0.558) and availability of the students' portal (offline or inaccessible) (0.579). These items relate to the educational technology used by the institution for its students' online portal. The portal provides the student with the learning management system (LMS), their grades, student support activities and related student services and accounts.

Educational technology constitutes the backbone of ODL provision (Selwyn 2011). In a developing country, the suitability of the infrastructure supporting teaching and learning is regarded as a significant barrier in the implementation of ODL (Elkaseh et al. 2015). Technological barriers arise when bandwidth capacity (Tedre, Apiola and Cronje 2011) limits availability of the LMS; when the interface (software) is cumbersome to use (Granic and Cukusic 2007); and when students do not receive appropriate support to interact with the technology (Cho 2012).

Bandwidth availability in South Africa is unequal, with the best coverage limited to the major urban areas. Institutional capacity to acquire and maintain data centres that cater for the activity of hundreds of thousands of students is limited, and this impacts on the availability of the LMS during critical cycles of the academic year. Non-availability of technological infrastructure in ODL contexts is thus an important barrier to students' learning.

The usability of the software used by ODL institutions is also a barrier to students' learning. Ease of access to institutional information on the LMS would thus aid students' learning experience. Most students are now familiar with new media such as social networking applications and mobile technologies (Andrews et al. 2011). Their expectation of the usability of an LMS could thus be shaped by how its interactive aspects are designed.

The institution under consideration enrols students from different socioeconomic backgrounds with varying levels of familiarity with computers and associated new media. DE provision that places a high emphasis on transacting with students and makes use of the LMS must ensure that students are properly supported in using the platform for the purpose of teaching and learning. Previous studies indicate that students may have difficulty in knowing how to harness educational technologies for teaching and learning purposes even though they may have high levels of literacy in the use of social media applications (Williams 2011, in Andrews et al. 2011; Brown et al. 2013).

The results of our study highlight the importance of creating a conducive environment for teaching and learning, a factor that has been found essential in mitigating barriers to learning. Factor three points to the importance of structuring the website for optimum ease of access and having a reliable LMS which contains clear and accessible student support information.

Factor 4 - access to technology

Factor 4, named Access to Technology, consists of four items with an alpha reliability of 0.876. Items loading on this factor are: cost of data for studying online (0.906); cost of data for downloading study material (0.854); access to a quality internet connection (0.754) and access to a computer for studying (0.613).

ODL institutions rely on ICTs to support teaching and learning (Mashile and Matoane 2016; Tait 2014). Previous meta-analysis studies indicate that "technology has an overall positive impact on learning" (Bernard et al. 2014). According to Liebenberg, Chetty and Prinsloo (2012), various technologies (television, video, print, etc.) may be used for teaching and learning and are thus regarded as educational technologies. The development of educational technology has also influenced the kinds of pedagogies used in DE. Garrison, Anderson and Archer (2000) indicate, for example, that the introduction of two-way communication technologies has supported (social) constructivist pedagogies and increased the number of courses using collaborative learning among students. Course designs increasingly integrate several educational technologies that help achieve learning outcomes akin to the competencies needed to function in the 21st century knowledge economy. As such, there is an increase in courses offered through either blended or online learning.

Course design in DE increasingly makes use of web-based or web-supported teaching and learning strategies. In a developing country where the cost of access to the internet is prohibitive, access to computers (devices) and connectivity become limiting factors for successful teaching and learning (Mills 2008). South Africa is a country with high levels of inequality, as measured by the Gini coefficient. Data costs are among the highest in the world and are unaffordable for students, specifically those who rely on government bursaries to finance their education. The costs of devices and connectivity are thus a significant barrier for many students.

Factor 5 - communication with administrative staff

Factor 5, named communication with administrative staff, with emphasis on communication that resolves student queries, consists of six items with an alpha reliability of 0.854. Items loading on this factor are: resolving communication on finance queries (0.637); resolving communication on registration queries (0.568); assistance with deferring examinations (0.688); resolving communication on examination results (0.598); office hours for administrative staff (0.515); and assistance with transferring credits (0.557).

There are many inter-related components in the DE system, and challenges associated with service delivery may arise in any one of them. A service-oriented view that treats students as customers is sometimes necessary to ensure that the various student support structures are sensitive to the needs of students (Mills 2008). DE institutions need to invest in infrastructure and human resources to address student-related inquiries sufficiently and timeously so as to mitigate against student dropout (Simpson 2003, in Mills 2008). Mills (2008) underscores the importance of customer relationship management systems in supporting students studying through DE. Athabasca University uses a call centre in one of its faculties to support students, including speedily resolving the issues they confront (Annand et al. 2009). Thus, a lack of, or ineffective, structures to address student enquiries is a barrier to student success.

Factor 6 - access to student support services

Factor 6, named access to student support services, consists of 13 items with an alpha reliability of 0.932. Items loading on this factor are: schedules for regional orientation to institution (-0.601); distance to regional centres for study support (-0.593); schedules for face-to-face tutorials (-0.592); office hours in regional centres for study support (-0.583); access to counselling services (-0.554); lack of clarity on support services (-0.552), availability of institutional computer facilities (-0.541); access to computer skills training (-0.513), awareness of support services (-0.500); clarity on how to access student support services (-0.499), access to reading and writing support (-0.486); access to Unisa telecentres (-0.482) and access to numeracy (mathematics) support (-0.429). All the items in this factor had negative loadings and thus the sign was reversed (Jolliffe and Bartholomew 2006).

Student support refers to all interactive activities and services employed by an institution to help students meet their learning needs and attain learning outcomes (Brindley, Walti and Zawacki-Richter 2008; Lee et al. 2011). Support activities span from the time students inquire about possible learning programmes, through teaching and tutorial activities, to post-graduation. The kinds of support services include tutoring, counselling, orientation to ODL, application and registration, infrastructure for the delivery of services such as regional offices, remedial activities that support students with higher education studies such as academic writing and numeracy support, and a range of other services depending on the educational environment of the institution.

A lack of quality support and services (Aluko and Hendrikz 2012; Lee et al. 2011) impacts on course completion times and retention, and thus constitutes a barrier in DE. The importance of student support in distance education is well documented (Tait 2003, in Brindley et al. 2008; Tait 2014).

Factor 7 - qualification issues

Factor 7, named qualification issues, consists of four items with an alpha reliability of 0.775. Items loading on this factor are: the rules for completing my qualification are easy to understand (-0.683); the modules prescribed for qualification are relevant (-0.455); the rules for selecting modules are clear (-0.697) and minimum credit requirements per year (academic exclusion) (-0.325).

Following from the earlier discussions on the importance of clear communication of content, assessment and student support, rules pertaining to curriculum structure and progression are equally critical. Students who do not understand these rules may take too long to graduate because they have taken unnecessary combinations of courses. This renders a lack of clear rules on curriculum structure and progression a barrier to success in DE.

Perceptions of barriers experienced by subgroups

Factor scores were calculated to determine the perceptions of subgroups in this study. "Factor scores are the composite (latent) scores for each subject on each factor ... and are useful for conversion of large sets of measured variables into a smaller set of composite constructs for further inquiry" (Odum 2011, 5). We used the regression method of computing factor scores, since it provides more exact standardized scores and maximizes validity (Distefano, Zhu and Mîndrila 2009).
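Purely as an illustration within the same hypothetical Python sketch (the article's scores were computed in SPSS using the regression method), per-respondent factor scores can be obtained from the fitted model and their medians inspected, as in Table 2.

# Refit on the trimmed item set, then obtain standardized composite scores
# (one column per retained factor) for each respondent.
fa.fit(debli_trimmed)
scores = pd.DataFrame(fa.transform(debli_trimmed),
                      columns=[f"factor_{i + 1}" for i in range(7)])

# Medians of the composite scores, used to rank the severity of the barrier factors.
print(scores.median().sort_values(ascending=False))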

Descriptive data were analysed to determine the overall priority of student barriers (see Table 2). The information in this table consists of standardized z-scores; therefore, the medians are interpreted (Distefano et al. 2009). The DEBLI scales rate 1 as no barrier at all while 5 represents a consistent barrier. The most severe barrier factor therefore has the highest median. The factor perceived by the respondents as the most severe barrier is factor 7, which deals with qualification issues. The factor perceived to constitute the least severe barrier is factor 3, which deals with educational technology.

Factor scores must be used with great care in further analysis (Distefano et al. 2009). Consequently, the data was screened to ensure that it met the assumptions of the further statistics to be used. The factor score data did not meet the assumptions underlying the use of linear regression (Zuccaro 2010); therefore, non-parametric measures were used. After inspecting the nature of the factor score variables and the categorical nature of the grouping variables, Kruskal-Wallis tests were computed. Kruskal-Wallis tests were used to determine differences in factor scores based on gender, race, employment status and college affiliation (see Table 3). Gender did not significantly affect any of the factor scores of the DEBLI dataset. Race significantly affected factors 2, 3, 4, 5 and 6. Employment status significantly affected only factor 4, and college affiliation affected only factors 1 and 2.
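A short sketch of such a group comparison under the same hypothetical setup, using scipy's Kruskal-Wallis test; the demographics DataFrame and its race column are assumptions introduced only for illustration.

from scipy.stats import kruskal

# Hypothetical demographics frame aligned with the respondents in `scores`.
df = scores.assign(race=demographics["race"].values)

# Kruskal-Wallis H test of each factor score across the race groups.
for factor in [c for c in df.columns if c.startswith("factor_")]:
    groups = [g[factor].dropna().values for _, g in df.groupby("race")]
    h_stat, p_value = kruskal(*groups)
    print(f"{factor}: H = {h_stat:.2f}, p = {p_value:.4f}")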

 

DISCUSSION

The analysis yielded several interesting points for consideration regarding barriers to learning in DE in the South African context. While clarity around examinations explained the largest proportion of the variance (25%), the factor that represented the most severe barrier was qualification issues. Within the context of the institution, the latter finding may explain the disjuncture between the relatively high success rates demonstrated within individual courses and the low throughput rate in qualifications. While there have been substantial studies exploring the various demographic factors that impact on student success (Beaudoin and Kumar 2012; De Hart and Venter 2013; Stewart, Lim and Kim 2015; Yasmin 2013), there has been little focus on the underlying capacity of students to appropriately select and sequence courses to ensure optimal progression through the institutional academic rules. Within modular curricula, students can self-pace their learning by combining modules in a structured qualification framework. However, where the rules for selecting or combining individual courses are formulated in technical jargon, or are altered without effective communication of the implications of the change, students may make uninformed decisions on the specific sequencing and, consequently, the scaffolding of their learning within the broader qualification. We argue that the findings of this study, in combination with the chronically low throughput experienced at the institution, indicate that student success is not a linear progression from one course to the next. Rather, the implicit assumptions and curriculum knowledge of how qualifications are constituted need to be made explicit and accessible to students in a manner that effectively inducts them into the disciplinary discourses embedded in the qualification design. This may be done through foundation modules that focus on orienting students to the epistemological assumptions embedded in the discipline and the expected learning transfer between courses. Another avenue may be a curated recommender system that provides students with a simple but comprehensive guideline to sequencing their course selections to ensure successful qualification completion.

While this study provides an understanding of barriers that are specific to studying at an open distance education institution located in a developing country, these barriers can equally apply to other similar contexts. Once all the stages of the larger project have been completed, it is hoped that the project's findings will allow for greater levels of confidence in generalisation. This research is a first step towards closing the gap that exists in the literature on barriers to learning within higher education, which has relied heavily on data from contact institutions. Additionally, the findings of this research give voice to students' perceptions of the barriers that impact on their own success and throughput, adding to the voices of lecturers and practitioners in the field.

 

REFERENCES

Aluko, R. and J. Hendrikz. 2012. Supporting distance education students: The pilot study of a tutorial model and its impact on students' performance. Progressio 34(2): 68-83.

Anderson, T. 2003. Getting the mix right again: An updated and theoretical rationale for interaction. International Review of Research in Open and Distance Learning 4(2). https://doi.org/10.19173/irrodl.v4i2.149

Anderson, T. and J. Dron. 2011. Three generations of distance education pedagogy. International Review of Research in Open and Distance Learning 12(3): 80-97.

Andrews, T., B. Davidson, A. Hill, D. Sloane and L. Woodhouse. 2011. Using students' own mobile technologies to support clinical competency development in speech pathology. In Models for interdisciplinary mobile learning: Delivering information to students, ed. A. Kitchenham, 247-264. Hershey, PA: Information Science Reference.

Annand, D. G., K. L. Michalczuk and J. K. Thiessen. 2009. Evaluating the relative efficiencies and effectiveness of the contact centre and tutor models of learner support at Athabasca University. Academic and Professional Development Fund Report 2009-2010. Athabasca University Library & Scholarly Resources.

Astin, A. W. 1975. Preventing students from dropping out. San Francisco, CA: Jossey-Bass.

Astin, A. W. 1984. Student involvement: A developmental theory for higher education. Journal of College Student Development 25: 297-308.

Astin, A. W. 1997. How "good" is your institution's retention rate? Research in Higher Education 38(6): 647-658.

Baijnath, N. 2018. Learning for development in the context of South Africa: Considerations for open education resources in improving higher education outcomes. Journal of Learning for Development 5(2): 87-100.

Bean, J. P. 1980. Dropouts and turnover: The synthesis and test of a causal model of student attrition. Research in Higher Education 12(2): 155-187. https://doi.org/10.1007/BF00976194

Beaudoin, B. and P. Kumar. 2012. Using data to identify at-risk students and develop retention strategies. Washington, DC.

Berge, Z. L., L. Y. Muilenburg and J. Haneghan. 2002. Barriers to distance education and training: Survey results. The Quarterly Review of Distance Education 3(4): 409-418.

Bol, L. and J. K. Garner. 2011. Challenges in supporting self-regulation in distance education environments. Journal of Computing in Higher Education 23(2-3): 104-123. https://doi.org/10.1007/s12528-011-9046-7

Borokhovski, E., P. C. Abrami, R. M. Bernard, R. F. Schmid and R. M. Tamim. 2014. A meta-analysis of blended learning and technology use in higher education: From the general to the applied. Journal of Computing in Higher Education 26(1): 87-122. https://doi.org/10.1007/s12528-013-9077-3

Brindley, J. E., C. Walti and O. Zawacki-Richter. 2008. Learner support in open, distance and online learning environments. BIS. https://doi.org/10.1080/0158791

Brown, M., H. Hughes, M. Keppell, N. Hard and L. Smith. 2013. In their own words: Student stories of seeking learning support. Open Praxis 5(4): 345-354. https://doi.org/10.5944/openpraxis.5.4.87

Burns, S., J. Cunningham and K. Foran-Mulcahy. 2014. Asynchronous online instruction: Creative collaboration for virtual support. CEA Critic 76(1): 114-131.

Cho, M. H. 2012. Online student orientation in higher education: A developmental study. Educational Technology Research and Development 60(6): 1051-1069. https://doi.org/10.1007/s11423-012-9271-4

Cho, S. K. and Z. L. Berge. 2002. Overcoming barriers to distance training and education. Education at a Distance [USDLA Journal] 16(1): 1-12.

Costello, A. B. and J. Osborne. 2005. Best practices in exploratory factor analysis: Four recommendations for getting the most from your analysis. Practical Assessment, Research & Evaluation 10(7): 1-9.

De Hart, K. and J. Venter. 2013. Comparison of urban and rural dropout rates of distance students. Perspectives in Education 31(1): 66-76.

Distefano, C., M. Zhu and D. Mindrila. 2009. Understanding and using factor scores: Considerations for the applied researcher. Practical Assessment, Research & Evaluation 14(20): 1-11.

Dziuban, C. D. and E. C. Shirkey. 1974. When is a correlation matrix appropriate for factor analysis? Psychological Bulletin 81(6): 358-361.

Elkaseh, A., K. W. Wong and C. C. Fung. 2015. A review of the critical success factors of implementing e-learning in higher education 22(2).

Garrison, D. R., T. Anderson and W. Archer. 2000. Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education 2(2-3): 87-105. https://doi.org/10.1016/S1096-7516(00)00016-6

Goold, A., J. Coldwell and A. Craig. 2010. An examination of the role of the e-tutor. Australasian Journal of Educational Technology. https://doi.org/10.14742/ajet.1060

Granic, A. and M. Cukusic. 2007. Universal design within the context of e-learning. Access: 617-626.

Guri-Rosenblit, S. 2012. Open/distance teaching universities worldwide: Current challenges and future prospects. Magazyn Edukacji Elektronicznej 2(4): 4-12. http://wyrwidab.come.uw.edu.pl/ojs/index.php/eduakcja/article/viewFile/80/50

Irani, T., C. Scherler, M. Harrington and R. Telg. 2001. Overcoming barriers to learning in distance education: The effects of personality type and course perceptions on student performance. National Agricultural Education Research Conference, 13 December 2004.

Ives, C. and M. M. Pringle. 2013. Moving to open educational resources at Athabasca University: A case study. International Review of Research in Open and Distance Learning 14(2). https://doi.org/10.19173/irrodl.v14i2.1534

Jolliffe, I. and D. J. Bartholomew. 2006. Latent variable models and factor analysis. In Applied Statistics Vol. 38. https://doi.org/10.2307/2347739

Lee, S. J., S. Srinivasan, T. Trail, D. Lewis and S. Lopez. 2011. Examining the relationship among student perception of support, course satisfaction, and learning outcomes in online learning. Internet and Higher Education 14(3): 158-163. https://doi.org/10.1016/j.iheduc.2011.04.001

Liebenberg, H., Y. Chetty and P. Prinsloo. 2012. Student access to and skills in using technology in an open and distance learning context. The International Review of Research in Open and Distance Learning 13(4): 250-268. https://doi.org/10.19173/irrodl.v13i4.1303

Machika, P. 2013. The alignment of institutional and student commitment to student needs. Progressio 35(1): 91-103.

Mashile, E. O. and M. C. Matoane. 2016. Leadership in ODL institutions: An Ubuntu perspective. In Open distance learning through the philosophy of Ubuntu, 108-123. New York: Nova Science Publishers.

Matoane, M. C. and E. O. Mashile. 2013. Key considerations for successful e-tutoring: Lessons learnt from an institution of higher learning in South Africa. In Proceedings of E-Learn: World Conference on E-Learning in Corporate, Government, Healthcare, and Higher Education 2013, ed. T. Bastiaens and G. Marks, 863-871.

Means, B., Y. Toyama, R. Murphy and M. Baki. 2013. Effectiveness of online and blended learning. Teachers College Record 115(030303): 1-47.

Mills, R. 2008. Looking back, looking forward: What have we learned? In Learner support in open, distance and online learning environments, 29-38.

Muilenburg, L. Y. and Z. L. Berge. 2005. Student barriers to online learning: A factor analytic study. Distance Education 26(1): 29-48. https://doi.org/10.1080/01587910500081269

Naveed, Q. N., A. Muhammed, S. Sanober, M. R. N. Qureshi and A. Shah. 2017. Barriers effecting successful implementation of e-learning in Saudi Arabian universities. International Journal of Emerging Technologies in Learning 12(6): 94-107. https://doi.org/10.3991/ijet.v12i06.7003

Odum, M. 2011. Factor scores, structure and communality coefficients: A primer. Annual Meeting of the Southwest Educational Research Association. http://ir.obihiro.ac.jp/dspace/handle/10322/3933

Osborne, J. 2014. Best practices in exploratory factor analysis. In Best practices in quantitative methods, ed. J. Osborne, A. B. Costello and J. T. Kellow, 86-99. https://doi.org/10.4135/9781412995627.d8

Sayed, M. and F. Baker. 2014a. Blended learning barriers: An investigation, exposition and solutions. Journal of Education and Practice 5(6): 81-85. http://iiste.org/Journals/index.php/JEP/article/view/11212

Sayed, M. and F. Baker. 2014b. Blended learning barriers: An investigation, exposition and solutions. Journal of Education and Practice 5(6): 81-85.

Selwyn, N. 2011. Digitally distanced learning: A study of international distance learners' (non)use of technology. Distance Education 32(1): 85-99. https://doi.org/10.1080/01587919.2011.565500

Stewart, S., D. H. Lim and J. Kim. 2015. Factors influencing college persistence for first-time students. Journal of Developmental Education 38(3): 12-20.

Subotzky, G. and P. Prinsloo. 2009. Towards a conceptual model of retention and success in distance education: The case of the University of South Africa. NADEOSA Annual Conference, Pretoria, South Africa, 17-18 August.

Subotzky, G. and P. Prinsloo. 2011. Turning the tide: A socio-critical model and framework for improving student success in open distance learning at the University of South Africa. Distance Education 32(2): 177-193. https://doi.org/10.1080/01587919.2011.584846

Swail, W. S. 1995. The development of a conceptual framework to increase student retention in science, engineering and mathematics programmes at minority institutions of higher education. George Washington University.

Swail, W. S. 2007. The art of student retention. Vol. 1. Austin, TX: Educational Policy Institute.

Swail, W. S., K. E. Redd and L. W. Perna. 2003. Retaining minority students in higher education: A framework for success. ASHE-ERIC Higher Education Report Vol. 30. https://doi.org/10.1002/aehe.3002

Sweet, R. 1986. Student dropout in distance education: An application of Tinto's model. Distance Education 7(2): 201-213. https://doi.org/10.1080/0158791860070204

Tait, A. 2018. Education for development: From distance to open education. Journal of Learning for Development 5(2): 101-115. http://www.jl4d.org/index.php/ejl4d/article/view/294/313

Tait, A. W. 2014. From place to virtual space: Reconfiguring student support for distance and e-learning in the digital age. Open Praxis 6(1): 5-16. https://doi.org/10.5944/openpraxis.6.1.102

Tedre, M., M. Apiola and J. C. Cronje. 2011. Towards a systemic view of educational technology in developing regions. IEEE AFRICON Conference. https://doi.org/10.1109/AFRCON.2011.6072012

Tinto, V. 1975. Dropout from higher education: A theoretical synthesis of recent research. Review of Educational Research 45(1): 89-125.

Tinto, V. 1982. Limits of theory and practice in student attrition. The Journal of Higher Education 53(6): 687-700. https://doi.org/10.2307/1981525

Tinto, V. 2010. Higher education: Handbook of theory and research. Vol. 25. https://doi.org/10.1007/978-90-481-8598-6

Xiao, J. 2017. Learner-content interaction in distance education: The weakest link in interaction research. Distance Education 38(1): 123-135. https://doi.org/10.1080/01587919.2017.1298982

Yasmin, D. 2013. Application of the classification tree model in predicting learner dropout behaviour in open and distance learning. Distance Education 34(2): 218-231. https://doi.org/10.1080/01587919.2013.793642

Zuccaro, C. 2010. Statistical alchemy: The misuse of factor scores in linear regression. International Journal of Market Research 52(4): 511-531. https://doi.org/10.2501/s1470785309201429
