    South African Journal of Higher Education

    On-line version ISSN 1753-5913

    S. Afr. J. High. Educ. vol.39 n.3 Stellenbosch Jul. 2025

    https://doi.org/10.20853/39-3-6272 

    GENERAL ARTICLES

     

    Opportunities and challenges of generative artificial intelligence supporting research in African classrooms

     

     

    F. S. Modiba (I); A. Van den Berg (II); S. Mago (III)

    (I) Department of Development Studies, Nelson Mandela University, Gqeberha, South Africa. https://orcid.org/0000-000-6904-067X
    (II) Department of Development Studies, Nelson Mandela University, Gqeberha, South Africa. https://orcid.org/0000-0002-4356-8865
    (III) Department of Development Studies, Nelson Mandela University, Gqeberha, South Africa. https://orcid.org/0000-0003-1459-3065

     

     


    ABSTRACT

    Generative artificial intelligence (AI) tools have sparked debates in the education sector, prompting researchers to explore their desirability and potential in education. This paper acknowledges generative AI's potential to support the delivery of teaching, learning and research in higher education, emphasising its ability to improve student writing quality as well as academic productivity, success rates, and independence. However, responsible use of these AI tools to support research is also crucial. Furthermore, the challenges associated with AI tool use, especially accessibility and usage in the African context, are recognised. These include ethical challenges relating to the (mis)use of AI where no or inadequate policy regulations have been implemented, as well as technical and structural challenges relating to connectivity, power outages, device access and technical know-how. Therefore, this paper aims to identify the opportunities and challenges associated with using AI tools to support research in African higher education classrooms. For the study, a qualitative systematic literature review was applied, with thematic analysis conducted on two articles drawn from a final selection of 29. Findings indicated that generative AI tools could enhance student writing skills and increase productivity. Additionally, they could lead to research autonomy, improved writing proficiency, quality, and academic throughput. Shortcomings included AI misuse, knowledge deficiencies, and infrastructural challenges preventing AI access. Inadequate regulations relating to the use of generative AI tools for learning and teaching were a further challenge. It is essential to address the ethical concerns, invest in skills development and promote equitable digital access, especially in Africa, where this is limited. In addition, the capability approach revealed how the digital divide limits the adoption of generative AI tools in Africa.

    Keywords: Generative artificial intelligence, research support, teaching, Africa, capability approach.


     

     

    INTRODUCTION AND BACKGROUND

    The presence of generative artificial intelligence (AI) and its infiltration into academic realms has incited both fear and hope among learners and educators as more institutions rely on digital technologies for learning (Antonenko and Abramowitz 2023, 70; Shipepe et al. 2021, 1). These mixed feelings are due to AI's ability to offer tools for learning and teaching, while being cognisant that the evolving demands of education necessitate innovation and creativity in the learning process (Fitria 2021, 139). This is supported by Lv (2023, 207), who argues that the efficiency and quality of content production and dissemination can be improved through generative AI technologies like ChatGPT, DALL-E, DeepMind and Bard.

    Currently, generative AI technologies are being used in academic institutions for computing studies and across other disciplines. They are also valued for their use of natural language processing, which offers increased accuracy and efficiency (Lv 2023, 208; Yang 2023, 232). As such, AI technologies are being used to support educators in their teaching and administrative roles. For example, less time is spent on complex administrative matters, which are resolved through AI's automation capabilities, thus allowing academics to focus on student needs (Omorogiuwa, Ohiagu, and Lawal 2023, 5).

    Generative AI tools have been cited as assisting with the grading of assessments, tutoring activities, and student learning. In addition, they can provide formative and summative feedback on assessments while ensuring that in-depth reviews are provided (Hooda et al. 2022, 4). As a result, formative assessments receive timely feedback, which allows educators to monitor and design activities to support identified at-risk students and to improve their understanding of course content and material (Hooda et al. 2022, 4; Yang 2023, 243). It is also argued that AI's grading capabilities are accurate and unbiased, ensuring constructive and critical feedback to students (Yang 2023, 243). Personalised learning is another strength of AI tools, supporting learners (Hooda et al. 2022, 6; Luckin and Cukurova 2019, 2830). Additionally, students can benefit from these AI technologies by using language support tools such as grammar, editing, speech and text translation, which help overcome communication difficulties. The tools further assist with detecting plagiarism, which is essential in developing student writing and research practices (Wang et al. 2023, 5).

    However, there are structural and technical challenges related to generative AI use. For example, Luckin and Cukurova (2019, 2836) argue that these tools can reinforce biases and perpetuate inequalities, particularly if the data used to support the generative AI are biased or incomplete. Luckin and Cukurova (2019, 2836) also cite the potential for generative AI to replace human teachers as a challenge, which could negatively affect student learning and well-being. Additionally, there are concerns regarding the ethical and responsible use of generative AI in education, particularly data privacy and security, as it is not clear how student data privacy and security are managed when using such tools in assessment and feedback (Hooda et al. 2022, 3). Scholars also emphasise the need for educators and various stakeholders to have a better understanding of generative AI and its effective and safe use (Singh 2023, 218; Walczak and Cellary 2023, 96).

    As there are ethical issues relating to generative AI use, there is a need to ensure that modern technologies are utilised responsibly by not creating harmful applications or digital divides (Hooda et al. 2022, 2). In addition, these tools require educators to develop new skills and knowledge to use AI effectively in their teaching and assessment practices (Antonenko and Abramowitz 2023, 12). The lack of bias, cited above as a strength, is questioned by some scholars when it comes to user equality (Goldenthal et al. 2021, 5-6). For Hodgson et al. (2021, 1882), as generative AI technologies are created by humans, they can inadvertently have human biases and structural inequities built into them. Therefore, AI algorithms could potentially reinforce racial and gender stereotypes and inequalities (Dwivedi et al. 2023, 49).

    Other scholars have criticised generative AI tools like ChatGPT for disempowering student learning abilities. Yilmaz and Yilmaz (2023, 6) identify that ChatGPT can provide faulty information; therefore, students should be required to verify the generated content. Moreover, Yilmaz and Yilmaz (2023, 6) emphasise the importance of evaluating the ethical appropriateness of the data generated, which means students need to be mentored on generative AI tool use to support their learning activities (Qadir 2023, 1). Consequently, higher education institutions (HEIs) should lead in generative AI use by capacitating educators and students to effectively integrate these practices into their teaching and learning contexts (Sandu, Karim, and Kayastha 2021, 110).

    For Walczak and Cellary (2023, 73), there will always be a need to balance autonomous and self-organising AI systems with the planning and controlling roles of humans. As a result, future employees need to be able to work with and understand AI systems, which requires new types of creativity, communication and digital skills. However, there are concerns that reliance on generative AI might cause users to lose knowledge and the ability to think logically (Yilmaz and Yilmaz 2023, 6). As a result, education systems need to facilitate generative AI adoption by building and developing students' critical thinking and problem-solving skills (Dwivedi et al. 2023, 10).

    In addition, Gwagwa et al. (2020, 3) suggest that generative AI can support research by analysing large datasets and identifying patterns or correlations that may not be immediately apparent to human researchers. This functionality can help researchers make more accurate predictions, develop new treatments or interventions, and gain a deeper understanding of complex systems. Additionally, the tools can automate certain tasks, such as data entry or image analysis, saving researchers time and reducing the risk of error. Overall, generative AI can revolutionise the way in which research is conducted and accelerate the pace of scientific discovery.

    Generative AI has been adopted in some African classrooms both to assist learning and as the subject of instruction. For example, a Namibian study by Shipepe et al. (2021, 1) showed how a course had been designed to include African indigenous knowledge as part of problem solving. While these studies illustrate how AI was embraced in course development to support learning and teaching, there is still insufficient published evidence on its use to support research in the African context (Omorogiuwa et al. 2023, 14; Shipepe et al. 2021, 2).

    Cisse's (2018, 461) work emphasises that diversity plays a pivotal role in AI research. However, the bulk of AI expertise is concentrated in North America, Europe and Asia, with limited representation from Africa. This lack of diversity can perpetuate algorithmic biases and embed discrimination within AI products. Furthermore, a scarcity of African AI researchers translates to fewer opportunities to leverage generative AI to enhance the well-being of African communities. Consequently, broadening the geographical scope of AI research and fostering diversity within the field can serve as a vital strategy to mitigate these challenges and enhance AI research's overall quality and influence (Cisse 2018, 461).

    However, the COVID-19 pandemic induced a reliance on online teaching and learning, which provided a unique backdrop against which to examine generative AI tools in African classrooms. The rapid digitisation of education in 2020 created new opportunities and challenges and, in 2021, these efforts became more streamlined and resulted in a more permanent shift to digitalised classroom use. This transition made it essential to assess how AI technologies could be harnessed to support teaching, learning and research in this evolving landscape. Therefore, recent research conducted between 2021 and 2023 needs to be examined to provide a comprehensive understanding of the current state of generative AI usage in African classrooms and its implications for education in African contexts.

    While there are many studies considering the use of generative AI in education (Annamalai et al. 2023, 1; Yilmaz and Yilmaz 2023, 2), papers that focus on its use in African classrooms are limited since this technology is still new in African contexts (UNESCO, Fengchun, and Wayne 2023, 34). Shipepe et al. (2021, 2) also indicate that there is limited expertise in teaching and designing courses on generative AI. Globally, the focus of AI in higher education has been mostly on computing studies, engineering and education (Antonenko and Abramowitz 2023, 66; Hooda et al. 2022, 9; Luckin and Cukurova 2019, 2825), indicating a need for broader usage in other disciplines and for studies exploring this usage.

    Thus, this paper contributes to the scientific body of knowledge in the social sciences as well as in business and economic sciences. It aims to identify the opportunities and challenges of using generative AI tools to support the teaching of research in African classrooms. To achieve this purpose, the following research objectives are outlined:

    To determine the contribution and value AI tools offer in supporting the delivery and teaching of research in African classrooms,

    To examine the limitations linked to the integration of generative AI tools within the African educational landscape,

    To employ an integrative approach that combines the Capability Approach (CA) and the Technology Acceptance Model (TAM) to gain a comprehensive understanding of AI tool use in research-based teaching.

    By addressing these research objectives, this study aims to highlight both prospects and challenges of leveraging generative AI technology to enhance research instruction in African teaching and learning contexts.

     

    CAPABILITIES APPROACH AND TECHNOLOGY ACCEPTANCE MODEL (TAM)

    Technology adoption is itself a challenge, particularly in contexts faced with a myriad of socio-economic issues. Therefore, the capability approach and the technology acceptance model are used in this paper to understand the adoption of generative AI in higher education so that it does not add to the existing problems experienced by educators and learners. The capability approach (CA) is concerned with the cognitive and physical abilities to do certain things based on choices (Sen 1990, 320-321). These human capabilities include skills and are dependent on other enablers such as resources (economic, infrastructural, and socio-political). Therefore, resources can be enablers of or impediments to people exercising their capabilities and choosing what is important to improve their lives. In this context, limited skills and resources can affect educators' and students' choices to exercise their capabilities, which limits what they are able to achieve (Sen 1990, 320). The approach also recognises that although people have different starting points and resources, social arrangements and policies should expand their capabilities and opportunities (Gigler 2004, 6).

    The use of any technology depends on the user's acceptance or rejection of it. The technology acceptance model (TAM) is a widely accepted model that explains the predictors of technology users' behaviour. It has also been used in different contexts where acceptance or rejection is anticipated (Granic and Marangunic 2019, 2573). For example, AI is viewed as a disruptor in all fields of study, including education. The TAM can be used to assess how computing technologies are viewed by intended users, as it is based on determinants such as perceived usefulness and perceived ease of use, which influence a technology's use (Davis 1989, 320; Murillo, Novoa-Hernández, and Rodríguez 2021, 618). For example, when a technology can help people perform better, it is likely to be accepted as it is considered useful.

    However, if a technology is not user friendly, people will likely be deterred from using it despite its potential value in improving performance. As a result, usability and benefits influence attitudes towards accepting technologies (Davis 1989, 334; Murillo et al. 2021, 618-619). It is also argued that external factors influence users' behaviour in accepting or rejecting a technological tool, as noted in TAM 2 (Venkatesh and Davis 1996, 453). Therefore, TAM 2 was adopted in this study, as external determinants (for example, an HEI's attitude towards AI, capabilities and infrastructure) shape people's (in this case, students') reactions towards generative AI technology use.
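    To make these determinants concrete, the minimal sketch below shows one way a TAM 2-style behavioural-intention score could be computed from Likert-scale survey items. The function name, item wording and weights are illustrative assumptions, not an instrument used in the reviewed studies or a standard TAM estimation procedure (which would normally rely on regression or structural equation modelling).

```python
from statistics import mean

def behavioural_intention(pu_items, peou_items, external_items,
                          w_pu=0.5, w_peou=0.3, w_ext=0.2):
    """Illustrative TAM 2-style score: combine perceived usefulness (PU),
    perceived ease of use (PEOU) and external determinants into a single
    behavioural-intention score on the same 1-5 Likert scale.
    The weights are placeholders, not empirically estimated coefficients."""
    pu = mean(pu_items)          # e.g. "Generative AI improves my research writing"
    peou = mean(peou_items)      # e.g. "Generative AI tools are easy to use"
    ext = mean(external_items)   # e.g. institutional attitude, infrastructure access
    return w_pu * pu + w_peou * peou + w_ext * ext

# Hypothetical student rating items on a 1-5 scale
score = behavioural_intention(
    pu_items=[4, 5, 4],
    peou_items=[3, 4],
    external_items=[2, 3],   # poor connectivity and lukewarm institutional support
)
print(f"Behavioural intention: {score:.2f}")  # higher values suggest likelier adoption
```

    In such a toy operationalisation, low scores on the external determinants pull the overall intention down even when perceived usefulness is high, mirroring the argument made throughout this paper about infrastructure and institutional attitudes.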

     

    LITERATURE REVIEW

    AI use in various fields poses benefits and challenges that must be understood in African contexts if Africa is to invest in these technologies. Scholars have highlighted concerns related to economic inequalities, which could significantly impact the adoption and usage of AI technologies (Kopalle et al. 2022, 523). These advanced technologies might marginalise poor and disadvantaged people without access to rapid technological advancements and innovations (Gwagwa et al. 2020, 1). Wealthy countries like Singapore, which are leaders in AI, have high gross domestic products (GDPs) compared to African countries with low GDPs, which influences their adoption of and investment in these technologies (Kopalle et al. 2022, 523). Thus, economic inequality may act as either a global unifier or divider concerning the adoption and usage of AI technologies. Therefore, the adoption of these technologies is impacted by economic issues, especially in the Global South. Similar issues may manifest as challenges and opportunities in the African education sector.

    AI in Africa

    AI use in Africa is only in its early stages, which means that investments in this technology, as well as in expediting other related infrastructure, are important for successful implementation and usage. UNESCO et al. (2023, 32) initiated a pioneering programme to bolster the capabilities of all AI stakeholders in Africa to promote ethical solutions, cultivate digital skills and competencies, and enhance the ability of African nations to embrace and execute AI. This commitment is a reminder to African governments and academic institutions to prepare for and harness AI technologies. For the Gordon Institute of Business Science at the University of Pretoria (Schoeman et al. 2021, 2), universities and research institutions serve as the breeding grounds for AI ecosystems, providing a fertile environment for scientists to nurture their ideas and potentially launch new ventures. Further research suggests that AI development and implementation efforts should not be duplicated in Africa. Therefore, countries must work together and agree on standard guidelines for AI tool application and use in education sectors (Pillay 2018, 16). Such collaborative efforts will save the continent resources while enabling countries with fewer technological resources to benefit from AI technology. UNESCO et al. (2023, 33) also add that it is crucial for African countries to build resources together by learning from each other and encouraging cross-border collaboration in building context-relevant generative AI technologies.

    Opportunities of generative AI in African classrooms

    Information and communication technologies (ICTs) have significantly impacted contemporary society, altering how people think and communicate. Naturally, this transformation has extended its influence to education (Chen et al. 2020, 1). For example, automated systems are increasingly utilised in academia to assist in various tasks including generating ideas, composing texts, and aiding with technical coding assessments (Eke 2023, 3). Moreover, these systems possess the potential to significantly revolutionise the academic sphere by providing novel tools and capabilities for knowledge creation and dissemination.

    As a result, generative AI use can support students by providing ideas for their research topics and assisting in conducting their research (Eke 2023, 2). AI tools provide a supportive environment during the research journey, especially where supervisors do not have the capacity to scaffold research candidates. While some studies criticise these tools for stifling creative thinking processes (Yilmaz and Yilmaz 2023, 6), others posit that AI tools help unlock creativity and, in the process, build students' critical thinking skills as they solve problems using these technologies (Walczak and Cellary 2023, 94).

    AI can also support teachers by providing language-translation services for learning materials, voice assistants, educational games, categorisation of digital books, and specific-purpose AI applications such as plugins for learning management environments to enhance or scaffold professional abilities (Chounta et al. 2022; Fitria 2021). However, it is important to ensure that teachers' roles in the classroom are not diminished, as they remain accountable for the outcomes of AI use.

    Challenges of generative AI in African classrooms

    Various challenges could affect generative AI use in African classrooms, relating to infrastructure, connectivity, and educator capabilities in harnessing AI.

    Limited infrastructure and connectivity

    Effective generative AI use is limited for educational institutions serving students from remote areas, particularly because ICT infrastructure is a challenge in most African countries (Chiware and Becker 2018, 13). For example, studies reveal that during the restrictive lockdown levels of COVID-19, some educators could only teach students using audio conferencing rather than platforms that enabled content sharing, such as MS Teams and Zoom. A Ghanaian study by Essel et al. (2021, 308) cited the effectiveness of audio teaching in delivering higher education classes during the pandemic. The study mentioned that students based in rural communities could not use video conferencing technologies owing to the poor connectivity in those areas. Even though this mode of teaching proved efficient, it posed a limitation for AI use since full internet access is required for programmes using this advanced technology.

    However, most African countries deal with poor infrastructure, leading to a lack of broadband to access the internet (UNESCO et al. 2023, 33). Areas that experience intermittent connectivity caused by unstable networks also pose a threat to AI use. Electricity is another major challenge that contributes to poor network connectivity, which adds to the challenges of fully experiencing generative AI on demand and anywhere (Eke, Wakunuma, and Akintoye 2023, 182). The potential benefits of AI in education, such as customised and self-paced remote learning, may not be accessible to economically disadvantaged nations owing to the presence of digital and economic divides (Kopalle et al. 2022, 523). Therefore, efforts to close these inequalities should be prioritised to ensure quality education for all (Eke et al. 2023, 20).

    Educator training and capabilities building

    For Chounta et al. (2022), the implications of supporting teachers in effectively using AI as a tool in their practices include the need for professional development opportunities to enhance teachers' skills and AI knowledge, and the need to address privacy and ethical concerns related to AI use in education. There is also a need to balance the potential benefits of AI against the risks of over-reliance on technology and the potential for learning to be negatively impacted by reduced social interaction. However, for Kiemde and Kora (2022, 1), this can be managed by introducing ethics courses in academic training and by building the capacity of AI development actors through research on responsible AI tools in Africa. By diversifying AI teams and integrating African ethical values, education can facilitate the development of trustworthy generative AI in Africa and democratise these technologies (Kiemde and Kora 2022, 3).

    A study by Lopezosa et al. (2023, 1-12) delves into the multifaceted challenges associated with the use of artificial intelligence and its ethical implications. These challenges revolve primarily around issues of authorship, content and the delineation of AI's boundaries. Lopezosa et al. (2023, 1-12) emphasise that it is imperative for students to cultivate a comprehensive skill set that encompasses both technical proficiencies and a strong ethical foundation, enabling them to proficiently harness generative AI tools. However, Lopezosa et al. (2023) also highlight a core challenge, which involves the effective management of vast datasets, the identification of specific elements and variables within these datasets and humanising this data to construct compelling narratives. Additionally, the paper highlights the ongoing struggle of adapting to the constant emergence of new generative AI tools, posing an additional layer of complexity to the AI landscape.

    Sandu et al. (2021, 110) address the diverse array of challenges encountered by students utilising online learning platforms. These challenges encompass technical issues such as errors within the online system and the necessity for improved network capacity. There are also communication hurdles including difficulties experienced during online sessions, and a notable shortage of adequate learning resources.

    Strategies to harness the potential of generative AI in African classrooms

    Walczak and Cellary's (2023, 95) study highlights key strategies for universities to maximise the potential of AI in higher education including expanding conventional educational practices to foster intellectual abilities and communication skills while effectively integrating AI tools into educational processes. Additionally, universities must motivate students to develop skills aligned with dynamic job markets and equip them to engage in self-directed learning. A holistic education approach that cultivates creative thinking, empathy, cultural competence, and personal responsibility is essential for preparing future employees to thrive in a complex global society.

    Sanusi et al. (2022, 10) found there was a need to focus on five key AI education competencies in African secondary schools, namely, cultural competence, human-tool collaboration competence, self-learning competence, skill competence, and ethics competence. The study found that human-tool collaboration competence was the greatest, followed closely by ethics. These findings can be useful for AI education practices in other regions, as they highlight the importance of considering cultural and ethical factors while designing AI curricula. The study suggests that AI education should focus on technical skills and developing competencies related to human-machine interaction and ethical considerations. Additionally, the need for AI ethical curricula that are sensitive to the region's cultural and social norms was emphasised (Sanusi et al. 2022, 7).

    For Eke et al. (2023, 109), several challenges pose significant obstacles to integrating AI in African educational institutions. These hurdles include technical skill limitations, inadequate infrastructure and connectivity, user attitudes and uncertainty, ethical concerns, insufficient supportive policies, and insufficient structured data. Addressing these challenges is critical for realising AI's potential in education and improving educational outcomes. Building technical skills and establishing an enabling environment for AI adoption are essential, and proactive measures by governments and other stakeholders can help overcome these challenges and make AI more applicable in African classrooms.

    Policies and ethical considerations

    Hodgson et al. (2021, 1883) highlight that the ethical implications, risks, and challenges of AI use in education have often not been examined. For example, concerns are raised regarding the use of machine learning for predictive risk modelling in areas such as child protection. In addition, Kiemde and Kora (2022, 4) argue that African ethical values can play an important role in the development of responsible AI in Africa. They emphasise that AI's ethical framework should not rely solely on a Western vision, which may ignore the cultural values of marginalised communities, especially black people of African descent (Kiemde and Kora 2022, 4). Instead, African values that emphasise rationality and love for others can be incorporated into the creation of AI technology. Therefore, by aligning AI frameworks with African values, the education sector can facilitate the development of responsible AI in Africa.

     

    RESEARCH METHODS

    The study used an explorative qualitative research design. A three-phase literature review was used to identify articles for review and analysis systematically. The initial research phase comprised two distinct steps. In step one, a randomised search in two publication databases was conducted (Charrois 2015, 145), and the Scopus database yielded 16 relevant articles. For step two, a similar randomised search was performed using the Science Direct database, which produced 46 additional articles. Consequently, a total of 62 articles were identified from these two steps.

    The inclusion and exclusion criteria in Table 1 were applied to filter these articles, eliminating studies not aligned with the research's objectives.

    In phase two, 62 selected articles were scrutinised to identify those that strictly adhered to the study's inclusion criteria and primary goal. This examination led to the identification of nine articles that met the requirements.

    For phase three, a snowball sampling method was employed, utilising forward snowball sampling. This involved searching for the term "Africa" within the text and reference lists of the nine primary articles, with the condition that "Africa" had to be connected to the concept of education. This process yielded a total of 31 secondary articles; 27 were discovered within the primary articles' content, while four were found in their reference lists. Among these 31 articles, two were duplicates, resulting in a total of 29 secondary articles. However, upon data analysis, only two articles were identified as addressing the overall aim of the prospects and challenges of leveraging generative AI technology to enhance research instruction in African teaching and learning contexts. Throughout the entire process, from phase one to phase three, the research analysed a cumulative total of 37 articles. Figure 1 offers a flowchart of the processes followed in the three phases.
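    As a purely illustrative reconstruction of the bookkeeping behind these phases, the sketch below reproduces the counts reported above using placeholder article identifiers; the sets and names are hypothetical and do not reflect the authors' actual search records or search strings.

```python
# Phase 1: database searches (counts as reported in the text)
scopus = {f"scopus-{i}" for i in range(16)}          # 16 Scopus articles
science_direct = {f"sd-{i}" for i in range(46)}      # 46 Science Direct articles
phase_one = scopus | science_direct
assert len(phase_one) == 62

# Phase 2: apply the Table 1 inclusion/exclusion criteria (nine articles remained)
primary = set(sorted(phase_one)[:9])                  # placeholder selection

# Phase 3: forward snowballing on "Africa" linked to education
in_text = {f"snowball-{i}" for i in range(27)}        # 27 found in article bodies
in_refs = {f"snowball-{i}" for i in range(25, 29)}    # 4 found in reference lists
secondary = in_text | in_refs                         # the set union drops the 2 duplicates
assert len(secondary) == 29

articles_analysed_in_depth = 2                        # articles addressing the overall aim
print(len(phase_one), len(primary), len(secondary), articles_analysed_in_depth)
```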

    Justification of the research investigation period and sample size

    The period between 2021 and 2023 was selected because, over the past decade, AI has demonstrated its transformative potential in education. However, it is crucial to acknowledge the unprecedented disruptions caused by the COVID-19 pandemic, which compelled educational institutions worldwide, including those in Africa, to swiftly adopt online teaching modalities (Sandu et al. 2021, 113). In 2021, online learning came to be used more constructively and AI began to be explored for learning and teaching. This shift not only underscored the importance of digital tools and platforms in education but also highlighted the need for innovative solutions to ensure continuity in learning processes. In response to this crisis, educators, policymakers, and researchers began exploring AI-driven technologies to mitigate the challenges posed by the pandemic and improve the quality of online education.

    The study used a systematic literature review (SLR) in which an in-depth inquiry was conducted on two primary articles. There is no prescribed minimum number of studies for inclusion in a systematic literature review. The quantity of studies incorporated into a systematic literature review is primarily contingent upon the chosen research topic and the supportive evidence accessible. A systematic literature review can be considered valid even if it encompasses no studies, as this may signify a dearth of research within the field and reveal knowledge gaps (DistillerSR 2023, 1). The decision to analyse two articles was based on the lack of available literature on the topic, and on the fact that SLRs can be conducted using a minimum of two articles to ensure comparison (Valentine, Pigott, and Rothstein 2010, 245).

     

    RESULTS AND DISCUSSIONS

    This study investigated the opportunities and challenges of generative AI supporting research, learning, and teaching in African classrooms. However, it is important to report that sourcing scientific articles on these challenges and benefits was difficult, as search engines could not find sufficient literature on this topic. This finding aligns with UNESCO et al.'s (2023, 6) observation that there is a lack of information on how AI is being used in African curricula. However, it should be acknowledged that while UNESCO et al. (2023, 6) could not find the data for their mapping study, our findings identified at least three countries that had integrated AI in their curricula (Shipepe et al. 2021, 3), though not all in higher education. This scarcity of AI-related materials in African contexts raises concerns about the continent's active involvement in AI use in education contexts. It is also a research gap that researchers and policymakers need to address, as the research's findings showed. Table 2 provides a summary of the research findings from the data analysis.

    From Table 2, the analysis and discussion of the results are presented focusing on AI use, AI challenges, AI ethics, and future AI considerations. The research by Sanusi et al. (2022, 3) revealed that learners could comprehend AI and machine learning, based on the material covered during the research study. The study's participants revealed a solid grasp of AI applications and a keen awareness of how they interacted with AI daily (Sanusi et al. 2022, 3).

    Academic support

    Studies found that the emergence of generative AI held prospects for students and academics. Sanusi et al.'s (2022, 7) findings pointed out that students could benefit through personalised learning and enriched learning experiences. For example, students in this study recounted experiences with AI such as using speech-based personal assistant services, recognising faces and tagging photos on social media. Moreover, the research referenced other studies that had designed game-based learning environments and cross-disciplinary AI programmes for K-12 students to gain practical experience with AI in addressing real-world challenges. Where learning challenges were experienced, the system could flag these and escalate them to teachers. In addition, through identifying patterns and trends, AI could analyse demographic data such as gender distribution, age groups, grade levels, school settings and mobile device ownership.

    Post-academic

    Students could benefit from AI's presence and growth, as these led to employment opportunities. The innovation involved in developing, updating and running these tools had influenced job creation, and other related sectors linked to AI products and services could also increase job prospects. These job prospects showed that if students were taught AI and understood how it worked, they would gain skills that would help them become employable or support entrepreneurial pursuits.

    Pillay (2018, 7) identified that academic administrative data could be analysed using AI. Moreover, AI could analyse student records and inform lecturers about at-risk students so that a remedial process could be started. Human capital in science, technology, engineering and mathematics was also identified as a challenge that could affect enrolment in AI-related studies, thus threatening Africa's capacity to innovate. Other challenges noted included a lack of infrastructure, skills, and the quality data that AI systems use to process queries.

    Use of generative AI in African classrooms

    The results indicated some level of adoption of AI in African secondary schools, which was good progress towards the utilisation and development of customised generative AI for African communities. Studies by Sanusi et al. (2022, 1-12) investigated the use of machine learning and teachers' preconceptions of teaching AI, which were instrumental in preparing the African education sector for what to expect as AI courses were rolled out to different levels of education. Nevertheless, it is a concern that the findings could not identify how generative AI was being used in institutions of higher learning, especially to support the learning and teaching of research.

    Challenges of AI in African classrooms

    The main challenge affecting the usage of ICTs in Africa relates to infrastructure. Infrastructure challenges have been an issue for decades (Nsolly and Ngo 2016, 95; Ponelis and Holmner 2015, 164) and further disadvantage those studying in African institutions by denying them the opportunity to use available learning technologies. In addition, generative AI tools are not always accurate and might mislead students through generated content that contains incorrect information (Eke 2023, 3; Walczak and Cellary 2023, 81). Systems like ChatGPT compose material without acknowledging original sources, thus perpetuating a culture of plagiarism (Dwivedi et al. 2023, 16). Additionally, tools that provide citations, like Grammarly Go and Jenni AI, tend to provide made-up sources. As a result, using sources without correct acknowledgement and verification will have a detrimental effect on academic integrity.

    The noted challenges, namely that generative AI tools threaten the integrity of academic institutions by enabling students to plagiarise and to produce content that spreads misinformation, show that these tools require user involvement in the process of content generation (Eke 2023, 3; Dwivedi et al. 2023, 26). However, academic integrity can be protected by teaching students how to work with these technologies to promote responsible use, for example by verifying generated citations before relying on them, as sketched below.
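    As one concrete illustration of such responsible use, the short sketch below checks whether a citation suggested by a generative AI tool resolves to a published record via the public Crossref API. The function name and workflow are illustrative choices made for this example, assuming internet access and the third-party requests library; they are not a procedure drawn from the reviewed studies.

```python
import requests  # third-party: pip install requests

def lookup_reference(title: str, rows: int = 3) -> list[dict]:
    """Query the public Crossref API for works matching a cited title and
    return a few candidate records (title, DOI, year) for manual checking."""
    resp = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": title, "rows": rows},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json()["message"]["items"]
    return [
        {
            "title": (item.get("title") or ["<no title>"])[0],
            "doi": item.get("DOI"),
            "year": item.get("issued", {}).get("date-parts", [[None]])[0][0],
        }
        for item in items
    ]

# Example: before citing a source produced by a chatbot, look it up.
for candidate in lookup_reference(
    "ChatGPT and the rise of generative AI: Threat to academic integrity?"
):
    print(candidate)  # if no plausible match appears, treat the citation as suspect
```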

    Concerns were raised regarding the integration of AI in education. Critics argue that AI might replace human teachers, potentially diminishing the quality of education (Luckin and Cukurova 2019, 2836). Furthermore, there is apprehension about AI perpetuating biases and discrimination, especially if algorithms lack inclusivity and equity considerations. It is crucial to thoughtfully weigh the potential drawbacks of employing AI in education and to guarantee its ethical and responsible implementation.

    The issue of high error levels shows that there is a need to educate and encourage users to actively engage with these tools collaboratively (Sanusi et al. 2022, 4) to produce quality work, which further strengthens the case for HEIs to support generative AI. This weakness can itself be a step towards educating academic users about responsible tool use: as generated information is not perfect, users need to work with the tool rather than let the tool work for them.

     

    DISCUSSIONS BASED ON THE RESEARCH FRAMEWORKS

    The AI challenges noted have a potential influence on the acceptance and adoption of generative AI technologies. According to TAM, a technology can be accepted by users if it is easy to use and useful (Davis 1989, 320). However, a technology first needs to be accessible for users to experiment with it and come to a decision on its use. The capability and TAM framework in Figure 2 illustrates the important components that lead to generative AI acceptance in higher education. Knowledge of a technology is also important in influencing decisions to test it. Therefore, for students to use generative AI, they need to have knowledge of the tools and access to test them. As evidenced in Sanusi et al.'s (2022, 7) case, the learners' knowledge of AI and machine learning increased after they were taught about these technologies. Also, students' capabilities in using generative AI were dependent on educators' attitudes to the tool (Sanusi et al. 2022, 4), knowledge of tool use, and knowledge about the tool. Therefore, when students understand tool benefits and drawbacks, it is likely that they will use AI tools responsibly (Hooda et al. 2022, 3).

     

    FUTURE DIRECTIONS

    Given the infancy of generative AI in institutions of higher learning in Africa, there are limited publications on how these tools are being used in academic institutions. The acceptance of these technologies by universities is also key to harnessing the power of generative AI to support research, teaching, and learning. As indicated in Figure 2, adoption and user-capability gaps can only be filled when there are standard guidelines that provide users with the know-how to use generative AI tools responsibly. The noted challenge of algorithms that do not respond with correct feedback (Walczak and Cellary 2023, 81) presents an opportunity for African HEIs to work with generative AI institutions and developers to improve the language models so that they fit the academic needs of Global South contexts. As discussed in the section above, "Challenges of generative AI in African classrooms", instead of worrying about tracking AI-generated texts (Eke 2023, 3), academic institutions should be developing AI tools that meet the diverse educational contexts within Africa. Furthermore, Africa must be inspired to develop and customise generative AI tools and to improve technological infrastructure.

     

    CONCLUSIONS AND RECOMMENDATIONS

    This study's results demonstrated the inequality and technology gap that exist between Africa and the Global North. The gap was illustrated by the limited use of these technologies in Africa's institutions of higher learning and by the economic inequalities that leave the continent lagging behind the Global North. These challenges of access and usage stem from infrastructural unreadiness, low technology acceptance and a lack of much-needed investment. The study's results also demonstrated the strong position that HEIs hold in turning the challenges of generative AI into innovations, which should lead the sector to accept and embrace the technology so that students and researchers in African institutions can use generative AI responsibly under the universities' terms and regulations. These terms include upholding academic integrity by using generative AI responsibly and ethically, and ensuring equitable access to it. It is therefore recommended that policymakers and educational institutions prioritise developing a supportive environment for educators and students to explore these AI tools. This support can be achieved by having all stakeholders (government, HEIs, educators, and students) collaborate on developing AI guidelines and by prioritising education and training to prepare everyone. Moreover, the government, HEIs, and the private sector should work together to address infrastructural limitations to enable fair access.

     

    DECLARATION OF INTEREST

    As the research does not involve human participants directly, no ethical approval from a research review committee was required. The authors declare that there are no known competing interests or personal relationships that could influence the paper.

     

    ACKNOWLEDGEMENTS

    Grammarly GO for language support and 8Bradley are acknowledged for this manuscript's professional editing.

     

    REFERENCES

    Annamalai, Nagaletchimee, Radzuwan Ab Rashid, Umair Munir Hashmi, Misrah Mohamed, Marwan Harb Alqaryouti, and Ala Eddin Sadeq. 2023. "Using chatbots for English language learning in higher education." Computers and Education: Artificial Intelligence 5: 100153. https://doi.org/10.1016/j.caeai.2023.100153.

    Antonenko, Pavlo, and Brian Abramowitz. 2023. "In-service teachers' (mis)conceptions of artificial intelligence in K-12 science education." Journal of Research on Technology in Education 55 (1): 64-78. https://doi.org/10.1080/15391523.2022.2119450.

    Charrois, T. L. 2015. "Systematic reviews: What do you need to know to get started?" Can J Hosp Pharm 68 (2): 144-148. https://doi.org/10.4212/cjhp.v68i2.1440.

    Chen, Xieling, Haoran Xie, Di Zou, and Gwo-Jen Hwang. 2020. "Application and theory gaps during the rise of Artificial Intelligence in Education." Computers and Education: Artificial Intelligence 1: 100002. https://doi.org/10.1016/j.caeai.2020.100002.

    Chiware, Elisha R. T., and Deborah Anne Becker. 2018. "Research data management services in southern Africa: A readiness survey of academic and research libraries." African Journal of Library, Archives and Information Science 28 (1): 1-16.

    Chounta, Irene-Angelica, Emanuele Bardone, Aet Raudsep, and Margus Pedaste. 2022. "Exploring Teachers' Perceptions of Artificial Intelligence as a Tool to Support their Practice in Estonian K-12 Education." International Journal of Artificial Intelligence in Education 32 (3): 725-755. https://doi.org/10.1007/s40593-021-00243-5.

    Cisse, Moustapha. 2018. "Look to Africa to advance artificial intelligence." Nature 562 (7728): 461-462.

    Davis, Fred D. 1989. "Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology." MIS Quarterly 13 (3): 319-340. https://doi.org/10.2307/249008.

    DistillerSR. 2023. How many studies should be included in a systematic review. About Systematic Reviews.

    Dwivedi, Yogesh K., Nir Kshetri, Laurie Hughes, Emma Louise Slade, Anand Jeyaraj, Arpan Kumar Kar, ... and Ryan Wright. 2023. "Opinion Paper: "So what if ChatGPT wrote it?" Multidisciplinary perspectives on opportunities, challenges and implications of generative conversational AI for research, practice and policy." International Journal of Information Management 71: 102642. https://doi.org/10.1016/j.ijinfomgt.2023.102642.

    Eke, Damian Okaibedi. 2023. "ChatGPT and the rise of generative AI: Threat to academic integrity?" Journal of Responsible Technology 13: 100060. https://doi.org/10.1016/j.jrt.2023.100060.

    Eke, Damian Okaibedi, Kutoma Wakunuma, and Simisola Akintoye. 2023. Responsible AI in Africa: Challenges and Opportunities. https://library.oapen.org/handle/20.500.12657/60787.

    Essel, Harry Barton, Dimitrios Vlachopoulos, Dickson Adom, and Akosua Tachie-Menson. 2021. "Transforming higher education in Ghana in times of disruption: Flexible learning in rural communities with high latency internet connectivity." Journal of Enterprising Communities: People and Places in the Global Economy 15 (2): 296-312. https://doi.org/10.1108/JEC-08-2020-0151.

    Fitria, Tira Nur. 2021. "Artificial intelligence (AI) in education: Using AI tools for teaching and learning process." Prosiding Seminar Nasional & Call for Paper STIE AAS 4 (1): 134-147. https://prosiding.stie-aas.ac.id/index.php/prosenas/article/view/106.

    Gigler, Bjorn-Soren. 2004. "Including the Excluded - Can ICTs empower poor communities? Towards an alternative evaluation framework based on the capability approach." 4th International Conference on the Capability Approach.

    Goldenthal, Emma, Jennifer Park, Sunny X. Liu, Hannah Mieczkowski, and Jeffrey T. Hancock. 2021. "Not All AI are Equal: Exploring the Accessibility of AI-Mediated Communication Technology." Computers in Human Behavior 125: 106975. https://doi.org/10.1016/j.chb.2021.106975.

    Granic, Andrina, and Nikola Marangunic. 2019. "Technology acceptance model in educational context: A systematic literature review." British Journal of Educational Technology 50 (5): 2572-2593.

    Gwagwa, Arthur, Erika Kraemer-Mbula, Nagla Rizk, Isaac Rutenberg, and Jeremy De Beer. 2020. "Artificial Intelligence (AI) Deployments in Africa: Benefits, Challenges and Policy Dimensions." The African Journal of Information and Communication (AJIC) (26). https://doi.org/10.23962/10539/30361.

    Hodgson, David, Sophie Goldingay, Jennifer Boddy, Sharlene Nipperess, and Lynelle Watts. 2021. "Problematising Artificial Intelligence in Social Work Education: Challenges, Issues and Possibilities." The British Journal of Social Work 52 (4): 1878-1895. https://doi.org/10.1093/bjsw/bcab168.

    Hooda, Monika, Chhavi Rana, Omdev Dahiya, Ali Rizwan, and Md Shamim Hossain. 2022. "Artificial Intelligence for Assessment and Feedback to Enhance Student Success in Higher Education." Mathematical Problems in Engineering 2022: 5215722. https://doi.org/10.1155/2022/5215722.

    Kiemde, Sountongnoma Martial Anicet, and Ahmed Dooguy Kora. 2022. "Towards an ethics of AI in Africa: Rule of education." AI and Ethics 2 (1): 35-40. https://doi.org/10.1007/s43681-021-00106-8.

    Kopalle, Praveen K., Manish Gangwar, Andreas Kaplan, Divya Ramachandran, Werner Reinartz, and Aric Rindfleisch. 2022. "Examining artificial intelligence (AI) technologies in marketing via a global lens: Current trends and future research opportunities." International Journal of Research in Marketing 39 (2): 522-540. https://doi.org/10.1016/j.ijresmar.2021.11.002.

    Lopezosa, Carlos, Lluís Codina, Carles Pont-Sorribes, and Mari Vállez. 2023. "Use of generative artificial intelligence in the training of journalists: Challenges, uses and training proposal." Profesional de la información 32 (4).

    Luckin, Rosemary, and Mutlu Cukurova. 2019. "Designing educational technologies in the age of AI: A learning sciences-driven approach." British Journal of Educational Technology 50 (6): 2824-2838. https://doi.org/10.1111/bjet.12861.

    Lv, Zhihan. 2023. "Generative artificial intelligence in the metaverse era." Cognitive Robotics 3: 208-217. https://doi.org/10.1016/j.cogr.2023.06.001.

    Murillo, Gabriel García, Pavel Novoa-Hernández, and Rocío Serrano Rodríguez. 2021. "Technology Acceptance Model and Moodle: A systematic mapping study." Information Development 37 (4): 617-632. https://doi.org/10.1177/0266666920959367.

    Nsolly, Bert Ngajie, and Marie Charlotte Ngo. 2016. "Integration of ICTs into the curriculum of Cameroon primary and secondary schools: A review of current status, barriers and proposed strategies for effective integration." International Journal of Education and Development using ICT 12 (1).

    Omorogiuwa, O., K. Ohiagu, and K. H. Lawal. 2023. "Towards the Review of Artificial Intelligence Programme Curriculum and Effective Collaborations Among Academia for AI Programme Development in Africa." Journal of Digital Innovations & Contemporary Research in Science, Engineering & Technology 11 (1): 1-14. https://doi.org/10.22624/AIMS/DIGITAL/V11N1P1x.

    Pillay, N. 2018. Artificial intelligence for Africa: An opportunity for growth, development, and democratization. University of Pretoria (Pretoria). https://accesspartnership.com/artificial-intelligence-for-africa-an-opportunity-for-growth-development-and-democratisation/.

    Ponelis, Shana R., and Marlene A. Holmner. 2015. "ICT in Africa: Building a Better Life for All." Information Technology for Development 21 (2): 163-177. https://doi.org/10.1080/02681102.2015.1010307.

    Qadir, J. 2023. "Engineering Education in the Era of ChatGPT: Promise and Pitfalls of Generative AI for Education." 2023 IEEE Global Engineering Education Conference (EDUCON), 1-4 May 2023.

    Sandu, Raj, Shakir Karim, and Mahesh Kayastha. 2021. "E-learning challenges using Zoom and application of artificial intelligence to improve learning in Australia higher education institutes." Proceedings of the Part of the Multi Conference on Computer Science and Information Systems, Virtual.

    Sanusi, Ismaila Temitayo, Sunday Adewale Olaleye, Solomon Sunday Oyelere, and Raymond A. Dixon. 2022. "Investigating learners' competencies for artificial intelligence education in an African K-12 setting." Computers and Education Open 3: 1-12. https://doi.org/10.1016/j.caeo.2022.100083.

    Sanusi, Ismaila Temitayo, Solomon Sunday Oyelere, and Joseph Olamide Omidiora. 2022. "Exploring teachers' preconceptions of teaching machine learning in high school: A preliminary insight from Africa." Computers and Education Open 3: 100072. https://doi.org/10.1016/j.caeo.2021.100072.

    Schoeman, Willie, Rory Moore, Yusof Seedat, and Jeff Yu-Jen Chen. 2021. Artificial intelligence: Is South Africa ready? Gordon Institute of Business Science, University of Pretoria (Pretoria). https://repository.up.ac.za/handle/2263/82719.

    Sen, Amartya. 1990. "Development as capability expansion." The community development reader 41: 58.

    Shipepe, A., L. Uwu-Khaeb, E. A. Kolog, M. Apiola, K. Mufeti, and E. Sutinen. 2021. "Towards the Fourth Industrial Revolution in Namibia: An Undergraduate AI Course Africanized." 2021 IEEE Frontiers in Education Conference (FIE), 13-16 Oct. 2021.

    Singh, M. 2023. "Maintaining the Integrity of the South African University: The Impact of ChatGPT on Plagiarism and Scholarly Writing." South African Journal of Higher Education 37 (5): 203-220. https://doi.org/10.20853/37-5-5941.

    UNESCO, Miao Fengchun, and Holmes Wayne. 2023. Guidance for generative AI in education and research. France: United Nations Educational, Scientific and Cultural Organisation.

    Valentine, Jeffrey C., Therese D. Pigott, and Hannah R. Rothstein. 2010. "How Many Studies Do You Need? A Primer on Statistical Power for Meta-Analysis." Journal of Educational and Behavioral Statistics 35 (2): 215-247. https://doi.org/10.3102/1076998609346961.

    Venkatesh, Viswanath, and Fred D. Davis. 1996. "A model of the antecedents of perceived ease of use: Development and test." Decision Sciences 27 (3): 451-481. https://doi.org/10.1111/j.1540-5915.1996.tb00860.x.

    Walczak, Krzysztof, and Wojciech Cellary. 2023. "Challenges for higher education in the era of widespread access to Generative AI." Economics and Business Review 9 (2): 71-100. https://doi.org/10.18559/ebr.2023.2.743.

    Wang, Ting, Brady D. Lund, Agostino Marengo, Alessandro Pagano, Nishith Reddy Mannuru, Zoë A. Teel, and Jenny Pange. 2023. "Exploring the Potential Impact of Artificial Intelligence (AI) on International Students in Higher Education: Generative AI, Chatbots, Analytics, and International Student Success." Applied Sciences 13 (11): 6716. https://www.mdpi.com/2076-3417/13/11/6716.

    Yang, Tzu-Chi. 2023. "Application of Artificial Intelligence Techniques in Analysis and Assessment of Digital Competence in University Courses." Educational Technology & Society 26 (1): 232-243. https://www.jstor.org/stable/48707979.

    Yilmaz, Ramazan, and Fatma Gizem Karaoglan Yilmaz. 2023. "Augmented intelligence in programming learning: Examining student views on the use of ChatGPT for programming learning." Computers in Human Behavior: Artificial Humans 1 (2): 100005. https://doi.org/10.1016/j.chbah.2023.100005.