South African Journal of Higher Education

On-line ISSN 1753-5913

S. Afr. J. High. Educ. vol.38 no.2 Stellenbosch Apr. 2024

http://dx.doi.org/10.20853/38-2-6017 

GENERAL ARTICLES

 

Developing a customised learning design tool in support of curriculum design, professional development and institutional management within a South African Military education context

 

 

K. van der Merwe

Department of Educational Technology, University of Stellenbosch, Stellenbosch, South Africa

 

 


ABSTRACT

Learning design at tertiary level is a challenging and complex task with many aspects to take into consideration (Bennett, Lockyer, and Agostinho 2018; Bower and Vlachopoulos 2018, 975; Bates 2019). Learning design is identified as a core aspect of a tertiary educator's role, but educators are often given little guidance on this topic. Changes in the higher education landscape also raise questions about ideal learning design. With changing times that require institutions to introduce other models such as blended and online learning, more pedagogical guidance might be necessary to advise lecturers on best practices in terms of module design (Kebritchi, Lipschuetz, and Santiague 2017). To aid learning design thinking and to make the pedagogic structure of a design apparent, Laurillard and Ljubojevic (2011) and others have suggested the use of a learning design tool or aid. A learning design tool provides analytical support for lecturers to evaluate their own practices (Bower et al. 2011). This study looks at the creation of a customised learning design tool (CLDT) and discusses whether it can serve as an Electronic Performance Support System, that is, a tool that can guide and assist users in their roles in the workplace. This case study canvassed the experiences and opinions of faculty members in Military Education on the use of this CLDT. The tool is designed to capture and depict various features of a module's design. The study made use of design-based research, a methodology that requires a phased approach and aims to influence practice. Inputs from the participants revealed the perceived value and benefits of the tool, along with the amendments needed.

Keywords: learning design, pedagogy, design-based research


 

 

BACKGROUND AND STATEMENT OF THE PROBLEM

Designing modules at tertiary level has become a complex task, since there are many aspects to take into consideration (Bates 2019). Module design includes the setting of outcomes, activities and assessments, whilst ensuring alignment and the optimal distribution of notional hours¹ across each of these. Changes in the higher education landscape also raise questions around ideal module design. There is a drive to adopt a diverse approach and incorporate innovative pedagogies. With changing times that require institutions to introduce other models such as blended and online learning, more pedagogical oversight and guidance might be necessary to advise lecturers on best practices in terms of module design (Kebritchi et al. 2017). Many lecturers inherit the designs of their modules from their predecessors and/or are guided largely by intuition or past experience when it comes to design. Documenting this design in detail is also often not common practice. Without a means to quantify various aspects of their modules, few lecturers have an in-depth, granular view of how their modules are structured. The learning design (LD) then remains implicit, without a means to analyse it (Laurillard and Ljubojevic 2011).

Goodyear (2020) describes how normative models of design for learning have become inadequate for hybrid learning. He states that "... people creating new spaces for hybrid learning are often doing so in ways that go beyond the capacities of existing design models" (Goodyear 2020, 1045). The introduction of different learning modalities (i.e., blended, hybrid and fully online) as mainstream has led those in education to reflect more deeply on the notion of "best practice" for LD. A key question, however, is what determines "best practice" in learning design for a given institutional culture or context, while still allowing for subject discipline nuances and educator autonomy (Kebritchi et al. 2017). Before establishing "best practice", though, one needs to establish the current practice. Much research, especially in Instructional Design (ID), has focused on how design should be done; less research has focused on describing or analysing how design actually takes place (Ertmer, Parisio, and Wardak 2013, as cited by Muñoz-Cristóbal, Hernández-Leo, Carvalho et al. 2018).

Given the commitment to student development and pass rates, and the proportion of time spent on teaching, it is worth institutions building capacity around learning design and becoming more design savvy in a way that is sustainable and pervasive (Bennett et al. 2018, 3). To aid learning design thinking and to make the pedagogic structure of a design apparent, Laurillard and Ljubojevic (2011) suggest the use of a learning design tool, which can provide support for lecturers to evaluate their own practices; they specifically recommend the Learning Designer (University College London). Examples of features which can be measured using this tool include the level at which outcomes are pitched according to Bloom's Taxonomy, how much time is spent on which types of activities, whether tasks are online or not, whether they are synchronous or asynchronous, and ways of learning.
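To make the kind of record such a tool keeps more concrete, the sketch below models a module as a set of tasks, each tagged with the features listed above. It is a minimal illustration with hypothetical field names; it is not the schema of the Learning Designer or of the CLDT.

```python
from dataclasses import dataclass, field

# Minimal sketch of a design record; field names are hypothetical,
# not the Learning Designer's or the CLDT's actual schema.
@dataclass
class LearningTask:
    description: str
    bloom_level: str        # e.g. "remember", "apply", "create"
    activity_type: str      # e.g. "lecture", "discussion", "self-study"
    minutes: int            # time on task, contributing to notional hours
    online: bool = False
    synchronous: bool = True

@dataclass
class ModuleDesign:
    name: str
    credits: int
    tasks: list = field(default_factory=list)

    def notional_hours_target(self) -> int:
        # One credit equates to 10 notional hours (see note 1).
        return self.credits * 10

    def hours_by_activity(self) -> dict:
        totals = {}
        for task in self.tasks:
            totals[task.activity_type] = totals.get(task.activity_type, 0) + task.minutes / 60
        return totals

design = ModuleDesign("Example Module", credits=12)
design.tasks.append(LearningTask("Intro lecture", "understand", "lecture", 90))
design.tasks.append(LearningTask("Case discussion", "analyse", "discussion", 60, online=True))
print(design.notional_hours_target())   # 120
print(design.hours_by_activity())       # {'lecture': 1.5, 'discussion': 1.0}
```

Once designs are captured in a structure like this, a question such as "how are notional hours distributed across activity types?" becomes a simple computation rather than guesswork.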

There is interest within the institution under study in conducting a curriculum renewal. There is also an interest in obtaining greater insight into current teaching practices and curriculum design², and essentially in better understanding how notional hours are being used across modules in all disciplines and across modalities (i.e., online and contact classes). Although other learning design tools or aids exist, it was deemed necessary to create a customised tool for the faculty rather than use an existing one, for two reasons. Firstly, it allows for the inclusion of features unique to the environment; secondly, it gives the faculty access to the data in a database of its own, which is useful for looking collectively at faculty-based practices.

In the researcher's experience, even though much institutional support is given to lecturers around pedagogical issues and curriculum planning, what lecturers find most challenging about design is not entirely clear, even to lecturers themselves. If this can be established, better support can be provided. By introducing such a tool, one hopes to introduce a culture in which design becomes a key and explicit focus area.

This study aimed to obtain perceptions of the value of the customised learning design tool within the faculty. Value was assessed from three perspectives, namely a professional development perspective, a curriculum development perspective and a management perspective. Feedback will be used to develop the tool further so that it can serve as an Electronic Performance Support System (EPSS) that best serves its users and context.

 

RESEARCH OBJECTIVES

The objectives of the study are grouped into three categories, based on the facets of an EPSS as posited by Raybould (1990) and McKenney and Van den Akker (2005). They are as follows:

a) Basic EPSS design and pedagogical considerations

Identify the pedagogical considerations and design features for the CLDT.

b) Value for Professional Development

Establish the perceived value of the CLDT for individual lecturers.

Tailor the CLDT for the context.

c) Value for Management

Establish the perceived value of the CLDT for management and the faculty.

 

RESEARCH QUESTIONS

The overarching research question is as follows:

What is the perceived value of the CLDT, for the faculty in terms of a) professional development, b) curriculum development/renewal, and c) faculty management?

 

LITERATURE

Electronic Performance Support System

According to the literature, a learning design tool could be considered an Electronic Performance Support System (EPSS). EPSSs are documented in both business and educational literature (see, for example, Brown 1996). The literature highlights various benefits of an EPSS, ranging from the analysis of data for managerial decisions to professional development, as well as implementing theory to obtain educational outcomes. The term EPSS was coined by Gery (1991). What distinguishes EPSSs from other similar aids, and what is considered a measure of an effective EPSS according to Gery (1991, 24, as cited by McKenney and Van den Akker 2005, 42), is the extent to which it integrates information, tools, and methodology for the user. This means that careful attention must be paid to its design to achieve this integration.

Raybould (1990, as cited by McKenney and Van den Akker 2005, 42) suggests three components for an EPSS: it should serve as (a) an advisory system and (b) an information base, and it should (c) document learning experiences. McKenney and Van den Akker (2005) posit four main elements that an EPSS should offer: (a) advice, (b) tools, (c) learning opportunities, and (d) communication aids. It is noted, however, that the balance between the elements to be integrated is debated. The literature therefore advocates that every EPSS must serve the needs of its own organisation or context. Bayram's (2004) work explores theoretical approaches to EPSSs that include the themes of collaboration, social context, performance appraisal and interactions in electronic learning.

Sezer (2021) provides a synopsis of recent literature outlining the criteria for the optimal design of an EPSS as well as its benefits. The criteria summarised from various sources include a simple design, suitability for a target audience, a system that is easy to update, provision of guidance for its use and integration with other sources of information (Sezer 2021, 89). The literature also reports positively on the impact or benefits of EPSSs to enhance user learning, knowledge and skills (Sezer 2021).

Nguyen and Hanzel (2007) note three types of EPSSs, which indicate the complexity of an EPSS's design. They describe the minimal EPSS as a system with basic features that are static and not updated regularly. Mid-level EPSSs provide a guided, step-by-step set of instructions, are easy to update, and provide quick access to information and functional assistance. The high-level EPSS provides optimum support, makes use of artificial intelligence and allows for sophisticated user-content interaction. The CLDT used in this study would be classified as a mid-level EPSS.

The value of using a tool to aid learning design

There are a few types of learning design tools or planner tools in existence. Zalavra and Papanikolaou (2019, 108) list some examples of such tools, each with a slightly different focus on the design process: CompendiumLD (Brasher et al. 2008), the Learning Designer (Laurillard and Ljubojevic 2011), Cloudworks (Conole and Culver 2009), the LDTool (Agostinho 2011), CADMOS (Katsamani and Retalis 2013), WebCollage (Villasclaras-Fernández et al. 2013), PeerLAND (Peer Assessment of LeArNing Designs) (Papanikolaou et al. 2016), CuVIS (Banerjee and Murthy 2018), and the ILDE (Integrated Learning Design Environment) (Hernández-Leo et al. 2018). Not all of these tools, however, are classified or portrayed as EPSSs; some are framed simply as support aids. Ugur-Erdogmus and Cagiltay (2019, 471) describe GAIDA (Guided Approach to Instructional Design Advising) (Spector and Whitehead 1994), which supported courseware designers, as one of the earliest examples of an EPSS in education. Regardless of classification, studies on learning design tools report value in their use and/or in perceptions of their use: they facilitate the learning design process, offer an opportunity to reflect on one's design, and allow for the sharing of designs. If adopted, they have the potential to support teachers as designers and, in addition, to aid research on learning design (Laurillard et al. 2018; Zalavra and Papanikolaou 2019). The research of Laurillard et al. (2018), for example, interrogated the value of their Learning Designer; evaluations showed that educators across institutions were willing to use the tool, valued it as a means to reflect on their pedagogy, and valued it as a platform on which to share and obtain designs (Laurillard et al. 2018, 1056; Michos and Hernandez-Leo 2018, 255). A future phase of the CLDT project will look at the faculty's openness to sharing designs, as took place in the study by Laurillard et al. (2018).

Literature on learning design, however, emphasises the role of context (see, for example, Conole et al. 2008; Dagnino et al. 2018; Bennett et al. 2018; Koper 2006). Bonk and Wisher (2000, 67, as cited by Fisher, Perényi, and Birdthistle 2021, 16) state that it is important to recognise "inherent differences when attempting to transfer knowledge of learning innovations in one context to another, including 'learning culture, social interactions, motivational and affective factors'". Cameron (2017), concurring with Smith and Brown (2005), notes that one needs to create a culture receptive to exploration of learning design. Even though learning design is integral to what faculty members do daily, developing a learning design culture takes deliberate initiative. Introducing a tool such as this, framed as an EPSS, could be a step towards developing this culture.

Developing a CLDT affords one access to one's own database. This means that various data can be examined collectively and used to tell narratives, highlight trends and possibly ideal practices, learn from exceptional practices, draw conclusions, and make decisions. Like many industries, tertiary institutions are harnessing the value of digital tools and data to report on trends, modify current behaviour, compare data to other related data and drive future decisions (Agasisti and Bowers 2017, 184). In the past, learning analytics typically received more attention than design analytics, but recently more research has taken place on design, on the link between the two, and on what course design indicates about student performance (see, for example, Mangaroska and Giannakos 2018, 516; Schmitz et al. 2017; Lockyer, Heathcote, and Dawson 2013, 1439).
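As a hedged illustration of what such a faculty-owned database enables, the snippet below pools invented task records from two disciplines and computes how notional hours split across activity types within each discipline. The column names and values are hypothetical, not the CLDT's actual data.

```python
import pandas as pd

# Invented records for illustration: one row per task, pooled across modules.
records = pd.DataFrame([
    {"discipline": "Commerce",   "module": "MOD-A", "activity": "lecture",    "hours": 40},
    {"discipline": "Commerce",   "module": "MOD-A", "activity": "self-study", "hours": 60},
    {"discipline": "Humanities", "module": "MOD-B", "activity": "discussion", "hours": 30},
    {"discipline": "Humanities", "module": "MOD-B", "activity": "lecture",    "hours": 20},
])

# Percentage of notional hours per activity type within each discipline.
share = (records.groupby(["discipline", "activity"])["hours"].sum()
                .groupby(level=0)
                .transform(lambda s: 100 * s / s.sum()))
print(share.round(1))
```

Aggregations of this kind are what turn individual module records into the faculty-level narratives and trends described above.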

 

RESEARCH METHODOLOGY

Design research

Research on systems intended for a target audience often requires a phased and/or iterative approach to establishing their usefulness (see, for example, the approaches used by Laurillard et al. 2018 and Ugur-Erdogmus and Cagiltay 2019). Design-based research (DBR), a methodology requiring a phased approach, was therefore selected for this study. DBR is described by Hoadley (2004, 203-204) as a complement to experimentation. He describes DBR as research which "views outcomes as the culmination of the interaction between designed interventions, human psychology, personal histories or experiences and local contexts" (2004, 204). It is also a research method which requires adjustments to interventions and measurements as the study progresses (Hoadley 2004, 204). DBR is therefore an ideal methodology for a research project such as this one, which requires a phased approach and iterations along the way. Barab and Squire (2004, 8, as cited by Armstrong, Dopp, and Welsh 2020) state that the goal of DBR is to directly influence practice in addition to developing theory that will be helpful to others.

Figure 1, adapted from Armstrong et al. (2020), shows the iterative process of DBR. The research process starts with the identification of the problem, the theory and the context (as do most research projects), but then undergoes cycles of analysis and exploration, design and construction, and evaluation and reflection, leading to redesign, after which a new stage of analysis and exploration begins and the cycle is repeated.

 

 

An extract from the literature which best defines the intention of DBR and resonates well with this project and the role of the researcher in the FMS is the following:

"DBR is conducted by designers focused on (a) understanding contexts, (b) designing effective systems, and (c) making meaningful changes for the subjects of their studies" (Barab and Squire 2004, as cited by Armstrong et al. 2020).

DBR is said to have systemic validity, meaning that studies must inform theories which in turn inform practice; through this, one achieves methodological alignment (Hoadley 2004, 205). Hoadley stresses that measurements, theories, treatments and interpretations must lead to usable knowledge. The data captured by the CLDT can provide such usable knowledge.

DBR is context-sensitive, encouraging a "local science" (Hoadley 2004, 205). It also acknowledges the researcher as a participant-observer who deliberately intervenes in the study, which requires much introspection on the researcher's part. Hoadley further describes the researcher as "both a participant in a particular context and an agent for trying to generalize to other contexts" (2004, 211). Armstrong et al. (2020) likewise acknowledge the researcher as an agent of change in the research project.

Participants

Seven participants took part in this study. Admittedly this is a small sample, but it constituted ten percent of the faculty. Participation in the study was voluntary. All participants were full-time lecturers, with three of them in managerial roles at the time, as Head of either a Division or a Department. Five were from Commerce-related disciplines and two from the Humanities. Only one participant had prior experience of using a learning design tool.

Method and data

The development of the CLDT drew initial inspiration from past experience as well as from the Learning Designer (Laurillard et al. 2013), which is available for public use online. However, several customisations and additional features were necessary for the CLDT under study to meet the needs of the environment and context. A prototype of the CLDT was first developed in an Excel spreadsheet, with the ultimate goal of developing it into an Application (App) linked to a database. Before doing so, it was essential that the prototype, namely the Excel spreadsheet, be thoroughly vetted and tested to fully understand the lecturers' needs and the tool's value. Participants were requested to use the CLDT to enter data on one of their modules. Graphic representation of the data within a dashboard provided a synopsis of various elements of the design for the lecturer.
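The dashboard idea can be illustrated with a small sketch: given the Bloom's Taxonomy levels entered for a module's outcomes, it prints a text version of the kind of distribution chart the Excel prototype displays. The data and output format are invented for illustration, not taken from the prototype.

```python
from collections import Counter

# Invented outcome data; the prototype charts such summaries in Excel.
outcome_levels = ["remember", "understand", "apply", "apply", "analyse", "create"]

counts = Counter(outcome_levels)
total = sum(counts.values())
for level in ["remember", "understand", "apply", "analyse", "evaluate", "create"]:
    pct = 100 * counts.get(level, 0) / total
    # Crude text bar chart: one '#' per 5 percentage points.
    print(f"{level:<10} {'#' * round(pct / 5):<20} {pct:5.1f}%")
```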

The data was collected via a questionnaire and interviews to assess the user experience of the CLDT and its perceived usefulness and value.

Data analysis

The ratings assigned by the participants to the different criteria are represented in graphs. The transcripts from the interviews were analysed using Atlas.ti, a software package for qualitative analysis. A thematic analysis was done to explore the salient themes, and comments were further divided into positive and negative sentiments to analyse the perceived value of the tool.
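A sketch of this analysis step is shown below, assuming the coded comments have been exported from Atlas.ti as (theme, sentiment) pairs; the codes shown are invented examples, not the study's data.

```python
from collections import Counter

# Invented coded comments: (theme, sentiment) pairs after thematic coding.
coded = [
    ("general experience", "positive"),
    ("general experience", "positive"),
    ("general experience", "negative"),
    ("benefits", "positive"),
    ("drawbacks", "negative"),
]

# Tally sentiment per theme to compare positive and negative comments.
for (theme, sentiment), n in sorted(Counter(coded).items()):
    print(f"{theme:<20} {sentiment:<9} {n}")
```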

 

ETHICAL CONSIDERATIONS

Ethical clearance was received from the relevant institutions. Participation was voluntary, and participants were free to withdraw from the study at any time. Participants remained anonymous throughout the study. Participants were given the option of reviewing a summary of the findings should they wish.

Researcher positionality may also be a factor in this case, since the researcher works in an advisory capacity within the same environment as the participants. Researcher positionality has to do with a researcher's own interaction with the individuals and the environment under study. Each researcher has their own ideas, assumptions, frame of reference and beliefs, and should be aware of how these may impact their research (Holmes 2020, 1). Despite the pitfalls of researcher positionality, there are benefits to the researcher being part of the community under study. Prior knowledge of an existing system allows for a more in-depth investigation, and there is an already established relationship with the participants. Participants may feel they can speak more freely and delve into more detailed descriptions given that the researcher already understands the context (Holmes 2020, 6). It was emphasised to participants that this is a descriptive study aimed at understanding the value, usefulness and benefits of the tool, and not an evaluation or critique of their opinions or practices.

 

RESULTS AND DISCUSSION

General experience with using the CLDT

The distribution of the number of comments across the nine interview questions can be seen in Figure 2. Most of the discussions took place on the general experience of using the CLDT, pedagogical approaches and the benefits of using the tool.

Overall, the general experience of the CLDT was positive as more positive sentiments were expressed than negative ones, indicating the potential of the tool.

Positive sentiments expressed pertained to reflection on the outcomes of the module/lessons and what the distribution looked like in terms of Bloom's Taxonomy. One such comment related to outcomes was,

"I can remember we went through this exercise years ago, when we did the programmes where we actually had to list these outcomes and stuff, but since then, never looked at it again" and another "I think it's it's [sic] interesting especially to think about the different tasks within outcomes".

Other positives included acknowledgement of its usefulness for analysing module design. One participant expressed,

"So for me, yes, it's a very useful tool I've learned a lot from it made me go look at a lot of different things again, and it will most probably influence how I will do things in the future." Another highlighted that initially it was challenging to use but once grasped, it was easy. "In the beginning, it was quite challenging I think, but but [sic] for me, once I understood how the system works, it was quite easy."

Others commented on how it assisted with planning:

"It helped me plan the lesson. And also let them also plan the lesson for me by doing the presentation, then instead of me". Another commented that "This forces you to actually go and sit down and take everything into account".

Another comment highlighted an element of flexibility in the tool.

"I liked that idea that you can set it up in the way you want to set it up like either per chapter or per lesson, per module. So that was nice". There was also mention of the overall picture created by the CLDT's summary of graphs, "Finally at the end to see the graphs that are drawn up, and you can kind of get a broad overview of what you're doing in your module".

Drawbacks to using the tool included the time-consuming nature of entering the data into the CLDT, especially initially while getting to grips with the tool.

"Tell people the advantages, like really, why is this good? Why would you want to use it? And in the long term, what's the benefit for you as an individual because it does take time to input all your data" and "It just takes you a little bit of time to just pick what is under the right categories."

Other drawbacks included uncertainty about what information was required where ("I didn't always know where to find all the information"). Unfamiliarity with certain pedagogical terminology was also identified as an impediment to using the tool: "I had to actually go and Google to see what the heck does it mean".

There was one comment cautioning against how the data might be interpreted by management, which would be a concern if the tool and its data were used as part of an individual's performance management review. This is indeed something that would need to be handled carefully and in a suitably guided manner, should the tool be used in that way. However, this is currently neither the intention of the tool nor that of the institution.

 

 

RATING THE IMPORTANCE OF MEASURING FEATURES AND RATING BENEFITS OF THE CLDT

Participants were asked to rate the importance of the CLDT's ability to measure 16 factors, on a scale of 1 to 5, with 5 indicating great importance. The results can be seen in Figure 4.

 

 

The averages show that the five factors perceived as most valuable to measure are time spent on assessments, time spent on activities, assessment types, online learning design and the overall calculation of notional hours for the module. The factors rated as less important to measure were threshold concepts, graduate attributes, critical cross-field outcomes, time spent on synchronous versus asynchronous activities, and ways of learning.
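The ranking behind Figure 4 follows from simple averaging, sketched below with invented ratings for three of the 16 factors (seven raters, scale 1 to 5); the numbers are illustrative, not the study's data.

```python
# Invented ratings (1-5) from seven participants for three of the 16 factors.
ratings = {
    "time spent on assessments": [5, 5, 4, 5, 4, 5, 4],
    "time spent on activities":  [5, 4, 4, 5, 4, 4, 5],
    "threshold concepts":        [3, 2, 3, 2, 3, 3, 2],
}

# Sort factors by average rating, highest first.
for factor, scores in sorted(ratings.items(), key=lambda kv: -sum(kv[1]) / len(kv[1])):
    print(f"{factor:<26} {sum(scores) / len(scores):.2f}")
```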

Participants were also asked to rate the benefits of the CLDT for five entities. The results can be seen in Figure 5.

 

 

The numbers show that the greatest benefits of the CLDT were seen to be for lecturers' professional development and for curriculum development. It was seen as least beneficial for students. In the semi-structured interviews, however, the majority of positive comments pertained to the value the tool held for departments and the faculty as a whole (see Figure 6 below). Here participants referred to data that could be pooled within a discipline to showcase the structure of the modules and gain some insight into typical practices. It was noted that such information would be useful for external evaluations.

 


[Figure 6]

 

PEDAGOGY AND MODULE DESIGN

Predictably, lecturers said that they employed multiple approaches based on the module's subject matter and the year level (first year versus third year or postgraduate). Key terms used to denote approaches were problem solving, interactive, application, continuous learning and critical thinking. The purpose of this question was to establish the sentiments behind their approaches. Once a module's data has been entered in full, these conceptions can be compared with how the pedagogical approaches actually play out in the design.

When asked if they would design one module for both residential and distance students or two separate ones for each group, the majority (71%) said they would create one module for both groups.

 

 

This question is pertinent for understanding current design practices and how best to design the tool to cater for preferences.

 

KEY INSIGHTS GAINED

Key insights that the participants gained from the exercise concerned the articulation and re-evaluation of outcomes, an acknowledgement of a lack of variation in prescribed activity types, a questioning of the constructive alignment between outcomes, activities and assessments, and a questioning of the total number of notional hours covered. It is useful to know what these insights are, as they can be key "selling" points of the tool going forward. Regarding outcomes, one lecturer commented that,

"I was thinking about this quite a lot. At the lesson level there's a lot of content in terms of trying to achieve the the [sic] outcome. Do all of the outcomes really contribute towards that? What do we need to cut here?" Another stated that, "I think that I also learned that the outcomes that I was setting were too general".

A further three commented on the amount of time they spent actively teaching, which made them rethink how they could structure their class time to engage the students differently and shift from a didactic approach to a more interactive one. One lecturer commented that the exercise had revealed opportunities to "repackage" content:

"For instance, these, this section, it took me a week just to explain that concept and I found a video that actually, it's a five-minute video. And that got me using it because it's interesting." They went on to say that "... I realise but for the amount [sic] of hours that I need to spend, it's too much work. Yes, and it's too much. So, then I had to reduce the work that fits in with the hours but not just reducing the work to make it easier for the students, but also reducing the work that they can also be more adaptable and flexible."

Another comment indicated an acknowledgement of the distribution of notional hours:

"I'm astounded by how much time is spent on each of those activities. As a percentage of the whole 20% of the time, people are just giving back info 80% of the time. I mean, that tells you a lot about a module that you tell me, you know, my module is 90% just memorizing and 10% is a bit of creativity. I mean, it tells me immediately what you're busy with." Another commented that "I think my biggest shortcoming is to actually definitely measure things like the notional hours, and to more scientifically measure, for instance, the Bloom's Taxonomy at the various levels".

When asked if they would make use of the CLDT in the future, the majority, five out of the seven (71%), said yes; none said no; and the remaining two (29%) were undecided. These results indicate that the majority of lecturers would likely find value in its use.

 

 

SUGGESTIONS FOR IMPROVEMENTS

Participants were asked if they had suggestions for improvements to the CLDT. There were 15 suggestions in total, with some overlap between them, as seen in Figure 9.

 

 

Several of the suggestions amounted to requests for guidance: on norms, on terminology, on the use of the tool, and on recommended preparation time for students regarding assessments.

The most frequent suggestion for improvement was for guidance on norms, that is, on what is recommended as "standard". One participant commented that

"There's no referencing to what are the standards? What is preferred? None of none of [sic] that is here. So before you then go and select those things, because you are assigning times and it would be good if I think it would be quite helpful. If there was just a place on here we could go to like a quick reference guide."

Although the purpose of the CLDT is to serve as a descriptive tool rather than a prescriptive one, the participants said they would appreciate some guidance as to what is typical or standard for certain features, for example, how much time should be factored in for certain activities at a given year level, or how many outcomes are typically covered in a lesson. There was also a request for a built-in guide that describes a feature or provides a definition for a term when required, as seen in the following comment: "So, I think that there needs to be some sort of guide to say this is what this actually means. In case I've got it wrong."

Another request was to include a student analytics dimension in the tool. If the tool could link student feedback and grades to the corresponding module design, lecturers would find it useful for assessing the extent to which a module's design was appreciated by students and/or resulted in higher grades.

"What will be nice is, I don't know if the system can do it, but to assess results of students, let's say from this year comparing to last year, so yeah, so that would actually also be quite, quite interesting." Also on the topic of analytics, another comment was "Maybe if, if [sic] the student feedback on the module could be linked in some way to this measurement instrument, because I'm setting all these things up, and I'm trying to cover Blooms and everyone else, and putting in all these hours to make sure the notional hours are addressed, etc, etc. But students experience that ... Yeah. And the student is the client at the end of the day as well as the organisation."

One other comment worth noting is a participant's request for clarity on what a "military contextualised environment" is. It is worth exploring how, or to what extent, a tool such as this needs to be contextualised for its educational environment and what that means. These suggestions can help to shape the tool and its accompanying guidelines for future use.

 

CONCLUSION

This study showed that there is perceived value in, and interest in, the trialled CLDT, and that it is viable to pursue its use in the context under study. Sentiments amongst the seven participants were, overall, positive regarding its use for professional development and curriculum development, and it was perceived as especially beneficial for departments and the faculty. This would be the first time that such a digital tool and the resulting data could be used as part of the faculty's curriculum renewal process. The tool can be a means to document current learning designs within the faculty and to raise discussions on different approaches, particularly within the same subject disciplines.

From a methodological point of view, design-based research was a suitable method to employ, given the context. Participants played a key role in helping to shape the tool. In order to fully meet the criteria of an EPSS, tool tips and pedagogical guidance should be built into the system. The mostly affirming feedback provided by the participants laid the groundwork for a follow-up study on the use and further development of the CLDT within this environment. The second phase of the study involves transforming the Excel spreadsheet prototype into an Application (App) linked to a database in order to evaluate it further.

From a theoretical standpoint, a project such as this can draw theory and everyday practice closer together. Studies concerning the motivations behind learning design practices are comprehensive, but evidence of how these motivations manifest in actual designs is scarce and therefore remains elusive. In other words, using such a tool can help to show how learning and teaching theory relates to actual design.

The ultimate measure of the usefulness of the CLDT is whether it can yield insights that guide changes in practice; in other words, whether the data will initiate data-driven decision-making. This would require a longitudinal study, which will be conducted as a follow-up.

 

NOTES

1. Notional hours are the average time it takes an average learner to complete a module (National Department of Education 1997, 16). One credit equates to 10 hours of learning (National Department of Education 1998, 179); a 12-credit module, for example, represents 120 notional hours.

2. Curriculum development/renewal here includes learning design.

 

REFERENCES

Agasisti, Tommaso and Alex J. Bowers. 2017. "Data analytics and decision making in education: towards the educational data scientist as a key actor in schools and higher education institutions." In Handbook of contemporary education economics, 184-210. Edward Elgar Publishing.

Agostinho, Shirley. 2011. "The use of a visual learning design representation to support the design process of teaching in higher education." Australasian Journal of Educational Technology 27(6). https://doi.org/10.14742/ajet.923.

Armstrong, M., C. Dopp, and J. Welsh. 2020. "Design-Based Research." In The Students' Guide to Learning Design and Research. https://edtechbooks.org/studentguide/design-based_research.

Banerjee, Gargi and Sahana Murthy. 2018. "CuVIS: An interactive tool for instructors to create effective customized learning designs with visualizations." Australasian Journal of Educational Technology 34(2). https://doi.org/10.14742/ajet.3773.

Bates, Tony. 2019. "Why university lecturers need a teaching certificate." University World News - The Window on Higher Education. https://www.universityworldnews.com/post.php?story=2019112809050642.

Bayram, Servet. 2004. "Revisioning theoretical framework of electronic performance support systems (EPSS) within the software application examples." Turkish Online Journal of Distance Education 5(2). https://dergipark.org.tr/tr/download/article-file/156532.

Bennett, Sue, Lori Lockyer, and Shirley Agostinho. 2018. "Towards sustainable technology-enhanced innovation in higher education: Advancing learning design by understanding and supporting teacher design practice." British Journal of Educational Technology 49(6): 1014-1026. https://doi.org/10.1111/bjet.12683.

Bower, Matt and Panos Vlachopoulos. 2018. "A critical analysis of technology-enhanced learning design frameworks." British Journal of Educational Technology 49(6): 981-997. https://doi.org/10.1111/bjet.12668.

Bower, Matt, Brock Craft, Diana Laurillard, and Liz Masterman. 2011. "Using the Learning Designer to develop a conceptual framework for linking learning design tools and systems." In Proceedings of the 6th International LAMS & Learning Design Conference, 61-71.

Brasher, Andrew, Gráinne Conole, Simon Cross, Martin Weller, Paul Clark, and Juliette White. 2008. "CompendiumLD - a tool for effective, efficient and creative learning design." http://lams2008.lamsfoundation.org/proceedings.htm.

Brown, Lesley A. 1996. Designing and developing electronic performance support systems. Digital Press.

Cameron, Leanne. 2017. "How learning designs, teaching methods and activities differ by discipline in Australian universities." Journal of Learning Design 10(2017): 69-84. https://researchonline.jcu.edu.au/60903/6/60903.pdf.

Conole, Gráinne and Juliette Culver. 2009. "Cloudworks: Social networking for learning design." Australasian Journal of Educational Technology 25(5). https://doi.org/10.14742/ajet.1120.

Conole, Gráinne, Andrew Brasher, Simon Cross, Martin Weller, Paul Clark, and Juliette Culver. 2008. "Visualising learning design to foster and support good practice and creativity." Educational Media International 45(3): 177-194. https://doi.org/10.1080/09523980802284168.

Conole, Gráinne, Maarten De Laat, Teresa Dillon, and Jonathan Darby. 2008. "'Disruptive technologies', 'pedagogical innovation': What's new? Findings from an in-depth study of students' use and perception of technology." Computers & Education 50(2): 511-524. https://doi.org/10.1016/j.compedu.2007.09.009.

Dagnino, Francesca Maria, Yannis A. Dimitriadis, Francesca Pozzi, Juan I. Asensio-Pérez, and Bartolomé Rubia-Avi. 2018. "Exploring teachers' needs and the existing barriers to the adoption of Learning Design methods and tools: A literature survey." British Journal of Educational Technology 49(6): 998-1013. https://doi.org/10.1111/bjet.12695.

Fisher, Rosemary, Aron Perényi, and Naomi Birdthistle. 2021. "The positive relationship between flipped and blended learning and student engagement, performance and satisfaction." Active Learning in Higher Education 22(2): 97-113.

Gery, Gloria J. 1991. Electronic performance support systems: How and why to remake the workplace through the strategic application of technology. Weingarten Publications, Inc.

Goodyear, Peter. 2020. "Design and co-configuration for hybrid learning: Theorising the practices of learning space design." British Journal of Educational Technology 51(4): 1045-1060. https://doi.org/10.1111/bjet.12925.

Hernández-Leo, Davinia, Juan I. Asensio-Pérez, Michael Derntl, Francesca Pozzi, Jonathan Chacón, Luis P. Prieto, and Donatella Persico. 2018. "An integrated environment for learning design." Frontiers in ICT 5: 9. https://doi.org/10.3389/fict.2018.00009.

Hoadley, Christopher M. 2004. "Methodological alignment in design-based research." Educational Psychologist 39(4): 203-212. https://doi.org/10.1207/s15326985ep3904_2.

Holmes, Andrew. 2020. "Researcher Positionality - A Consideration of Its Influence and Place in Qualitative Research - A New Researcher Guide." Shanlax International Journal of Education 8(4): 1-10. https://doi.org/10.34293/education.v8i4.3232.

Katsamani, Mary and Symeon Retalis. 2013. "Orchestrating learning activities using the CADMOS learning design tool." Research in Learning Technology 21(2013). https://doi.org/10.3402/rlt.v21i0.18051.

Kebritchi, Mansureh, Angie Lipschuetz, and Lilia Santiague. 2017. "Issues and challenges for teaching successful online courses in higher education: A literature review." Journal of Educational Technology Systems 46(1): 4-29. https://doi.org/10.1177/0047239516661713.

Koper, Rob. 2006. "Current research in learning design." Journal of Educational Technology & Society 9(1): 13-22.

Laurillard, Diana and Dejan Ljubojevic. 2011. "Evaluating learning designs through the formal representation of pedagogical patterns." In Investigations of e-learning patterns: Context factors, problems and solutions, 86-105. IGI Global.

Laurillard, Diana, Patricia Charlton, Brock Craft, Dionisios Dimakopoulos, Dejan Ljubojevic, George Magoulas, Elizabeth Masterman, Roser Pujadas, Edgar A. Whitley, and Kim Whittlestone. 2013. "A constructionist learning environment for teachers to model learning designs." Journal of Computer Assisted Learning 29(1): 15-30. https://doi.org/10.1145/3328778.3366927.

Laurillard, Diana, Eileen Kennedy, Patricia Charlton, Joanna Wild, and Dionisis Dimakopoulos. 2018. "Using technology to develop teachers as designers of TEL: Evaluating the learning designer." British Journal of Educational Technology 49(6): 1044-1058. https://doi.org/10.1111/bjet.12697.

Learning Designer. UCL Knowledge Lab, UCL Institute of Education. https://www.ucl.ac.uk/learning-designer/.

Lockyer, Lori, Elizabeth Heathcote, and Shane Dawson. 2013. "Informing pedagogical action: Aligning learning analytics with learning design." American Behavioral Scientist 57(10): 1439-1459. https://doi.org/10.1177/0002764213479367.

Mangaroska, Katerina and Michail Giannakos. 2018. "Learning analytics for learning design: A systematic literature review of analytics-driven design to enhance learning." IEEE Transactions on Learning Technologies 12(4): 516-534.

McKenney, Susan and Jan van den Akker. 2005. "Computer-based support for curriculum designers: A case of developmental research." ETR&D 53: 41-66. https://doi.org/10.1007/BF02504865.

Michos, Konstantinos and Davinia Hernández-Leo. 2018. "Supporting awareness in communities of learning design practice." Computers in Human Behavior 85(2018): 255-270. https://doi.org/10.1016/j.chb.2018.04.008.

Muñoz-Cristóbal, Juan A., Davinia Hernández-Leo, Lucila Carvalho, Roberto Martinez-Maldonado, Kate Thompson, Dewa Wardak, and Peter Goodyear. 2018. "4FAD: A framework for mapping the evolution of artefacts in the learning design process." Australasian Journal of Educational Technology 34(2). https://doi.org/10.14742/ajet.3706.

Nguyen, Frank and Matthew Hanzel. 2007. "LO + EPSS = just-in-time reuse of content to support employee performance." Performance Improvement 46(6): 8-14.

Papanikolaou, Kyparisia A., Evagellia Gouli, Katerina Makrh, Ioannis Sofos, and Maria Tzelepi. 2016. "A peer evaluation tool of learning designs." In Adaptive and Adaptable Learning: 11th European Conference on Technology Enhanced Learning, EC-TEL 2016, Lyon, France, September 13-16, 2016, Proceedings 11, 193-206. Springer International Publishing.

Raybould, Barry. 1990. "Solving human performance problems with computers." Performance & Instruction 29(11): 4-14. https://www.academia.edu/19327941/The_design_of_performance_support_systems_to_contextualise_generic_learning_designs.

Schmitz, Marcel, Evelien Van Limbeek, Wolfgang Greller, Peter Sloep, and Hendrik Drachsler. 2017. "Opportunities and challenges in using learning analytics in learning design." In Data Driven Approaches in Digital Education: 12th European Conference on Technology Enhanced Learning, EC-TEL 2017, Tallinn, Estonia, September 12-15, 2017, Proceedings 12, 209-223. Springer International Publishing.

Sezer, Baris. 2021. "Developing and investigating an electronic performance support system (EPSS) for academic performance." Australasian Journal of Educational Technology 37(6): 88-101. https://doi.org/10.14742/ajet.6121.

Smith, Judith and Alison Brown. 2005. "Building a culture of learning design: Reconsidering the place of online learning in the tertiary curriculum." In ASCILITE 2005: Balance, fidelity, mobility: Maintaining the momentum?, 615-623.

Spector, Michael J. and Larry K. Whitehead. 1994. "A Guided Approach to Instructional Design Advising." Paper presented at the International Conference of the Association for the Development of Computer-Based Instructional Systems (ADCIS) (35th, Nashville, TN, February 15-19).

Ugur-Erdogmus, Feray and Kursat Cagiltay. 2019. "Making novice instructional designers expert: Design and development of an electronic performance support system." Innovations in Education and Teaching International 56(4): 470-480. https://doi.org/10.1080/14703297.2018.1453853.

Villasclaras-Fernández, Eloy, Davinia Hernández-Leo, Juan I. Asensio-Pérez, and Yannis Dimitriadis. 2013. "Web Collage: An implementation of support for assessment design in CSCL macro-scripts." Computers & Education 67(2013): 79-97. https://doi.org/10.1016/j.compedu.2013.03.002.

Zalavra, Eleni and Kyparisia Papanikolaou. 2019. "Exploring the potential of the learning designer as a teacher support tool." Electronic Journal of e-Learning 17(2): 107-117. https://doi.org/10.34190/JEL.17.2.04.
