
Yesterday and Today

On-line version ISSN 2309-9003
Print version ISSN 2223-0386

Y&T n.22, Vanderbijlpark, 2019

http://dx.doi.org/10.17159/2223-0386/2019/n22a5 

ARTICLES

 

Taking the sting out of assessment: The experiences of trainee teachers experimenting with innovative alternative performance assessment in the History classroom

 

 

Pieter WarnichI; Henriëtte LubbeII

INorth-West University (Potchefstroom) pieter.warnich@nwu.ac.za Orcid no. 0000-0003-3967-7767
IIUniversity of South Africa lubbehj@unisa.ac.za Orcid no. 0000-0002-4458-8016

 

 


ABSTRACT

This article explores the experiences of History and Social Sciences (History) trainee teachers (n=33) and their learners during the implementation of five versatile and innovative alternative performance assessment strategies in their diverse classroom settings during their practicum at schools. Originally designed for the corporate staff training environment, and subsequently utilised as community building and data collection techniques in a participative community-engaged research project, these five interactive activities were adapted to act as innovative teaching and alternative formative performance assessment strategies in the History classroom, the latter of which is the main focus of this article. The article is anchored in a social constructivist and dialogic theoretical framework and argues that alternative performance assessment techniques that are non-graded, interactive, formative and dialogic in nature, take place within an atmosphere of emotional safety, and integrate a strong element of enjoyment are able to remove the anxiety that often characterises both summative and graded formative assessment. This, in turn, makes learners more receptive to learning and brings History to life in the classroom. In an attempt to answer two interrelated research questions, "How did trainee History teachers experience the implementation of innovative alternative performance assessment strategies?" and "How did they perceive the response of the learners to a fresh approach to formative assessment?", the article employs a qualitative research methodology which rests on research findings generated from written, visual and oral feedback gathered from the participants during and after a practical workshop which prepared them for the implementation phase of the study.
The research findings suggest, inter alia, that both the trainee teachers and their learners enjoyed a fresh, non-threatening approach to formative assessment and that the learners participated freely and enthusiastically in groups when implementing these formative assessment strategies.
The findings also indicate some challenges, including time management, classroom management, and the facilitation skills needed to manage more advanced learners who, it was found, tended to overpower less confident learners in their groups. The article finally offers recommendations for improvement should History teachers wish to implement these alternative performance assessment strategies in their classrooms.

Keywords: Alternative formative assessment; Assessment strategies; History education; Trainee teachers; History learners.


 

 

Introduction

Assessment is an essential teaching and learning activity that forms an integral part of all History teachers' lives. Assessment involves a continuous and planned process of collecting, analysing, interpreting and recording information about learner performance in order to make quantitative and qualitative judgements about what the learners have learned (Hamidi, 2010). It could either be summative (assessment of learning) or formative (assessment for learning) in nature and should ideally include both forms of assessment working in close harmony with each other (Lau, 2016).

Learners construct knowledge in various ways. It is therefore necessary for the History teacher to apply different approaches to assessment (Stears & Gopal, 2010) and be sensitive to learners' cultural contexts (Lee Hang & Bell, 2015). However, the manner in which History teachers conceive of assessment strategies is subjective and unique. Within a South African context, the choice of assessment strategy is further largely controlled by the directives stated in the CAPS1 document (Department of Basic Education (DBE), 2011). In the end, History teachers' approach to assessment depends on their working knowledge, choice and application of the various kinds of assessment strategies implemented in their classes (Furtak, Kiemer, Circi, Swanson, De León, Morrison & Heredia, 2016; Carless, 2015). Despite the global call for a more holistic assessment approach and for traditional assessments to be complemented by collaborative and participatory learner-centred alternative performance assessment activities (e.g. McCurdy, Reagan, Rogers, & Schram, 2018; Stosich, Snyder & Wilczak, 2018; Haun, 2018; Frunza, 2014; Duncan & Buskirk-Cohen, 2011; McMillan & Hearn, 2008), research within the South African and African (Perry, 2013) context still shows a strong tendency among History teachers to give preference to traditional teacher-centred instruction and summative assessment methods and practices (Bunt, 2013; Moreeng & Du Toit, 2013; Warnich & Meyer, 2013). These attitudes persist despite the National Curriculum's call "for an active and critical approach" (DBE, 2011:4) to teaching, learning and assessment practices, where not only content knowledge is assessed but the focus also falls on the demonstration of critical thinking and creative problem-solving skills - competencies and attitudes which will enable learners to take their place in society as mature citizens (DBE, 2011).

The History teacher should guard against giving preference to certain assessment strategies based on his/her preconceived opinions and experiences, believing them to be intrinsically better than others. The choice of assessment strategies should always be in accordance with the desired learning objectives to be measured. When the focus is on what learners need to know, teachers can collect assessment data during formal and informal assessment opportunities in numerous ways and from multiple sources in order to gauge progress. By exploiting a repertoire of innovative alternative assessment strategies, the teacher will ensure that provision is made for the diverse learning needs and styles of the individual learners (Janisch, Liu & Akrofi, 2007). According to Offerdahl and Tomanek (2011) it is those teachers who are more sophisticated in the way they think about assessment who will be more willing to experiment with new and more flexible assessment strategies that can be used for different purposes.

Applying innovative alternative performance assessment practices further opens the possibility of bringing enjoyment into the classroom, which can help to alleviate learner stress and enhance learner engagement. By its nature, assessment implies judging and being judged, which is why some learners find formal summative (as opposed to non-graded formative) assessment intimidating (Von der Embse & Hasson, 2012). It triggers anxiety, as this type of assessment focuses primarily on tests and examinations whose results are mainly used to measure learner performance against an expected outcome and, in some cases, to weigh learners against one another in terms of personal performance (Bartlett, 2015). Test anxiety is a psychological condition in which learners experience distress during testing or evaluation situations, which can ultimately impair academic performance. It may result from external pressure from parents, schools or peers, or from internal pressure created by the expectations learners set for themselves to perform well (Zhao, Selman, & Haste, 2015; McDonald, 2001). In addition, research suggests that the incidence of fear of assessment among school learners has increased in all age groups over time (Hesketh, Zhen, Lu, Dong, Jun & Xing, 2010; McDonald, 2001). According to Von der Embse and Hasson (2012:181), "test anxiety is considered one of the most disruptive factors in test performance". The research literature further agrees that creating a positive environment in which assessment takes place is of the utmost importance, as it reduces learner anxiety and ultimately ensures that the potential of all learners is realised (Von der Embse & Hasson, 2012).

 

Research aim

The aim of this article is to reflect on the practical implementation of innovative alternative performance assessment strategies, designed to be enjoyable and non-threatening, in the History classroom by trainee teachers, and to explore their perception of how these strategies influenced both their own teaching experience and the attitude and learning behaviour of the learners. Within the context of this study, learning behaviour refers to the behaviour demonstrated by the learners during their interaction with their peers and the trainee teacher when innovative and alternative interactive performance assessment strategies were implemented in class.

 

Performance assessment conceptualized

Since the late 1980s, performance assessment as an alternative type of formative assessment has drawn increasing attention in the literature whenever classroom assessment is at issue (Killen, 2007; Moskal, 2003; Wilson & Wineburg, 1993). A reason for this growing interest can be ascribed to its potential to improve student learning and achievement by developing abilities such as critical thinking, inquiry, communication and collaboration (Cimer, 2018; Stosich et al., 2018; Kubiszyn & Borich, 2010). In most cases the development of these abilities is poorly measured by traditional (summative) assessment practices, where the emphasis is on the memorising of content knowledge. The result is an ongoing recognition of the need for a broader array of formative assessment strategies in History for developing historical concepts and skills, strategies in which test results and grading are not elevated as the means to improve performance (Samuelsson, 2018; Demircioglu, 2010; Edmunds, 2006).

The conceptualisation of performance assessment varies widely and carries different meanings, both in focus and in interpretation. Although there is no clear consensus on the exact meaning of "performance" (Palm, 2008), it suggests an assessment strategy and practice that values the application of deeper conceptual understanding and transferable skills over lower-level content acquisition by means of rote learning (Stosich et al., 2018; Vander Ark, 2013). McMillan (2004) describes classroom-based performance assessment as a type of assessment where the teacher observes the learners and uses specific criteria to judge their ability to demonstrate a skill or proficiency when creating a product, constructing a response, or making a presentation. The emphasis is not only on the assessment of the performance of a meaningful task (the product), which often involves real-time applications, but also on the particular method (the process) used in creating the product (Kubiszyn & Borich, 2010; Janisch et al., 2007; Etsey, 2005).

Another way in which performance assessment can be contextualised is to distinguish between response-centred and simulation-centred response formats. In the latter case a practical (which can include a non-written) performance response is required, using special assessment instruments and equipment. This simulation-centred response format is a more direct, hands-on assessment in the sense that a close parallel exists between the actual performance observed in the construction of answers and the performance of interest. In the case of response-centred performance assessment, the focus is on a learner-constructed response that can range from the simplest answer to a comprehensive collection of work completed over a period of time (Palm, 2008).

In short, performance-based assessment requires learners to demonstrate knowledge, skills and competencies by performing or producing something that applies to a particular context. It can take many different forms that require learners to explore a topic orally or in writing, with opportunities for learners to work individually or to become interactively engaged within a group. The learners' activities and responses enable the teacher to determine, through observation and analysis, what they know and what possible misconceptions they might hold regarding the purpose of the assessment (Moskal, 2003; Etsey, 2003).

For this article the word "alternative" must be seen in the context of innovative performance assessment strategies and instruments that are different from the traditional ones when assessing learners in the History class. With these assessment strategies the learners engage in enjoyable, interactive group activities and use physical activity and energy in finding solutions to historical enquiry questions. These assessment approaches originated as part of a human dynamics training programme, designed by one of the authors of this article to enhance team development and emotional intelligence within the corporate working environment. The activities were subsequently adapted to serve as community engagement and data collection tools for a participative community-engaged research project that focuses on skills training for History educators in various provinces of South Africa. In an attempt to enable the participants (all secondary school History teachers) to teach their discipline more creatively and effectively in the modern classroom, the activities were adapted further to serve as teaching, learning and assessment tools. They were then taken into the History classroom by trainee teachers as part of a mutually enabling service-learning engagement between an institution of higher learning and various schools (Janse van Rensburg, 2014). This provided the trainee teachers with first-hand experience of the potential value of the activities for their classroom teaching and for formative performance assessment in particular.

 

Theoretical underpinning

Alternative performance assessment strategies are grounded in a social-constructivist research paradigm (Cimer, 2018; Sardareh & Saad, 2012; Janisch et al., 2007) as they provide opportunities for collaborative and engaging learning where learners are given the chance to demonstrate what they know and to use their prior knowledge and skills to do further investigation and problem solving (Haun, 2018). It also fits within the framework of dialogue theory which values ethical communication and respect for individual dignity; involves participants in conversation and problem-solving; and encourages sharing and reaching mutual understanding (Taylor & Kent, 2014; Rule, 2011).

Emerging from the work of psychologists such as Jerome Bruner, Jean Piaget and Lev Vygotsky, the most important implication of constructivist theory for teaching, learning and assessment is the shift from teacher-centred to learner-centred instruction. For Brooks and Brooks (1993) - and in line with David Kolb's theory of experiential learning (Kolb, 1984) - the constructivist view helps learners to actively construct, internalise and reshape, or transform, new information. In this manner it breaks with the traditional view of teaching as a "mimetic" activity - a process that involves learners in repeating or miming newly presented information.

Social constructivism acknowledges the fact that learners possess a rich source of prior knowledge that they bring to the learning situation. Through collaboratively interacting with fellow learners (peers) in a dialogic learning environment (Rule, 2011; Taylor & Kent, 2014), and experimenting with a variety of performance assessment strategies learners learn from one another and are helped to actively construct and assimilate new knowledge and skills that are meaningful and useful in their own lives (Stears & Gopal, 2010).

Social constructivism further considers assessment an ongoing, continual process that is therefore formative in nature. It focuses on the role of social interaction and collaboration, where learners receive feedback from their teachers and peers that facilitates, monitors and powerfully drives the learning process, raising learner achievement. Formative feedback processes that are supportive and motivating help learners to progress to the next step in their learning (Sardareh & Saad, 2012).

 

Innovative alternative performance assessments in action

For the purpose of this study five interactive group activities were selected to serve as potential alternative performance assessment strategies for application in the History classroom. They are the "Paper Pool", "Deciding Line", "Shells/Stones Activity", "Paper Jets" and "Bubble Map", the last of which should be familiar to many teachers as a teaching tool but is utilised here specifically with an assessment objective in mind.

Paper Pool

In the "Paper Pool" activity an A4-size sheet of paper (preferably coloured paper, which adds an element of colour and fun) is cut into four pieces and one piece is given to each learner. The teacher then formulates a question based on the historical content selected for the lesson, and the learners write down their answers to the question without giving their names. The pieces of paper are subsequently placed upside down in the middle of a circle formed by the learners on the floor (or on any other communal surface such as a table). After the teacher has shuffled the pieces of paper, each learner collects a piece and voluntarily shares what the anonymous fellow learner has written. This enables the teacher to determine the level of existing knowledge (if used as a "pre-test") or the knowledge gained during the lesson (if used as a "post-test"). Feedback can continue for as long as the teacher decides. The teacher can either simply observe and form an impression of the knowledge level, understanding and/or perceptions of the learners, or follow a more advanced approach by posing questions of varying complexity, based on the responses that have been read out, thereby stimulating reflection and discussion. If space is limited in the classroom, the activity can be adapted quite easily by collecting the pieces of paper by row, shuffling them, and redistributing them among learners sitting in another row.

The potential value of this activity is multi-faceted: firstly, it allows for the participation of every learner, including the quieter learner; secondly, learners save face by not reading out their own contributions and learn to communicate freely in an environment of emotional safety; thirdly, participation can build self-esteem and develop presentation and listening skills; in addition, the practical nature and visual impact of the activity will ensure that learners do not easily forget their experience and the learning that flowed from it; moreover, the activity does not require expensive resources, is easy to administer and generates quick results; lastly, the activity is versatile and may be used to assess content-knowledge at any stage of the lesson; explore learner perceptions on any relevant matter; and contribute towards the teacher's own self-assessment.

Deciding Line

In the "Deciding Line" activity, which can be conducted either inside (depending on space) or outside the classroom, a 3-metre piece of rope is placed on the ground, creating a "negotiation zone" on the one side of the rope and a "consensus zone" on the other. Learners are invited to form pairs, and all pairs start out in the "negotiation zone", having to identify three to five main reasons for a certain historical phenomenon or characteristics of a particular leadership style. Once a pair has reached consensus, they cross the line to the "consensus zone", wait for another pair to join them and then move back to the "negotiation zone" in order to debate the points raised by each pair and come up with three to five points as a group of four. Once the group of four has reached consensus, they move to the "consensus zone", meet up with another group that has reached consensus and once again move back to the "negotiation zone" as a group of eight to debate their various contributions and reach consensus as a group. The process continues until the whole class reaches consensus and a representative of the group, or the teacher, finally jots down the three to five consensus findings on the blackboard or a flipchart.

Similar to the previous activity, the "Deciding Line" is conducive to full participation of every learner within a non-threatening dialogic space. They also practise communication, negotiation/debating and facilitation skills while having fun and will not easily forget the experience, and the learning that it generated. Again, the activity may be used to assess either content knowledge or learner perceptions, and can even be used by teachers to gain an impression of how learners experience their teaching by simply adapting the instruction question.

Shells/Stones Activity

The only resources that are needed for this activity are sea shells, small stones or any other item (e.g. sweets) of different sizes and appearance. The learners are invited to select a shell or stone each which they will be able to identify again later should the activity be repeated. In smaller classes with sufficient space, the learners can stand in a circle, while in more cramped settings the teacher can simply use his/her table as the surface where the activity can be executed. After the teacher has placed his/her shell/stone in the middle of the circle/table, and explained what this item signifies, the learners are invited to place their items closer or further away from the centre depending on how strongly they support what the central item represents. For example, if the teacher's shell/stone in the centre signifies a significant contribution on the part of Nelson Mandela in bringing about reconciliation in South Africa, learners can express their opinions non-verbally by placing their shells/stones either closer to the centre (if they feel Mandela was very successful) or further away towards the periphery if they believe that his attempts at reconciliation were not successful. The activity may be repeated a few times, each time asking a different question in order to assess depth of understanding and stimulate critical and analytical thinking.

Similar to the "Paper Pool" the teacher may simply observe the process and learn about the learners' perceptions, opinions or content knowledge. However, a more meaningful approach would be to engage the learners in conversation around the implications of the visual picture they have created by asking relevant questions and managing the communication process in order to ensure that learners sharing ideas and feelings feel respected and heard.

Again, every learner participates in the activity - even less confident learners, who may find it easier to make a visual rather than a verbal statement in front of other people. If well facilitated by the teacher to counteract peer pressure, the activity also encourages learners to be honest and assertive whilst communicating their ideas and feelings. In addition, it provides them with opportunities to practise analytical and listening skills, and conveys the importance of respecting different opinions. Moreover, repeating the activity at a later stage will create useful opportunities for comparison, while the element of enjoyment and the visual impact of the activity will be remembered for a long time. This supports the finding of Riddell (2016:73) that the use of physical items in a practical formative assessment activity assists learners in connecting the objective of the lesson, and the learning derived from it, with their prior real-world experience, which in turn strengthens their memory of the formative assessment experience.

As with the previous activities, the "Shells/Stones" activity may be used to assess both content knowledge and learner perceptions and opinions. More experienced teachers, who are emotionally ready to receive feedback from their learners about the strengths and developmental areas of their teaching, may consider using this activity as a powerful self-assessment tool.

Paper Jets

All that is needed for this activity is one A4 sheet of coloured paper per learner. Learners are invited to write down their views on any relevant question, after which they are shown how to fold their piece of paper into a paper jet, with the writing on the inside. The teacher now gives clear instructions, asking the learners to close their eyes and keep them closed (for safety reasons) until told to open them. The next instruction tells the learners to raise their arm with the jet in hand and prepare for take-off, after which they receive the cue "Take off!", which allows them to throw their jets in any direction. Upon the instruction to open their eyes and "Scramble!", each learner has to find a jet, open it, read what the fellow learner has written, and then refold it for another round to commence. Learners may even be requested to tick items they agree with or to comment in writing on what their classmates have said.

Apart from the high energy level and strong element of engagement which characterise this strategy, the activity again involves all learners, especially those who find it easier to make visual or written contributions rather than speaking up in a group. Moreover, while writing down their views at the beginning of the exercise, learners receive time to think about their responses. This takes learner feedback to a deeper level. If the teacher decides to stimulate discussion around the findings generated during the activity, which we encourage, learners learn to communicate ideas and feelings, practise analytical and listening skills and respect different opinions. Perhaps even more than any of the other activities discussed here, learners will remember this high-energy activity for a long time. Again, by simply adapting the original instruction question, the teacher may use the activity to assess either content knowledge (for example the reasons for a particular historical development), or learner perceptions (for example their view of a historical figure's leadership style), or even as a self-assessment tool (for example requesting the learners to share their thoughts on how the teacher can make History lessons more effective).

Bubble Map

Here one of David Hyerle and Chris Yeager's thinking maps (Hyerle & Yeager, 1996; Hyerle, 2011), the "Bubble Map", was adapted to serve as an alternative formative assessment tool. The only resources needed for the activity are koki pens and one sheet of flipchart paper. The teacher prepares a bubble map on the flipchart paper beforehand but leaves the bubbles blank. The flipchart sheet is then put up on the wall where learners have easy access to the bubble map and some koki pens. Learners are invited to jot down comments in the bubbles (e.g. reasons for a historical event, characteristics of a particular leadership style, successes/failures of a certain historical figure, comments on what they enjoy most about their teacher's teaching, etc.). Learners should be free to add more bubbles if necessary.

Based on extensive experience in the field of human dynamics training,2 the potential value of this alternative performance assessment strategy is that learners will feel free to contribute in their own time, and to do so visually rather than orally, which quieter personalities may find intimidating. If the teacher decides to engage the learners in dialogue about their written contributions, learners will be able to practise analytical thinking, listening and other communication skills, learn to give and receive feedback, build self-confidence and respect opposing views. Should the bubble map remain displayed on the classroom wall for some time, it will help sustain memory of the activity and may even serve as a source of reference and comparison in the future (Lubbe, 2016, Personal Archive, File 1).

 

Research methodology

This qualitative study employed an action research methodology, integrating research and practice in alternative and innovative performance assessment strategies. These strategies were implemented in the classroom by trainee History teachers during their four-week practicum at schools in March 2016 in an attempt to make their History lessons more creative, enjoyable and effective, and to reflect on both the responses of their learners to these new techniques and the trainees' own experiences whilst conducting their classes. In this sense the study partly responded to an earlier plea for more training for teachers in formative assessment within an African context, and for more research on the extent to which teachers actually benefit from formative assessment training (Perry, 2013).

In preparation for the practical implementation of the alternative performance assessment strategies described earlier, 33 History education students in their final year of study at a South African university participated in this study. One of the authors of this article was lecturing these students, and the population members were therefore readily available to participate in the research. This convenience sampling method is useful in exploratory research aimed at a low-cost, quick appraisal of "the truth", where only a few participants are needed to complete the questionnaire (Maree, 2016).

The first phase of this qualitative study is based on research findings generated through observation by the authors, as well as written and informal oral feedback obtained from the participants during a practical workshop. This workshop prepared them for the second phase of the study, during which the alternative and innovative performance assessment strategies would be implemented in the classroom. During the first phase of the research the participants attended a 90-minute interactive practical workshop, presented by one of the authors, during which they were familiarised with the five alternative performance assessment strategies outlined above. In order for the workshop to be fully experiential in nature and to generate spontaneous responses from the participants, they were taken through each activity without prior explanation ("front-loading"); the nature and value of each activity was reviewed orally within the group upon completion; and notes were handed out only after the workshop (Lubbe, 2016, Personal Archive, File 1).

The very first activity, the "Paper Pool", bore powerful testimony to the anxiety with which many students and learners associate summative and graded formative assessment. The participants were informed at the outset that they would be "tested" at the end of the workshop and that "the mark awarded would form a significant part of their final result for the academic year". Not only could the facilitators (the authors) immediately sense tension in the room upon the announcement of this news, but they also observed shock in many facial expressions. This was subsequently confirmed in the participants' written responses when they were taken through the "Paper Pool" activity. Many mentioned feeling anxious, shocked, confused, insecure, overwhelmed and stressed, either because of fear of the unknown or because they felt unprepared and worried that they would fail the test. Others expressed anger because they had not been told beforehand that the workshop would count for marks, and wanted to leave (Lubbe, 2016, Personal Archive, File 2). During the review of the activity, participants were encouraged to talk through their negative feelings, helped to appreciate the anxiety that is often associated with assessment, and guided to understand the value of an alternative and innovative approach to formative assessment which removes such anxiety from the assessment process. In this way the study added to the work of researchers such as Volante and Beckett (2011), who emphasised the importance of learner involvement, appropriate questioning which alleviates tension, and feedback without grades as part of effective formative performance assessment.

After the workshop the participants received a set of notes (Lubbe, 2016, Personal Archive, File 1) which provided them with a step-by-step explanation of how each activity should be executed, a summary of the resources that would be required, suggestions for how each activity might be adapted for various purposes and different settings, options for application, and an indication of the potential value of each activity. They were also provided with a DVD (Lubbe, 2016, Personal Archive, File 1) which consisted of photographs and video footage of the practical execution of the activities during the workshop. The purpose of this visual material was to reinforce the learning in the weeks that followed, refresh participants' memory prior to implementation in the classroom, and standardise execution of the activities. By then all the students had already signed a written consent form (Lubbe, 2016, Personal Archive, File 1; Warnich, 2016, Personal Archive, File 5) in which they had given formal permission for their responses to be used for research purposes.

In addition, the students received an instruction sheet to be used in the second phase of this research. This instruction sheet required them to integrate any one of the suggested alternative performance assessment strategies into a CAPS-aligned lesson plan and to provide feedback on how the lesson went by completing a short questionnaire (Lubbe, 2016, Personal Archive, File 1). This questionnaire, which consisted of a few open-ended questions, required the participants to stipulate the assessment activity of choice; explain why that particular activity had been chosen; show how the chosen activity would be integrated into a CAPS-aligned lesson plan; discuss their experience of the practical implementation of the activity during the presentation of their lessons; provide an overview of their impressions by observing learner behaviour whilst the learners experienced and responded to the activity; and finally, to explain how they would approach and present the activity differently should they be offered another opportunity to do so (Warnich, 2016, Personal Archive, File 5). By offering the participants both a choice of activity and the flexibility to suggest adaptations for future implementation, the authors strove to cater for different teacher readiness levels and enhance participant creativity and motivation (NWEA, 2016). All of the above components - the set of notes, DVD, their written consent to participate in the research, lesson plan(s) and completed open-ended questionnaire - had to be included in a final portfolio which would contribute towards the students' final mark for the academic year (Warnich, 2016, Personal Archive, File 5).

Similar to the "Paper Pool" activity, which was linked to a theme that had personal relevance to each student for experiential impact, the remaining activities were also structured around relevant and easily accessible topics. For example, the "Deciding Line" challenged the students to identify five common objectives which they would like to see addressed during the workshop, while the "Stones/Shells" activity required them to assess Nelson Mandela's contribution in bringing about reconciliation in South Africa. In the "Paper Jets" activity, the participants were invited to express their views on whether or not History should become a compulsory school subject in South Africa, while the "Bubble Map" activity requested participants to evaluate the workshop by writing down comments on the workshop in the bubbles.

Based on the views expressed in the "Bubble Map" activity (Lubbe, 2016, Personal Archive, File 4), the participants described the workshop as "alternative", "interesting", "positive", "creative", "interactive", "fully participative", "innovative" and "effective" in terms of both the practical and visual impact of the assessment strategies and the group's interaction with one another. Several thought that the innovative and alternative performance assessment strategies would be a valuable and appropriate way of assessing learners at all levels and committed themselves to implementing the techniques in their teaching and learning of History (Lubbe, 2016, Personal Archive, File 4).

 

Research findings

In this qualitative study, with its action research methodology, data were gathered by documenting the personal experiences, observations and perceptions of trainee teachers during the practical implementation of innovative performance assessment activities as part of their lessons. The observation criteria used were: the practical feasibility of the performance assessment in a class situation; the learners' reactions to these "new" assessment activities; the extent of learner involvement during the assessment activities; and the possible opportunities that were created for the learners to broaden their historical knowledge and skills.

Observation as a data-gathering technique is considered to be a systematic process used to record participant behaviour without necessarily asking questions or communicating. The aim is to gain a deeper insight into the phenomenon being observed. In view of the highly selective and subjective nature of observation, it is important to know exactly what to observe in an effort to eliminate personal bias (Nieuwenhuis, 2016). In this study the trainee teachers were guided to observe both the verbal and nonverbal expression of feelings on the part of the learners in relation to the mentioned criteria during the implementation of innovative performance assessment activities.

From the data gathered, the findings showed that most of the participants (58%) chose to implement the "Paper Pool" performance assessment strategy in their History classes. This was followed by the "Paper Jets" (20%), "Shells/Stones" (12%), "Bubble Map" (8%) and "Deciding Line" (2%) (Warnich, 2016, Personal Archive, File 5).

The 58% of participants who implemented the "Paper Pool" agreed that the activity was a good assessment strategy at the beginning of a new lesson. In this activity, learners are expected to write down an answer to a question anonymously on a piece of paper and then place it face down in the middle of a circle on the floor. The participants felt that this offers the teacher an excellent opportunity to gauge learners' prior knowledge of a specific topic. Other reasons the participants gave for favouring the "Paper Pool" as a "new" way of assessment were that it was easy to understand; that in smaller classes it does not disturb class discipline; and that it encourages learner participation and interaction, which enable learners to learn from one another (Warnich, 2016, Personal Archive, File 5, Sections A1-A8, A11-A16, A20-A21).

In their experience of the "Paper Pool", the participants reported that none of the learners felt put "on the spot". Learners were given the opportunity to write down their responses to the questions without stating their names on the piece of paper. When it was time to read the answers aloud, they were not afraid to do so, because even if an answer proved to be wrong, it would not be identifiable as their own. The participants further reported that the learners were actively involved in the assessment process and that it was good to observe the cooperation between the learners (including the shy and quiet ones) whilst giving feedback on the questions. Moreover, the participants did not experience the "Paper Pool" as a time-consuming activity with little or no value. On the contrary, they experienced it as an assessment strategy that allowed the learners to communicate freely in class discussions, which in turn created opportunities for peer learning to broaden their knowledge. At the same time, active involvement in the assessment process enhanced the learners' self-confidence (Warnich, 2016, Personal Archive, File 5, Sections A1-A9, A11, A14-A16, A18-A21).

There were, however, also a few challenges. Because the learners were not used to this alternative and innovative assessment strategy, it took some time before they finally grasped this "new" technique. Moreover, not all learners took this new assessment strategy seriously. In some instances, learners would shout out the name of a peer in class before the answer was read out, trying to make a joke of the assessment opportunity. For this reason, the participants indicated that effective classroom management is a prerequisite for the successful implementation of a performance assessment strategy of this nature (Warnich, 2016, Personal Archive, File 5, Sections A2, A8-A9, A11-A12, A14).

In providing an overview of their impressions of how the learners experienced and responded to the "Paper Pool", the participants shared some interesting observations. They reported that the learners had requested that future lessons start in the same way, as they preferred to be more involved in the teaching, learning and assessment events. The learners also revealed a sense of enjoyment, anticipation, curiosity and excitement in a safe, less stressful assessment environment. They were not afraid to share their content knowledge and were willing to give feedback even though their answers might be wrong. For many learners it was interesting to hear what their classmates had written down as answers to the question posed. On the other hand, despite the answers being anonymous, not all learners liked the idea of their answers being read aloud (Warnich, 2016, Personal Archive, File 5, Sections A8-A9, A11, A14-15, A17, A19, A21).

When the participants were asked to explain how they would approach and present the "Paper Pool" differently, should they be offered another opportunity to do so, some indicated that they would spend more time on the activity, as they had found it difficult to complete within the space of one period. Furthermore, they would reconsider the practice of allowing learners to leave their desks in order to participate. In their opinion, the movement of the learners to a communal space in the classroom disturbed discipline. To avoid this, they would rather ask the learners to stay at their desks and exchange their answers with the peers in front of, behind or alongside them. Where there were too many learners in a classroom, some of the participants remarked that, on another occasion, they would take the learners out of the classroom to a space large enough for them to sit in a circle (Warnich, 2016, Personal Archive, File 5, Sections A2, A8-A9, A11).

A further aspect that would be considered at a next opportunity is extending the implementation of the strategy to the teaching-and-learning and consolidation phases of a lesson, instead of merely earmarking it for the introductory phase. The participants agreed that, when implemented in these phases, a lesson might need to run over two periods in order for the full potential of the "Paper Pool" as a formative performance assessment strategy to be realised. Finally, it was stated that at a next opportunity more time would be spent on the sharper formulation of the questions posed. Most of the participants believed that, because the question had not always been formulated clearly, the learners tended to deviate from the lesson topic during the discussions that followed (Warnich, 2016, Personal Archive, File 5, Sections A9-A10, A15, A21).

As far as the "Paper Jets" activity is concerned, the 20% participants who decided to implement it as an alternative and innovative performance strategy, held the view that it teaches the learners some aspects of the "doing" of History. Only one participant utilised this assessment strategy in the Further Education Training (FET) Phase, whilst the others applied it to the Intermediate and Senior Phases of the History component of Social Sciences (Grades 4-9). During the implementation of the "Paper Jets" the participants found that the learners (the boys more than the girls) thoroughly enjoyed the activity, especially the folding of the jets and to participate in this "new" and active way of assessment. The activity generated a general sense of excitement in class which could be ascribed mainly to the "playful" element of the "Paper Jets". For this reason, the learners did not experience it as an assessment activity in the true sense of the word. To them it was important not to miss out, and appreciating the writing down of their own answers in anonymity, rather than to say it aloud in class. Every learner therefore participated actively in the assessment process by reading the written answers of the other learners and sharing ideas during the class discussion that flowed from the answers given to the questions. One of the participants also reported that anonymity created an opportunity for the learners to assess and discuss incorrect answers without pointing out certain individuals. In this manner the formative feedback contributed to improved learning in the acquisition of historical knowledge and skills (Warnich, 2016, Personal Archive, File 5, Sections B1-B3, B5-B8).

However, most of the trainee teachers did not enjoy the "Paper Jets" as much as the learners did. One of the challenges they faced during its implementation was the folding of the jets: some of the learners mocked and teased those who struggled to fold them, and one of the participants observed that it was in particular the girls who found the folding difficult. Another challenge related to classroom management. Most of the participants agreed that the "Paper Jets" had a detrimental effect on class discipline. Especially during the "scramble" part of the activity, the learners were rowdy and pushed one another in their search for a jet. Under these circumstances the participants found it difficult to maintain discipline and to refocus the learners' attention on the rest of the lesson after the assessment activity had ended. Most of the participants felt that, to a certain extent, this detracted from the learners' realisation of the real purpose and aim of the "Paper Jets" as a performance assessment strategy (Warnich, 2016, Personal Archive, File 5, Sections B4-B6, B8).

In their comments on how they would approach and present the "Paper Jets" differently, should they be offered another opportunity to do so, some participants remarked that the assessment activity took too much time and that they would therefore ask the learners to finish folding their jets at home. Others stated that they would handle the "scramble" part differently by allowing the learners to throw their jets in smaller groups, and by having boys and girls do so separately outside the classroom. Some of the participants reported that, given the learners' reaction, they would only use the "Paper Jets" as an assessment strategy in the concluding phase of their lessons (Warnich, 2016, Personal Archive, File 5, Sections B2-B7).

In their experience of the "Shells/Stones" activity, the 12% of participants who had chosen to implement it in class were unanimous that it was an excellent assessment strategy. They reported that the learners showed keen interest in this alternative and interactive way of assessment, which held everybody's attention from start to finish. Even the quiet and shy learners were willing to participate, and all learners became part of the assessment process immediately when asked to make a visual judgement by placing their shells/stones closer to or further away from the centre of the circle; the distance of the shell/stone from the centre indicates the level of agreement with the statement made by the teacher. A further advantage highlighted was the opportunity created for learners to explain the placement of their shells/stones, thereby encouraging them to form their own opinions, which contributes to the development of a critical awareness and historical understanding of events. Some of the participants argued that the "Shells/Stones" would be particularly effective as an alternative performance assessment strategy in the introductory phase of a lesson. Moreover, they were of the opinion that, in contrast to the other performance assessment strategies, the "Shells/Stones" activity is more effective when testing for deeper prior knowledge of a specific topic (Warnich, 2016, Personal Archive, File 5, Sections C2-C5).

Despite the general agreement on the merit of the "Shells/Stones", some challenges were encountered. One of the participants pointed out that this strategy was better suited to the Senior and Further Education and Training (FET) Phases, as Intermediate Phase learners are still too young to have sufficient knowledge of a specific topic to develop an opinion of their own. Another challenge was that some learners' decisions on where to place their shells/stones in relation to the centre of the circle were influenced by their peers' placements. This behaviour may suggest a lack of confidence in taking a decisive stand on the statement made and subsequently motivating that choice (Warnich, 2016, Personal Archive, File 5, Sections C1, C3-C4).

One participant recommended that, should there be an ensuing opportunity for implementation, the learners could be divided into smaller groups depending on where they had placed their shells/stones. Those learners who put their shells/stones nearest to the centre (thereby indicating stronger agreement with the statement made) would be grouped together to formulate a shared view. The same opportunity would be given to those learners who put their shells/stones further away from the centre, demonstrating that they agreed with the statement to a lesser extent (Warnich, 2016, Personal Archive, File 5, Section C4).

As far as the "Bubble Map" is concerned, the 8% participants who implemented it agreed that this assessment activity is suitable for implementation in any of the introductory, presentation and consolidating phases of a lesson. They pointed out that the learners clearly found this "new" way of assessment exciting and stimulating and requested more opportunities in the future to partake in an assessment activity of this nature. Some of the learners were even willing to do some research for homework in preparation, should the opportunity rise again (Warnich, 2016, Personal Archive, File 5, Sections D1-D3).

However, some of the participants also raised concerns and made suggestions for adaptation. They argued that the "Bubble Map" activity would work better as an assessment strategy in smaller groups, as this encouraged more discussion among the learners. Furthermore, they experienced the activity as time-consuming as a result of the lengthy debates that it generated among the learners. For this reason, it was recommended that, should they be offered another opportunity to present the "Bubble Map", the activity should rather be utilised during a double period. Some of the participants also found that they had to intervene at times when the cognitively stronger and/or more vocal learners tended to overpower the quieter learners during the dialogues that followed after they had jotted down their comments in the bubbles (Warnich, 2016, Personal Archive, File 5, Sections D1-D3).

The fact that only 2% of the participants implemented the "Deciding Line" shows that this assessment strategy was not a popular choice. However, the participant who did use it reported that it worked extremely well, and that it was especially the "busy" learners who enjoyed the interactivity and physical movement that the activity offered. The learners also spread the message in school that the trainee teacher had "funky ways" of teaching the content. Some of the learners recalled the assessment activity as learning History through a hands-on process of inquiry and debate, rather than through the rote memorisation of facts. In terms of challenges in the execution of this performance assessment strategy, it was reported that the strategy was to a certain extent hamstrung by space limitations in class. In addition, some learners moved the rope when they did not lift their feet high enough. It was therefore recommended that, in future, colourful duct tape stuck to the floor should be used instead of a loose rope (Warnich, 2016, Personal Archive, File 5, Section E1).

 

Discussion

From the research findings it is apparent that the trainee teachers' experience of the implementation of their innovative performance strategies of choice was largely positive in relation to the criteria stipulated earlier.

The trainee teachers found these assessment strategies a welcome deviation from more formal methods of assessment and were inspired by the element of excitement that the activities brought to their History lessons. The "playful" element in particular created an eagerness on the part of the learners to be part of activities which they did not perceive as "assessment". More importantly, because the assessment strategies did not betray the identities of the learners, quieter learners felt safe enough to become involved, thereby ensuring maximum learner participation. This lack of fear of assessment, as observed by the trainee teachers, supports the arguments of earlier researchers (Bartlett, 2015; Von der Embse & Hasson, 2012; Hesketh et al., 2010; Jun & Xing, 2010; McDonald, 2001) with regard to the role of anxiety caused by summative and graded forms of formative assessment. It also practically illustrates the value of creative ways of implementing non-graded formative alternative performance assessment strategies, which, as has been argued by Bayat, Jamshidipour and Hashemi (2017), can reduce anxiety and make learning much more enjoyable.

As confirmed in the research of scholars such as Muttaqin (2016) and Quinn (2006), active participation in teaching, learning and assessment activities strengthened the learners' self-confidence and self-esteem, which made them more willing to share ideas and to learn from one another during class discussions. A further advantage, in the view of the trainee teachers, was that these assessment strategies could be applied effectively to any phase of the lesson. In general, the research results of this study reinforce the wide agreement in the literature that interactive and collaborative assessment activities can be very beneficial, as they create opportunities for the learners to learn from one another and, in doing so, to construct and assimilate new knowledge; increase student motivation, participation and retention; develop social skills; enhance a team approach to problem-solving; develop self-management skills; create opportunities for peer learning; and strengthen interpersonal relations (Kennedy-Clark, Kearney & Galstaun, 2017; Hargreaves, 2007; Steadman, 1998).

Interestingly, not many trainee teachers chose to implement the "Deciding Line" and "Bubble Map" as part of their lessons. A possible reason for this could be the simplicity of activities such as the "Paper Pool", which, according to participant feedback, was easier to prepare and present in class than, for example, the "Deciding Line". The latter activity is cognitively more advanced in that it requires and develops negotiation skills with which the trainee teachers may not have been familiar. This correlates with research findings by Gijbels and Dochy (2006) and Nijhuis, Segers and Gijselaers (2005) that students' (trainee teachers') preferences for assessment activities with higher-order thinking tasks are significantly lower than for those assessing lower-order thinking. The "Deciding Line" is also more challenging to master in terms of group control and giving clear instructions. Similarly, the "Paper Jets" was not a popular performance strategy of choice - although it usually adds great excitement and enjoyment to any group in which it is implemented - and was implemented in the FET Phase by only one participant. This could perhaps be ascribed to the fact that the folding of the jets may be time-consuming and challenging.

However, simplicity could not have been the only reason why certain activities were chosen and others not. The "Bubble Map", for example, is a relatively simple activity which holds great potential as an alternative performance assessment strategy. Yet only three trainee teachers chose to implement it. One of the participants reported that although it worked well in smaller groups, the more vocal learners tended to dominate the quieter ones. The participants further agreed that the "Bubble Map" was too time-consuming for a single period, and that a double period would be necessary to do justice to this activity (Warnich, 2016, Personal Archive, File 5, Sections D1-D3).

Another reason why the "Bubble Map" was not a particularly popular choice could be that participants who had to leave the preparatory workshop early as a result of other lecture commitments did not personally experience the activity. A lack of prior exposure to the use of thinking maps as teaching and assessment tools may also help to explain the choices that the trainee teachers made.

In terms of challenges faced during implementation, the research findings indicate that time management, lack of classroom space and classroom management were the major concerns of the trainee teachers. They found some activities (for example the "Paper Jets"), as well as the lengthy discussions after an activity had been completed, time-consuming. In some cases, the play element caused the learners not to take the activity seriously, thereby compromising class discipline (Warnich, 2016, Personal Archive, File 5, Sections B2, B4, B6).

Some of these concerns resonate with the findings of prior research (Alias, Hussein, Hassan, Adnan, Othman & Hussein, 2018; Le, Janssen & Wubbels, 2018; Izci, 2016; Box, Skoog & Dabbs, 2015; Sach, 2015; Robinson, Myran, Strauss & Reed, 2014; Chiriac & Granström, 2012; Hämäläinen & Vähäsantanen, 2011), which indicated that teachers who shy away from interactive and collaborative assessment methodologies tend to ascribe this to a lack of teaching experience, a lack of resources, limited knowledge of alternative assessment methods, and a lack of self-confidence, which, in turn, creates a fear of loss of control in the classroom.

 

Recommendations

Although the authors are greatly encouraged by the positive feedback received from the trainee teachers, which holds great promise for wider application by History teachers, the results of this pilot study, with its small sample size, cannot be generalised. The authors also believe that some of the concerns of the trainee teachers could be addressed by redesigning the preparation phase of the study.

In hindsight it is clear that 90 minutes was only just enough time to familiarise participants with the basic characteristics of the various assessment strategies and to create opportunities for them to experience first-hand the potential value of each activity. More time allocated to the preparation phase would have been very useful. Firstly, it would have enhanced reflection and discussion around challenges that could be expected during lesson presentation. Secondly, the participants could have been alerted to alternative ways in which each activity could be presented and reviewed (Dixson & Worrell, 2016). Thirdly, more time would have enabled the presenter to equip participants with techniques for the successful facilitation of instructional dialogues (Ruiz-Primo, 2011; Deiglmayr, 2018). Such facilitation skills would emphasise good listening on the part of the teacher - especially listening for learning progression instead of simply classifying answers as right or wrong (Gotwals, 2018) - the creation of a trusting classroom environment, and the skilful management of the dialogic space by balancing recall questions with higher-order questioning in the interest of achieving deep learning (Jiang, 2014). Basic facilitation training is also essential in teaching participants how to give feedback which is not emotionally damaging to the learner (Torrance, 2012); how to maintain the correct focus during group discussions; how to manage time effectively; how to manage high energy levels and the behaviour of talkative learners who tend to dominate communication; and how to handle quieter learners in the interest of the sustained involvement of every individual.

The DVD with footage of the activities in action during the preparation phase was intended to reinforce learning and refresh the memory of the participants just prior to implementation. The purpose of this visual material was also to assist in "standardising" execution, which would be essential from a research perspective. Although the trainee teachers did not express such a need, the authors are of the opinion that a more comprehensive video, depicting a step-by-step approach to each activity, would have been more helpful and should be considered should the study be repeated in the future.

Lastly, the trainee teachers would have benefited from repeating a lesson with different groups of learners and, as expressed elsewhere in the literature (NWEA, 2016), from regular reflection and interaction with one another. Given that teachers normally need ample time and strong professional support in order to become competent users of formative assessment (Bennett, 2011), the trainee teachers who participated in this study would also have benefited from intermittent discussions with the researchers, who could have offered guidance and provided opportunities to share experiences. Moreover, requesting the trainee teachers also to assess the support they received from mentor class teachers, and to comment on the classroom context that they encountered at the schools where they taught, would have generated very valuable additional research data which could have assisted in contextualising the core findings of the study.

 

Conclusion

This article has added five innovative alternative performance assessment strategies to the repertoire of History teachers and reflected on the experiences and perceptions of trainee teachers during implementation of these strategies in the History classroom. In response to the research questions, it has shared the generally positive experience of the trainee teachers and their perceptions of the educational value of their assessment strategy(ies) of choice. It has also documented the challenges that some of the participants experienced during the implementation phase, and shared their creative adaptations with fellow teachers and future researchers. In addition, the article has identified limitations in the study in terms of the initial preparation and ongoing support of the participants, making recommendations in this regard which may assist future researchers embarking on similar research studies. Nevertheless, the research findings of this study provide ample evidence that the trainee teachers coped satisfactorily with the practical implementation of the newly acquired alternative formative assessment strategies. Finally, the findings suggest that a fresh approach to formative assessment, which is non-graded, avoids putting the individual learner on the spot and integrates an element of enjoyment, holds great educational value as it effectively removes anxiety from the assessment process. The article therefore encourages teachers to make time for experimenting with creative, engaging yet effective formative assessment techniques amidst the constraints of a full syllabus and the stringent assessment requirements that CAPS imposes on teachers, in order to bring History to life in the classroom.

 

Reference list

Alias, NS, Hussein, H, Hassan, J, Adnan, NSM, Othman, MH and Hussein, K 2018. Perception of teacher on cooperative learning. In Matec Web of Conferences, 150, 05068. Available at https://www.matec-conferences.org/articles/matecconf/pdf/2018/09/matecconf_mucet2018_05068.pdf. Accessed on 28 April 2019.

Bartlett, J 2015. Outstanding assessment for learning in the classroom. New York: Routledge.         [ Links ]

Bayat, A, Jamshidipour, A and Hashemi, M 2017. The beneficial impacts of applying formative assessment on Iranian University students' anxiety reduction and listening efficacy. International Journal of Languages Education and Teaching 5(2):1-11.         [ Links ]

Bennett, RE 2011. Formative assessment: A critical review. Assessment in Education: Principles, Policy & Practice, 18(1):5-25.         [ Links ]

Box, C, Skoog, G and Dabbs, JM 2015. A case study of teacher practice assessment theories and complexities of implementing formative assessment. American Education Research Journal, 52(5):956-983.         [ Links ]

Brooks, JG and Brooks, MG 1993. In search of understanding: The case for constructivist classrooms. Alexandria, VA: Association of Supervision and Curriculum Development.         [ Links ]

Bunt, BJ 2013. The extent to which teachers nurture creative thinking in the grade 9 Social Sciences classroom through the choice of teaching methods. Unpublished MEd dissertation. Vanderbijlpark: North-West University (Vaal Triangle Campus).         [ Links ]

Careless, D 2015. Excellence in University Assessment. New York: Routledge.         [ Links ]

Chiriac, EH and Granström, K 2012. Teachers' leadership and students' experience of group work. Teachers and Teaching: Theory and Practice, 18(3):345-363.         [ Links ]

Cimer, SO 2018. What makes a change unsuccessful through the eyes of teachers. International Education Studies,11(1):81-88.         [ Links ]

Deiglmayr, A 2018. Instructional scaffolds for learning from formative peer assessment: Effects of core task, peer feedback, and dialogue. European Journal of Psychology Education, 33:185-198.         [ Links ]

Demircioglu, IH 2010. Empirical research on History education in Turkey: An overview of key issues, methods and outcomes. Jahrbuch, Internationale Gesellschaft Für Geschichtsdidaktik.         [ Links ]

Department of Basic Education (DBE), 2011. CAPS, History Further and Training Phase Grades 10-12. Pretoria: Government Printer. Available at: https://www.education.gov.za/Portals/0/CD/National%20Curriculum%20Statements%20 and%20Vocational/CAPS%20FET%20%20HISTORY%20GR%2010-12%20 %20WeB.pdf?ver=2015-01-27-154219-397. Accessed on 31 August 2018.         [ Links ]

Dixson, DD and Worrell, FC 2016. Formative and summative assessment in the classroom. Theory Into Practice, 55(2):153-159.         [ Links ]

Duncan, T and Buskirk-Cohen, AA 2011. Exploring learner centered assessment: A cross-disciplinary approach. International Journal ofTeaching and Learning in Higher Education, 23(2):246-259.         [ Links ]

Edmunds, J 2006. How to assess student performance in history: Going beyond multiple-choice test. SERVE Center at the University of North Carolina: Greensboro. Available at: https://files.eric.ed.gov/fulltext/ED513873.pdf. Accessed on 16 September 2019.         [ Links ]

Etsey, KA 2005. Assessing performance in schools: Issues and practice. Ife Psychologia, 13(1):123-135.         [ Links ]

Frunza, V 2014. Advantages and barriers of formative assessment in the teaching-learning activity. Procedia, 114:452-455.         [ Links ]

Furtak, EM, Kiemer, K, Circi, RK, Swanson, R, De León, V, Morrison, D and Heredia, SC 2016. Teachers' formative assessment abilities and their relationship to student learning: Findings from a four-year intervention study. Instructional Science, 44(3):267-291.         [ Links ]

Gijbels, D and Dochy, F 2006. Students' assessment preferences and approaches to learning: Can formative assessment make a difference? Educational Studies, 32(4):399-402.         [ Links ]

Gotwals, AW 2018. Where are we now? Learning progressions and formative assessment. Applied Measurement in Education, 31(2):157-164.         [ Links ]

Hämäläinen, R and Vähäsantanen, K 2011. Theoretical and pedagogical perspectives on orchestrating creativity and collaborative learning. Educational Research Review, 6(3):169-184.         [ Links ]

Hamidi, E 2010. Fundamental issues in L2 Classroom Assessment Practices. Academic Leadership: The Online Journal, 8(2). Available at: https://scholars.fhsu.edu/alj/vol8/iss2/21. Accessed on 10 July 2019.         [ Links ]

Hargreaves, E 2007. The validity of collaborative assessment for learning. Assessment in Education, 14(2):185-199.         [ Links ]

Haun, B 2018. Making performance assessment a part of accountability. Education Policy Analysis Archives (Special Issue), 26(15):1-5.         [ Links ]

Hesketh, T, Zhen, Y, Lu, L, Dong, Z, Jun, YX and Xing, ZW 2010. Stress and psychosomatic symptoms in Chinese school children: Cross-sectional survey. Archives of Disease in Childhood, 95(2):136-140.         [ Links ]

Hyerle, D 2011. Student Success with Thinking Maps. 2nd edition, Thousand Oaks: Corwin Press.         [ Links ]

Hyerle, D and Yeager, C 1996. Thinking Maps: Seeing is understanding, Educational Leadership, 53 (4):85-89.         [ Links ]

Izci, K 2016. Internal and external factors affecting teachers' adoption of formative assessment to support learning. International Journal of Social, Behavioral, Educational, Economic, Business and Industrial Engineering, 10(8):2541-2548. Available at https://files.eric.ed.gov/fulltext/ED567798.pdf. Accessed on 4 February 2019.         [ Links ]

Janisch, C, Liu, X and Akrofi, A2007. Implementing alternative assessment: Opportunities and obstacles. The Educational Forum, 71(Spring):221-230.         [ Links ]

Janse van Rensburg, E 2014. Enablement - A foundation for community engagement through service learning in Higher Education. In: M Erasmus & R Albertyn, (eds.). Knowledge as enablement: Engagement between higher education and the third sector in South Africa. Bloemfontein: Sun Media.         [ Links ]

Jiang, Y 2014. Exploring teacher questioning as a formative assessment strategy. RELC Journal, 45(3):287-304.         [ Links ]

Kennedy-Clark, S, Kearney, S & Galstaun, V 2017. Using collaborative assessment design to support student learning. Education Sciences, 7(4):1-14.         [ Links ]

Killen, R. 2007. Teaching strategies for outcomes-based education. 2nd ed. CapeTown: Juta.         [ Links ]

Kolb, DA 1984. Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.         [ Links ]

Kubiszyn, T and Borich, G 2010. Educational testing & measurement: Class application and practice. 9th edition, United States of America: John Wiley & Sons, INC.         [ Links ]

Lambert, D and Lines, D 2000. Understanding assessment, purposes, perceptions, practice. London: RoutledgeFarmer.         [ Links ]

Lau, AMS 2016. 'Formative good, summative bad?' - A review of the dichotomy in assessment literature. Journal of Further and Higher Education, 40(4):509-525.         [ Links ]

Le, H, Janssen, J and Wubbels, T 2018. Collaborative learning practices: Teacher and student perceived obstacles to effective student collaboration. Cambridge Journal of Education, 48(1):1-20.         [ Links ]

Lee Hang, DM and Bell, B 2015. Written formative assessment and silence in the classroom. Cultural Studies of Science Education, 10(3):763-775.         [ Links ]

Lubbe, HJ 2016. Personal Archive, File 1 (Workshop notes, written consent form, DVD, instruction sheet; questionnaire); File 2 (Participant feedback - Paper Pool); File 3 (Participant feedback - Paper Jets); File 4 (Workshop assessment -Bubble Map).         [ Links ]

Maree, K (Ed.) 2016. First steps in Research, 2nd edition. Pretoria: Van Schaik.         [ Links ]

McCurdy, K, Reagan, EM, Rogers, A and Schram, T 2018. Integrating performance assessment across a PK-20 continuum: A locally developed collaboration. Education Policy Analysis Archives (Special Issue), 26(14):1-11.         [ Links ]

Mcdonald, AS 2001. The prevalence and effects of test anxiety in school children. Educational Psychology, 21(1):89-101.         [ Links ]

McMillan, JH and Hearn, J 2008. Student self-assessment: The key to stronger student motivation and higher achievement. Educational Horizons, 87(1):40-49.         [ Links ]

Mooreng, BB and Du Toit, E 2013. The powerful learning environment and history learners in the Free State Province. Yesterday&Today, 9:45-66.         [ Links ]

Moskal, BM 2003. Recommendations for developing classroom performance assessments and scoring rubrics. Practical Assessment, Research & Evaluation, 8(14). Available at https://pareonline.net/getvn.asp?v=8&n=14. Accessed on 2 February 2019.         [ Links ]

Muttaqin, T 2016. Cooperative learning and students' self-esteem. Working Paper. Available at file:///C:/Users/12923079/Downloads/CooperativeLearningand StudentSelf-Esteem%20(2).pdf. Accessed on 1 June 2019        [ Links ]

Nieuwenhuis, J 2016. Qualitative research designs and data gathering techniques. In: K Maree (ed.). First steps in reseach (2nd edition). Pretoria: Van Schaik.         [ Links ]

Nijhuis, JFH, Segers, MSR and Gijselaers, WH 2005. Influence of redesigning a learning environment on student perceptions and learning strategies. Learning Environment Research, 8(1):67-93.         [ Links ]

NWEA, 2016. How to make formative assessment a habit: Beyond the classroom practices:1-7. Available at https://files.eric.ed.gov/fulltext/ED567831.pdf. Accessed on 2 April 2019.         [ Links ]

Offerdahl, EG and Tomanek, D 2011. Changes in instructors' assessment thinking related to experimentation with new strategies. Assessment and Evaluation in Higher Education,, 36(7):781-795.         [ Links ]

Palm, T 2008. Performance assessment and authentic assessment: A conceptual analysis of the literature. Practical Assessment, Research & Evaluation,13(4):1-11.         [ Links ]

Perry, L 2013. Review of formative assessment use and training in Africa. International Journal of School and Educational Psychology, 1(2):94-101.         [ Links ]

Quinn, P 2006. Corporative learning and student motivation. Master's Theses. The college at Brockport: New York. Available at https://digitalcommons.brockport.edu/ehd_theses/285/?utm_source=digitalcommons.brockport.edu%2Fehd_ theses%2F285&utm_medium=PDF&utm_campaign=PDFCoverPages. Accessed on 1 June 2019.         [ Links ]

Riddell, NB 2016. Maximising the effective use of formative assessments. Teacher Educators Journal, 9: 63-74.         [ Links ]

Robinson, J, Myran, S, Strauss, R and Reed, W 2014. The impact of an alternative professional development model on teacher practices in formative assessment and student learning. Teacher Development, 18(2):141-162.         [ Links ]

Ruiz-Primo, MA 2011. Informal formative assessment: The role of instructional dialogues in assessing students' learning. Studies in Educational Evaluation, 37:15-24.         [ Links ]

Rule, P 2009. Bakhtin and Freire: Dialogue, dialectic and boundary learning. Educational Philosophy and Theory, 43(9):924-942.         [ Links ]

Sach, E 2015. An exploration of teachers' narratives: What are the facilitators and constraints which promote or inhibit 'good' formative assessment practices in schools? International Journal of Primary, Elementary and Early Years Education, 43(3):322-335.         [ Links ]

Samuelsson, J 2018. History as performance: Pupil perspectives on history in the age of 'pressure to perform'. Education 3-13, 47(3):333-347.         [ Links ]

Sardareh, SA and Saad, MRM 2012. A sociocultural perspective on assessment for learning: The case of a Malaysian primary school ESL context. Procedia, 66:343-353.         [ Links ]

Steadman, M 1998. Using classroom assessment to change both teaching and learning. New Directions for Teaching and Learning, 75:23-35.         [ Links ]

Stears, M and Gopal, N 2010. Exploring alternative assessment strategies in Science classrooms. South African Journal of Education, 30(4):591-604.         [ Links ]

Stosich, EL, Snyder, J and Wilczak, K 2018. How do states integrate performance assessment in their systems of assessment? Education Policy Analysis Archives (Special Issue), 26(13):1-31.         [ Links ]

Taylor, M and Kent, ML 2014. Dialogic engagement: Clarifying foundational concepts. Journal of Public Relations Research, 26:384-398.         [ Links ]

Torrance, H 2012. Formative assessment at the crossroads: Conformative, deformative and transformative assessment. Oxford Review of Education, 38(3):323-342.         [ Links ]

Vander Ark, T 2013. What is performance assessment? Available at http://www.gettingsmart.com/2013/12/performance-assessment. Accessed on 20 February 2019.         [ Links ]

Volante, L and Beckett, D 2011. Formative assessment and the contemporary classroom: Synergies and tensions between research and practice. Canadian Journal of Education, 34(2):239-255.         [ Links ]

Von der Embse, N and Hasson, R 2012. Test anxiety and high-stakes test performance between school settings: Implications for educators. Preventing School Failure, 56(3):180-187.         [ Links ]

Warnich, P 2016. Personal Archive, File 5, Sections A-E (Consent forms, questionnaires, portfolios and data collected).         [ Links ]

Warnich, P and Meyer, L 2013. Trainee teachers' observation of learner-centred instruction and assessment as applied by History and Social Sciences teachers. Yesterday & Today, 9:13-43.         [ Links ]

Wilson, SM & Wineburg, SS 1993. Wrinkles in time and place: Using performance assessments to understand the knowledge of history teachers. American Educational Research Journal, 30(4):729-769.         [ Links ]

Zhao, XU, Selman, RL & Haste, H 2015. Academic stress in Chinese schools and a proposed preventive intervention program. Cogent Education, 2(1):1-14.         [ Links ]

1 CAPS (Curriculum Assessment Policy Statement) refers to the South African National Department of Basic Education's policy document for each school subject. This document sets out guidelines regarding learning material, assessment and the expected outcomes in a particular subject.
2 One of the authors is both an academic historian and an experienced human dynamics facilitator.
