South African Journal of Higher Education

On-line version ISSN 1753-5913

S. Afr. J. High. Educ. vol.37 no.4 Stellenbosch Sep. 2023

http://dx.doi.org/10.20853/37-4-5674 

GENERAL ARTICLES

 

Development of an assessment instrument for student engagement in design thinking projects for health innovation

 

 

K. Dikgomo (I); S. Hendricks (II, III, IV); T. E. M. Mutsvangwa (V)

(I) School of Medicine, Faculty of Health Sciences, Nelson Mandela University Missionvale Campus, Gqeberha, South Africa. https://orcid.org/0000-0001-5875-9077
(II) Division of Physiological Sciences, Department of Human Biology, University of Cape Town, South Africa
(III) Carnegie Applied Rugby Research (CARR) Centre, Carnegie School of Sport, Leeds Beckett University, Leeds, UK
(IV) Health Through Physical Activity, Lifestyle and Sport Research Centre (HPALS), Department of Human Biology, University of Cape Town, South Africa. https://orcid.org/0000-0002-3416-6266
(V) Division of Biomedical Engineering, Department of Human Biology, University of Cape Town, South Africa. https://orcid.org/0000-0003-1210-2832

 

 


ABSTRACT

Student engagement is a dynamic and multifaceted concept encompassing physical, emotional, and cognitive components. Various instruments to assess student engagement exist; however, these are not intended to assess how students engage with one another and with community stakeholders in participatory health projects. Although instruments exist to assess participation and power-sharing in participatory health research projects, none of the available tools are suitable for assessing student engagement in such projects. Accordingly, this study set out to develop an assessment instrument for student engagement in design thinking projects for health innovation. An adapted form of the survey development guide for medical education research was applied. The development process involved triangulation of data: student responses to an initial literature-informed instrument, an analysis of written reflective reports, a focus group discussion with students enrolled in a master's-level course called Health Innovation and Design (HID), and design thinking practitioner validation. A final assessment instrument for student engagement in design thinking projects is presented. The instrument incorporates the design thinking phases according to the Innovation Design Engineering Organization (IDEO) design thinking approach, an educational definition of student engagement, and recommendations by students, course lecturers, and facilitators of the HID course. The instrument can assess engagement in academic and non-academic settings when design thinking is applied for health innovation.

Keywords: assessment of student engagement, design thinking, pedagogy, engaged scholarship, health innovation


 

 

INTRODUCTION

Participatory health research is regarded as an effective strategy to improve end-user or community conditions (Goodman et al. 2017). Numerous instruments to assess participation in health research exist - for example, Arnstein (1969) uses an 8-rung ladder of citizen participation to determine "the extent of citizens' power in determining the end product", while Rifkin, Muller, and Bichmann (1988) as well as Draper, Hewitt, and Rifkin (2010) use a 5-point scale to rate participation and represent the data on a spider's web diagram (spidergram). These instruments are used to assess power-sharing or the inclusion of the community or stakeholders in the research process. While such instruments can succeed in assessing power-sharing, they have limitations when applied to specific forms of participatory projects (e.g., design thinking projects) or to educational settings. These limitations include an inability to holistically assess students' behavioural, emotional, and cognitive engagement, and a failure to account for the phasic requirements of participatory projects.
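To make the spidergram representation concrete, the sketch below plots a set of hypothetical 5-point participation ratings on a radar chart with Python and matplotlib. The dimension names and ratings are placeholders chosen for illustration, not the indicators or data reported by Rifkin, Muller, and Bichmann (1988) or Draper, Hewitt, and Rifkin (2010).

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical participation dimensions and 1-5 ratings (illustrative only)
dimensions = ["Needs assessment", "Leadership", "Organisation",
              "Resource mobilisation", "Management"]
ratings = [3, 4, 2, 5, 3]

# Close the polygon by repeating the first value
angles = np.linspace(0, 2 * np.pi, len(dimensions), endpoint=False).tolist()
values = ratings + ratings[:1]
angles = angles + angles[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, marker="o")
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(dimensions)
ax.set_ylim(0, 5)  # 5-point rating scale
ax.set_title("Spidergram of participation ratings (illustrative)")
plt.show()
```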

Overview of design thinking and its utility in health and education

Design thinking is a systematic process characterised by iterative phases and consistent involvement of the end-user to develop contextually relevant and practical solutions (Roberts et al. 2016). For the design thinking method to be successful, several elements are required, including a diverse and multidisciplinary group, collaboration, and multiple perspectives and backgrounds (McLaughlin et al. 2019).

There are two popular versions of the design thinking method - the Innovation Design Engineering Organization (IDEO) approach and the Hasso Plattner Institute of Design at Stanford University (d.school) approach. The IDEO approach consists of three phases: inspiration, ideation, and implementation (Brown and Wyatt 2010). The inspiration phase provides an opportunity to identify and understand the user, their context, and their challenges. Following this understanding, ideas are developed and tested in the ideation phase. Solutions become the "action plan" during the implementation phase. The Stanford d.school approach consists of five phases, namely: empathize, define, ideate, prototype, and test (Doorley et al. 2018). Similar to the inspiration phase of the IDEO method, the empathize phase provides an opportunity to identify and understand the user, their context, and their challenges. In the define phase, a problem statement that will guide activities in subsequent phases is developed. Ideas are then generated in the ideate phase, and low-fidelity prototypes are developed in the prototype phase. These prototypes are evaluated in the test phase.
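Because the assessment instrument described later groups question items by IDEO phase, it can help to keep the phase names in one place when scripting analyses. The sketch below simply lists the phases of the two frameworks as Python constants; this is an illustrative convenience, not something prescribed by IDEO or the d.school.

```python
# Phases of the two design thinking frameworks described above.
# Listed here only as a convenience for scripts that organise instrument items per phase.
IDEO_PHASES = ("inspiration", "ideation", "implementation")
DSCHOOL_PHASES = ("empathize", "define", "ideate", "prototype", "test")

# Example: an empty container of question items keyed by IDEO phase
items_by_phase = {phase: [] for phase in IDEO_PHASES}
print(items_by_phase)
```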

Design thinking methodology has been applied successfully in health and educational contexts. Examples of the implementation of design thinking principles in health include the development of a Nurse Knowledge Exchange system (McCreary 2010) and the development of VAccApp, an evidence-based digital vaccination record that assists in keeping track of immunisations and reminders for booster immunisations (Seeber et al. 2015). Examples of the implementation of design thinking in the higher education context include programmes in design specialisations (Melles, Howard, and Thompson-Whiteside 2012; Orthel 2015); teacher education (Henriksen, Richardson, and Mehta 2017); nursing education (Beaird, Geist, and Lewis 2018); medical education (Niccum et al. 2017; Trowbridge, Chen, and Gregor 2018); and interprofessional education (Cahn et al. 2016; Jiang et al. 2017; Van de Grift and Kroeze 2016).

Assessment of participation in design thinking projects

Hendricks et al. (2018) proposed an assessment framework that uses health assessment tools to assess participation in design thinking in the healthcare context. This framework utilises semi-structured interviews with questions that are specific to each phase of the design thinking process. The questions elicit information about behavioural aspects of engagement. In addition, each stakeholder rates their participation and that of the other stakeholders using a 5-point visual analogue rating scale (see Figure 1). Similar to the continuum used by Rifkin et al. (1988), a rating of one describes narrow participation, while a rating of five describes broad participation.

Framing the assessment of student engagement

To assess how students contribute during learning activities, it is essential to understand the boundaries of the concept of assessment. Assessment can be defined as a "systematic basis for making inferences about the learning and development of students ... the process of defining, selecting, designing, collecting, analysing, interpreting, and using [the] information to increase students' learning and development" (Erwin 1991, 15). It is also important to understand the difference between participation and engagement. Participation can be defined as taking part in or contributing to something; it may be formal (as in a scheduled meeting) or informal (as in a casual discussion), and it may be performed individually or in a group (Barki and Hartwick 1994). Engagement, on the other hand, refers to how involved or interested one is in an experience beyond the mechanisms of contribution. It considers the physical and psychological energy invested and how this influences learning experiences (Axelson and Flick 2010).

To fully understand student engagement, a holistic view that considers the process and the outcome of students' engagement in learning activities is required (Kahu 2013). A holistic view would integrate different domains of engagement in a single assessment. To achieve this, Kahu (2013) proposes using in-depth qualitative methods in addition to quantitative measures. The assessment itself should be framed around several interrelated aspects. These include behavioural, emotional, and cognitive engagements (Sharan and Tan 2008, 41).

Several instruments have been developed to assess student engagement. These include the Survey of Student Engagement (Ahlfeldt, Mehta, and Sellnow 2005), which is based on the National Survey of Student Engagement (NSSE) and was developed to evaluate student engagement in problem-based learning classes across varied levels of study in higher education; the Class-Level Survey of Student Engagement (CLASSE) (Ouimet and Smallwood 2005), which was developed in response to discrepancies in NSSE results and assesses elements of physical contribution, cognitive engagement, and emotional engagement at course level; and the Student Course Engagement Questionnaire (SCEQ) (Handelsman et al. 2005), which was developed to assess student engagement at course level and evaluates aspects of physical participation, emotional engagement, and cognitive engagement. All three instruments were developed at higher education institutions in the United States of America.

Assessment of student engagement in design thinking projects for educational activities

While the instruments mentioned above provide a theoretical foundation for assessing engagement, they are not suitable for assessing engagement in participatory projects such as design thinking, because they do not consider the phasic requirements of such projects, where each phase may demand different physical, emotional, and cognitive inputs. Although the participation assessment framework proposed by Hendricks et al. (2018) is specifically designed for design thinking projects, it is also unsuitable for assessing student engagement, as its primary focus is on elements of participation or behavioural engagement.

As the application of design thinking for pedagogy increases, it will become increasingly imperative to assess students' engagement. Several academic institutions have applied design thinking for learning and research practice; examples include the University of Amsterdam (Van de Grift and Kroeze 2016), the University of Florida (Carmel-Gilfilen and Portillo 2016), Shanghai University (Jiang et al. 2017), and the University of Cape Town (Saidi et al. 2020; Conrad et al. 2019). Apart from the University of California's Public Health Innovations course, which administers an end-of-course student evaluation survey in all the courses offered in the School of Public Health (Sandhu, Hosang, and Madsen 2015), no other assessments for design thinking at course level have been identified. While this end-of-course student evaluation survey consists of several questions that allude to cognitive engagement, it does not assess students' engagement with communities in the context of design thinking.

The absence of an assessment framework for holistically assessing students' engagement, i.e., behavioural, emotional, and cognitive engagement, poses a challenge for the application of design thinking in educational settings as the engagement itself cannot be measured and thus improved. This article describes how we developed an instrument to holistically assess students' engagement in the physical, emotional, and cognitive domains in a course that uses a design thinking approach to generate innovative health solutions.

 

DEVELOPING THE ASSESSMENT INSTRUMENT

Guidelines for survey development for medical education researchers proposed by Gehlbach, Artino Jr, and Durning (2010) were applied. The seven-step design process was initially proposed in response to limited guidance for developing surveys. While the current study complies with these steps in principle, there are several key differences illustrated in Figure 2. These differences are related to the iterative nature of design thinking and other elements such as the educational requirements and timing of access to participants.

The development of the initial instrument was based on previous literature (see Table 1). Question items were then grouped according to the IDEO design thinking method (Brown and Wyatt 2010). The initial question items and a description of the literature informing each item are presented in Table 1.

In addition to items in Table 1, a rating scale was used to assess the perceived level of participation. Participants rated their level of participation, and that of their colleagues, after each phase of the project:

Please complete the scale to rate your participation throughout the [ ] phase

1 - Very Low (I did not participate in any decision-making)

2 - Low (I participated in a few decision-making activities)

3 - Moderate (Decision-making was partly shared; group leader had the final say)

4 - High (I participated in most of the decision-making activities)

5 - Very High (I participated in all the decision-making activities).

Please complete this scale to rate the participation of your group members throughout the [ ] phase.

1 - Very Low (My group members did not participate in any decision-making)

2 - Low (My group members participated in a few decision-making activities)

3 - Moderate (Decision-making power was partly shared; group leader had the final say)

4 - High (My group members participated in most of the decision-making activities)

5 - Very High (We all participated in all the decision-making activities).
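As a hedged illustration of how these two per-phase rating items could be captured and stored electronically, the sketch below encodes them in Python. The anchor labels follow the scale above, but the function name, field names, and validation logic are our own choices for the example and are not part of the published instrument.

```python
# Illustrative encoding of the per-phase participation rating items (not the published instrument)
PARTICIPATION_ANCHORS = {1: "Very Low", 2: "Low", 3: "Moderate", 4: "High", 5: "Very High"}

def record_phase_ratings(phase, self_rating, group_rating):
    """Validate and store the self- and group-participation ratings for one design thinking phase."""
    for rating in (self_rating, group_rating):
        if rating not in PARTICIPATION_ANCHORS:
            raise ValueError("Ratings must be integers from 1 (Very Low) to 5 (Very High)")
    return {
        "phase": phase,
        "self_participation": self_rating,
        "self_label": PARTICIPATION_ANCHORS[self_rating],
        "group_participation": group_rating,
        "group_label": PARTICIPATION_ANCHORS[group_rating],
    }

# Example: ratings collected after the ideation phase
print(record_phase_ratings("ideation", self_rating=4, group_rating=3))
```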

Piloting initial assessment instrument

The initial assessment instrument was piloted with a purposive sample of Master's students registered for the Health Innovation and Design (HID) course, which forms part of two Master's degrees offered by the University of Cape Town - MPhil in Health Innovation and MSc in Biomedical Engineering. The course lecturer and facilitators were also recruited to take part in the study.

The students (n = 6) completed the initial assessment instrument. Students' written reflective reports (n = 5) submitted in partial fulfilment of the requirements of the HID course were analysed to (1) identify the domains of physical, emotional, and cognitive engagement congruent with the question items, and (2) identify gaps in the assessment instrument. Students' understanding of the nature of the question items and the ease with which they answered each item (Babbie 2014, 273) were also evaluated. Thereafter, a focus group discussion (n = 6) was conducted.

Additionally, the course lecturer and facilitators (n = 3), who are considered design thinking practitioners and represent "experts" within Gehlbach et al.'s (2010) survey development guide, validated an iteration of the initial assessment instrument.

Design thinking challenge

The HID course "trains students in the design thinking approach towards devising sustainable solutions to enhance health and well-being in the South African context" (Van der Westhuizen et al. 2020). The design thinking challenge for the 2019 cohort was formulated with the Health Manager at a residence for elderly citizens in Cape Town, South Africa - the challenge provided a real-world, concurrent design experience for students, on which we based the development of our assessment instrument. Sixteen students registered for the course were separated into three groups, each addressing the challenge in its own way. The challenge ran for two months, during which students attended lectures and completed activities following the design thinking methodology. These activities included engagement with various stakeholders outside the university community.

Ethical considerations

The Human Research Ethics Committee of the Faculty of Health Sciences, University of Cape Town, approved this study (HREC REF: 247/2019). In addition, the Department of Student Affairs, University of Cape Town, approved access to students for the research project. Participants were informed of the study's purpose and procedures, the nature of participation, the right to withdraw, confidentiality, and the risks and benefits of participation, and provided electronic informed consent before taking part.

 

REFINING THE ASSESSMENT INSTRUMENT

The data collection tools applied in the present study were used for triangulation, towards the development and validation of the assessment instrument. Triangulation is the use of multiple data sources about a single phenomenon to provide a comprehensive understanding of the phenomenon under examination (Carter et al. 2014). It can be used to cross-reference aspects of a phenomenon that may need to change or improve (Ouimet et al. 2004). In the present study, triangulation was used to cross-reference engagement domains tested in the various data sources, to aid in development and validation of the final assessment instrument.

Initial instrument administration and analysis

Once the initial assessment instrument had been devised, it was administered to students in the HID course. The analysis of the responses to the initial question items sought to determine whether they elicited the information the assessment instrument was intended to capture, as well as the consistency across the series of answers. Content analysis was applied to identify the relevant information relating to the respective domains of engagement. Content analysis is "the study of recorded human communications" (Babbie 2014, 341), including written communication. Of interest was the presence of words, terms, or phrases related to any of the engagement domains being tested. Difficulties identified in the question item responses (i.e., any misunderstandings, and answers relating to a part of an engagement domain that was not being tested in any of the items) were noted for the focus group discussion.
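The paragraph above describes checking responses for words, terms, or phrases related to each engagement domain. Purely as an illustration of that idea, the sketch below flags example terms in a free-text response; the domain term lists are invented for this example and do not replace the manual qualitative coding applied in the study.

```python
# Toy illustration of flagging engagement-domain terms in a free-text response.
# The term lists are invented for this example; in the study, coding was done manually.
DOMAIN_TERMS = {
    "behavioural": ["attended", "interviewed", "built", "presented"],
    "emotional": ["enjoyed", "frustrated", "excited", "bored"],
    "cognitive": ["analysed", "reflected", "understood", "planned"],
}

def flag_domains(response):
    """Return, for each engagement domain, the example terms present in the response."""
    text = response.lower()
    return {domain: [term for term in terms if term in text]
            for domain, terms in DOMAIN_TERMS.items()}

print(flag_domains("I enjoyed the interviews and reflected on what the residents told us."))
```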

Analysis of the written reflective reports

The written reflective reports of the HID course are intended to be a comprehensive evaluation of the students' insights gained regarding the design challenge, the design process, and the final solution. The report consists of a reflective component on the practical learning experience in the course; therefore, it may contain information relating to all aspects of engagement.

Similar to the analysis of the narrative question items, content analysis was applied to the written reports to identify the physical, emotional, and cognitive domains of engagement. Coding, guided by a conceptual framework, was used to process the raw data into a standard format (Babbie 2014, 346). In addition, any relevant information that was evident in the written reports but had not been considered in the initial assessment instrument development was noted as a recommendation for the final iteration of the instrument.

Focus group discussion

Several aspects of the assessment instrument were discussed. These included the instrument headings and the lack of instructions; the timing of administration of the instrument within the design thinking challenge; the specificity of the terms used in the question items; the separation of instrument sections according to the phase of a design thinking project; and participants' conceptualisation of the question items and their perception of the relevance and clarity of the items.

The focus group discussion further allowed participants to elaborate and explain how they approached certain items and whether they faced any difficulties answering the items (Beatty and Willis 2007). Problems noted in the question item responses and unclear information from the written reports were raised for discussion. Data from the focus group discussion were transcribed, and recommendations were heeded to revise the initial assessment instrument. This revision encapsulated an analysis of the initial question item responses, the written reflective reports, and the focus group discussion.

Practitioner validation

Following a revision of the initial assessment instrument, practitioner validation was conducted. To do this, a questionnaire (see Table 2) was devised and administered to the design thinking practitioners. In this questionnaire, the practitioners were requested to rate their level of agreement that the collection of items in each section/phase fully encompassed the domains of engagement that the instrument aimed to assess. Similar to Visser et al. (2020) in their questionnaire development and preliminary evaluation, a 5-point Likert scale was used. To collect explanatory data supporting the practitioners' views, each rated item was followed by a narrative item in which practitioners could suggest and/or motivate any further revisions. A courtesy note defining the aspects of engagement, according to Sharan and Tan (2008, 41), was included in the instructions section of the questionnaire.

To promote methodological rigour in the validation process, and in response to the small sample size of fewer than five practitioners, all practitioners had to agree on the content validity of the assessment instrument (Lynn 1986). None of the design thinking practitioners who participated in this part of the study were involved in designing and developing the initial question items of the assessment instrument.

Following Visser et al. (2020), ratings of 3 to 5 were considered valid, in which case an item would remain in the assessment instrument. In the case of a low rating, i.e., 1 or 2, the research team decided whether to keep, remove, or rephrase the item. Such a decision took into account the narrative explanation provided by the practitioner(s).
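The retention rule described above can be summarised in a few lines of code. The sketch below is an illustrative implementation only: the function name and data layout are ours, while the rule itself (retain items that every practitioner rates 3 to 5; refer items with any rating of 1 or 2 to the research team) follows the description in the text.

```python
# Illustrative implementation of the item-retention rule described above.
def review_items(practitioner_ratings):
    """practitioner_ratings maps an item label to the list of 5-point Likert ratings it received."""
    decisions = {}
    for item, ratings in practitioner_ratings.items():
        if all(3 <= rating <= 5 for rating in ratings):
            decisions[item] = "retain"
        else:
            decisions[item] = "team decision: keep, remove, or rephrase (consider narrative comments)"
    return decisions

# Example with three practitioners rating two items
print(review_items({"item_19": [4, 5, 3], "item_04": [2, 4, 5]}))
```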

Summary of changes made to the initial assessment instrument layout

Although the questions for each phase of the project were already sectioned in the initial instrument, it was recommended that each section should have a description stating explicitly which phase the items relate to: "Something that might help is to put on the headings ... maybe next to it, answer all of these questions [as they relate to] the activities you undertook during [each phase]." (Participant 2).

In addition to the descriptions clarifying the information required for each phase, participants agreed that each phase must have a separate instrument. This would help with the accuracy and detail of the information recall: "Because sometimes you don't even remember what was in that phase." (Participant 1).

These recommendations were incorporated when revising the assessment instrument layout. Each phase of the instrument now features an instruction informing the respondent that items in that section relate only to the respective phase. Additionally, each phase should be a separate assessment instrument, completed in isolation soon after most of the activities for that phase have been completed.

Summary of changes made to the initial question items of the assessment instrument

Several changes were made to the question items following triangulation of the data collection tools. These include:

Replacing words and terms; e.g., the term "role" in question item 3, and in all other items in which it appears, was replaced with "contribution" to avoid ambiguity.

The difficulty was the ambiguity of the word "role". For example, a participant said the following,

"Are you referring to like contribution or are you referring to [a] role within what is [an] engagement or what is expected of me or like which part of that phase was I responsible for?" (Participant 2).

Additionally, another participant said, "I found it hard to answer because we never thought of it as a role ..." (Participant 4).

An example alternative to the ambiguity of the word "role" was to rephrase the question to "... how do you feel you contributed in this phase?" (Participant 4).

Adding words and terms; e.g., in question item 19.

Although there were no difficulties understanding or providing an answer to this item, participants agreed that the term "interactive" is a descriptive synonym for "participatory", and that the term participatory could be added.

Adding entirely new question items; e.g., a Likert scale item on negative emotions experienced and a narrative explanation of the effect of those emotions.

These items were added to each phase of the assessment instrument. The addition resulted from the absence of any item in the initial instrument to test for negative emotions, which were evident in the written reflective reports. Specific negative emotions identified in the reports coincide with the process (e.g., boredom), prospective (e.g., being frightened), and retrospective (e.g., disheartenment) emotions of the task-related and self-related domains of academic emotions (see Pekrun et al. 2002).

Separating questions into multiple items; e.g., item 4, to assess in-class and out-of-class engagement.

There were several concerns with the wording and broadness of item 4. Participants were unsure of the meaning of "participatory methods" and "data collection." In addition, they were uncertain of the context to which the question referred; for example,

"I think if you wanted a more particular answer, let's say about the interviews [you could say] I enjoyed the participatory methods of like in class or during group work or during teamwork ... is it in the environment of when we as a group worked together or is it in the environment when we actually go out and retrieve information or when we work on that specific information?" (Participant 2).

Burch et al. (2015) propose that students can be cognitively engaged "in class" and "out of class". Since students in the HID course interact with one another in class and with community members out of class, it is plausible that they may be emotionally engaged in class and out of class - it is therefore crucial that both these contexts are assessed.

Question item 4 was therefore separated into two items to assess enjoyment in class and out of class.

Removing words and terms; e.g., the term "group leader" was removed from all rating scales as participants were uncertain to whom the term referred. The difficulty was the ambiguity of the term, as it could refer to the course lecturer, the facilitators, or a group leader if the group had elected one.

Additionally, one of the design thinking practitioners cautioned against the use of the term "novel solution" in question item 19, as "... a first iteration of design thinking will probably not yield a fully functional solution." (Practitioner 3). The term "novel" was thus removed in the revised assessment instrument.

None of the question items in the initial assessment instrument were removed.

Final revised assessment instrument

A final revised instrument to assess student engagement in design thinking projects is presented in Table 3. This revision encapsulates the analysis of the initial question item responses, students' written reflective reports, the focus group discussion, and the design thinking practitioner validation.

 

CONCLUSION

The aim of this study was to develop an instrument that holistically assesses student engagement when design thinking is applied for pedagogical purposes. To achieve this, the survey development guide by Gehlbach et al. (2010) was used, wherein several data collection tools - an initial assessment instrument based on previous literature on student engagement and design thinking, students' written reflective reports, a focus group discussion, and practitioner validation - were triangulated to develop and validate the instrument.

Our contribution to student engagement research is an instrument to assess student engagement in design thinking projects for health innovation. The assessment instrument is presented in Table 3. It is theoretically grounded and integrates the concept of design thinking with that of student engagement.

Our assessment instrument builds on the work of Hendricks et al. (2018), which provides a means to assess participation (i.e., physical engagement) in design thinking projects for health innovation. We add emotional and cognitive engagement domains - collectively these domains are congruent with the work of Burch et al. (2015), which provides a means to assess student engagement in and out of the classroom. Our assessment instrument reflects this integration wherein the phases of design thinking, according to the IDEO design thinking approach, are clearly separated and all the domains of student engagement are integrated in each phase. The assessment instrument can elucidate engagement among students in and out of the classroom during design thinking projects and contribute to strategies that aim to improve engagement in the community.

Our assessment instrument is also holistic in that it integrates qualitative and quantitative measures, i.e., narrative question items, closed items, Likert scale items, and rating scales, to obtain a total measure of student engagement. The integration of various types of measures advances research in the assessment of student engagement, as instruments to assess student engagement typically apply quantitative measures. For example, Hopper (2016) recently applied an instrument developed by Ahlfeldt et al. (2005) to assess student engagement in various physiology courses at university level. This instrument, which is based on the NSSE, applies quantitative measures.

Finally, we provide a means to assess all the domains of student engagement in a design thinking project. Our measure of student engagement in design thinking projects for health innovation is a first of its kind. This measure separates the design thinking phases and integrates all the domains of student engagement. The assessment instrument was validated by design thinking practitioners.

The assessment instrument may be used to assess engagement in academic settings as well as non-academic settings when design thinking is applied for health innovation. In both these settings, reflective journals could be integrated and assessed to further identify and understand the phenomenon of (student) engagement.

 

LIMITATIONS

A small sample of the HID course cohort participated in the study. This limited the quantitative analyses and precluded the use of statistics such as Cronbach's alpha, which is commonly used to test internal consistency when developing similar assessment tools. However, the triangulation of data and the practitioner validation strengthen confidence in the output of the study.
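Should larger samples become available, internal consistency could be examined with Cronbach's alpha, computed as alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores) for k items. The sketch below is a minimal, generic implementation with invented toy data; it is not an analysis performed in this study.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items matrix of scale scores."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item across respondents
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Toy example: 5 respondents answering 4 Likert-type items (invented data)
responses = [[4, 5, 4, 4],
             [3, 3, 4, 3],
             [5, 5, 5, 4],
             [2, 3, 2, 3],
             [4, 4, 4, 5]]
print(round(cronbach_alpha(responses), 2))
```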

Future research should investigate the psychometric qualities of the assessment instrument in the present study with larger sample sizes and with multi-site design thinking challenges.

 

REFERENCES

Ahlfeldt, Stephanie, Sudhir Mehta, and Timothy Sellnow. 2005. "Measurement and Analysis of Student Engagement in University Classes Where Varying Levels of PBL Methods of Instruction Are in Use." Higher Education Research & Development 24(1): 5-20. https://doi.org/10.1080/0729436052000318541.

Arnstein, Sherry R. 1969. "A Ladder of Citizen Participation." Journal of the American Planning Association 35(4): 216-224. https://doi.org/10.1080/01944366908977225.

Axelson, Rick D. and Arend Flick. 2010. "Defining Student Engagement." Change: The Magazine of Higher Learning 43(1): 38-43. https://doi.org/10.1080/00091383.2011.533096.

Babbie, Earl. 2014. The Basics of Social Research. 6th Edition. Wadsworth: Cengage Learning.

Barki, Henri and Jon Hartwick. 1994. "Measuring User Participation, User Involvement, and User Attitude." MIS Quarterly 18(1): 59-82. https://doi.org/10.2307/249610.

Beaird, Genevieve, Melissa Geist, and Erica J. Lewis. 2018. "Design Thinking: Opportunities for Application in Nursing Education." Nurse Education Today 64: 115-118. https://doi.org/10.1016/j.nedt.2018.02.007.

Beatty, Paul C. and Gordon B. Willis. 2007. "Research Synthesis: The Practice of Cognitive Interviewing." Public Opinion Quarterly 71(2): 287-311. https://doi.org/10.1093/poq/nfm006.

Brown, Tim and Jocelyn Wyatt. 2010. "Design Thinking for Social Innovation." Stanford Social Innovation Review 8(1): 30-35.

Burch, Gerald F., Nathan A. Heller, Jana J. Burch, Rusty Freed, and Steve A. Steed. 2015. "Student Engagement: Developing a Conceptual Framework and Survey Instrument." Journal of Education for Business 90(4): 224-229. https://doi.org/10.1080/08832323.2015.1019821.

Cahn, Peter S., Andrew Bzowyckyj, Lauren Collins, Alan Dow, Kristen Goodell, Alex F. Johnson, David Klocko, et al. 2016. "A Design Thinking Approach to Evaluating Interprofessional Education." Journal of Interprofessional Care 30(3): 378-380. https://doi.org/10.3109/13561820.2015.1122582.

Carmel-Gilfilen, Candy and Margaret Portillo. 2016. "Designing with Empathy: Humanizing Narratives for Inspired Healthcare Experiences." Health Environments Research and Design Journal 9(2): 130-146. https://doi.org/10.1177/1937586715592633.

Carter, Nancy, Denise Bryant-Lukosius, Alba Dicenso, Jennifer Blythe, and Alan J. Neville. 2014. "The Use of Triangulation in Qualitative Research." Oncology Nursing Forum 41(5): 545-547. https://doi.org/10.1188/14.ONF.545-547.

Conrad, Nailah, Tinashe E. M. Mutsvangwa, Anastasia Doyle, Trust Saidi, and Tania S. Douglas. 2019. "User-Centred Design in a Health Innovation Course to Address Hearing Loss in the Elderly." In Biomedical Engineering for Africa, ed. Tania S. Douglas, 65-75. Cape Town: University of Cape Town Libraries. https://doi.org/10.15641/0-7992-2544-0.

Doorley, Scott, Sarah Holcomb, Perry Klebahn, Kathryn Segovia, and Jeremy Utley. 2018. "Design Thinking Bootleg." https://dschool.stanford.edu/resources/design-thinking-bootleg. (Accessed 7 October 2022).

Draper, Alizon K., Gillian Hewitt, and Susan Rifkin. 2010. "Chasing the Dragon: Developing Indicators for the Assessment of Community Participation in Health Programmes." Social Science and Medicine 71(6): 1102-1109. https://doi.org/10.1016/j.socscimed.2010.05.016.

Erwin, T. D. 1991. Assessing Student Learning and Development: A Guide to the Principles, Goals, and Methods of Determining College Outcomes. San Francisco: Jossey-Bass Inc.

Gehlbach, Hunter, Anthony R. Artino Jr., and Steven J. Durning. 2010. "AM Last Page: Survey Development Guidance for Medical Education Researchers." Academic Medicine 85(5): 925. https://doi.org/10.1097/ACM.0b013e3181dd3e88.

Goodman, Melody S., Vetta L. Sanders Thompson, Cassandra A. Johnson, Renee Gennarelli, Bettina F. Drake, Pravleen Bajwa, Maranda Witherspoon, and Deborah Bowen. 2017. "Evaluating Community Engagement in Research: Quantitative Measure Development." Journal of Community Psychology 40(1): 17-32. https://doi.org/10.1002/jcop.21828.

Greene, Barbara A. and Raymond B. Miller. 1996. "Influences on Achievement: Goals, Perceived Ability, and Cognitive Engagement." Contemporary Educational Psychology 21(2): 181-192. https://doi.org/10.1006/ceps.1996.0015.

Hake, Richard R. 1998. "Interactive-Engagement versus Traditional Methods: A Six-Thousand-Student Survey of Mechanics Test Data for Introductory Physics Courses." American Journal of Physics 66(1): 64-74. https://doi.org/10.1119/1.18809.

Handelsman, Mitchell M., William L. Briggs, Nora Sullivan, and Annette Towler. 2005. "A Measure of College Student Course Engagement." Journal of Educational Research 98(3): 184-192. https://doi.org/10.3200/JOER.98.3.184-192.

Hendricks, Sharief, Nailah Conrad, Tania S. Douglas, and Tinashe Mutsvangwa. 2018. "A Modified Stakeholder Participation Assessment Framework for Design Thinking in Health Innovation." Healthcare 6(3): 191-196. https://doi.org/10.1016/j.hjdsi.2018.06.003.

Henriksen, Danah, Carmen Richardson, and Rohit Mehta. 2017. "Design Thinking: A Creative Approach to Educational Problems of Practice." Thinking Skills and Creativity 26: 140-153. https://doi.org/10.1016/j.tsc.2017.10.001.

Hopper, Mari K. 2016. "Assessment and Comparison of Student Engagement in a Variety of Physiology Courses." Advances in Physiology Education 40(1): 70-78. https://doi.org/10.1152/advan.00129.2015.

Jiang, Jiehui, Tingwei Liu, Yuting Zhang, Yu Song, Mi Zhou, Xiaosong Zheng, and Zhuangzhi Yan. 2017. "Design and Development of an Intelligent Nursing Bed - A Pilot Project of 'Joint Assignment'." In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Jeju, South Korea, July 11-15, 2017. https://doi.org/10.1109/EMBC.2017.8036757.

Kahu, Ella R. 2013. "Framing Student Engagement in Higher Education." Studies in Higher Education 38(5): 758-773. https://doi.org/10.1080/03075079.2011.598505.

Lynn, Mary R. 1986. "Determination and Quantification of Content Validity." Nursing Research 35(6): 382-385. https://doi.org/10.1097/00006199-198611000-00017.

McCreary, Lew. 2010. "Kaiser Permanente's Innovation on the Front Lines." https://hbr.org/2010/09/kaiser-permanentes-innovation-on-the-front-lines. (Accessed 7 October 2022).

McLaughlin, Jacqueline E., Michael D. Wolcott, Devin Hubbard, Kelly Umstead, and Traci R. Rider. 2019. "A Qualitative Review of the Design Thinking Framework in Health Professions Education." BMC Medical Education 19(98). https://doi.org/10.1186/s12909-019-1528-8.

Melles, Gavin, Zaana Howard, and Scott Thompson-Whiteside. 2012. "Teaching Design Thinking: Expanding Horizons in Design Education." Procedia - Social and Behavioral Sciences 31: 162-166. https://doi.org/10.1016/j.sbspro.2011.12.035.

Niccum, Blake A., Arnab Sarker, Stephen J. Wolf, and Matthew J. Trowbridge. 2017. "Innovation and Entrepreneurship Programs in US Medical Education: A Landscape Review and Thematic Analysis." Medical Education Online 22(1). https://doi.org/10.1080/10872981.2017.1360722.

Orthel, Bryan D. 2015. "Implications of Design Thinking for Teaching, Learning, and Inquiry." Journal of Interior Design 40(3): 1-20. https://doi.org/10.1111/joid.12046.

Ouimet, Judith A. and Robert A. Smallwood. 2005. "Assessment Measures: CLASSE - The Class-Level Survey of Student Engagement." Assessment Update 17(6): 13-15.

Ouimet, Judith A., Joanne C. Bunnage, Robert M. Carini, George D. Kuh, and John Kennedy. 2004. "Using Focus Groups, Expert Advice, and Cognitive Interviews to Establish the Validity of a College Student Survey." Research in Higher Education 45(3): 233-250. https://doi.org/10.1023/B:RIHE.0000019588.05470.78.

Pekrun, Reinhard, Thomas Goetz, Wolfram Titz, and Raymond P. Perry. 2002. "Academic Emotions in Students' Self-Regulated Learning and Achievement: A Program of Qualitative and Quantitative Research." Educational Psychologist 37(2): 91-105. https://doi.org/10.1207/S15326985EP3702.

Pekrun, Reinhard. 2006. "The Control-Value Theory of Achievement Emotions: Assumptions, Corollaries, and Implications for Educational Research and Practice." Educational Psychology Review 18(4): 315-341. https://doi.org/10.1007/s10648-006-9029-9.

Rifkin, Susan B., Frits Muller, and Wolfgang Bichmann. 1988. "Primary Health Care: On Measuring Participation." Social Science and Medicine 26(9): 931-940. https://doi.org/10.1016/0277-9536(88)90413-3.

Roberts, Jess P., Thomas R. Fisher, Matthew J. Trowbridge, and Christine Bent. 2016. "A Design Thinking Framework for Healthcare Management and Innovation." Healthcare 4(1): 11-14. https://doi.org/10.1016/j.hjdsi.2015.12.002.

Saidi, Trust, Donné van der Westhuizen, Nailah Conrad, Tinashe Mutsvangwa, and Tania S. Douglas. 2020. "Learning by Solving as a Pedagogical Approach to Inclusive Health Innovation." Development Southern Africa 37(3): 418-431. https://doi.org/10.1080/0376835X.2019.1640662.

Sandhu, Jaspal S., Robert Hosang, and Kristine A. Madsen. 2015. "Solutions That Stick: Activating Cross-disciplinary Collaboration in a Graduate-level Public Health Innovations Course at the University of California, Berkeley." American Journal of Public Health 105(S1): S73-77. https://doi.org/10.2105/AJPH.2014.302395.

Seeber, Lea, Bettina Michl, Gabriella Rundblad, Brett Trusko, Maxim Schnjakin, Christoph Meinel, Ulrich Weinberg, Gerhard Gaedicke, and Barbara Rath. 2015. "A Design Thinking Approach to Effective Vaccine Safety Communication." Current Drug Safety 10(1): 31-40. https://doi.org/10.2174/157488631001150407105400.

Sharan, Shlomo and Ivy G. C. Tan. 2008. "Student Engagement in Learning." In Organizing Schools for Productive Learning, 41-45. Dordrecht: Springer. https://doi.org/10.1007/978-1-4020-8395-2_3.

Smiley, Whitney and Robin Anderson. 2011. "Measuring Students' Cognitive Engagement on Assessment Tests: A Confirmatory Factor Analysis of the Short Form of the Cognitive Engagement Scale." Research & Practice in Assessment 6: 17-28.

Trowbridge, Matthew, David Chen, and Alex Gregor. 2018. "Teaching Design Thinking to Medical Students." Medical Education 52(11): 1199-1200. https://doi.org/10.1111/medu.13699.

Van de Grift, Tim C. and Renske Kroeze. 2016. "Design Thinking as a Tool for Interdisciplinary Education in Health Care." Academic Medicine 91(9): 1234-1238. https://doi.org/10.1097/ACM.0000000000001195.

Van der Westhuizen, Donné, Nailah Conrad, Tania S. Douglas, and Tinashe Mutsvangwa. 2020. "Engaging Communities on Health Innovation: Experiences in Implementing Design Thinking." International Quarterly of Community Health Education 41(1): 101-114. https://doi.org/10.1177/0272684X19900880.

Visser, Katharina M., Kirsten van Gink, Floor Thissen, Tessa A. Visser, Tormod Rimehaug, Lucres C. M. Jansen, and Arne Popma. 2020. "Development and Preliminary Evaluation of the Reaction to Unacceptable Behavior Inventory: A Questionnaire to Measure Progress in Implementation of Non-Violent Resistance." Child & Youth Care Forum 49: 43-57. https://doi.org/10.1007/s10566-019-09517-5.
