Workshops and Masterclass


Workshops and Masterclass Overview

Please note: Program times shown are Christchurch, Canterbury Region, New Zealand (NZST). The program overview below is preliminary and subject to change.

Please note, attendance at Workshops and Masterclasses incurs an additional cost on top of your registration fee:

  • Workshops - $120.00 per person, per workshop
  • Masterclass - $255.00 per person, per masterclass 

If you have already registered for the conference, you can log back in to your registration to add a Workshop or Masterclass by clicking the link in your registration confirmation email. Alternatively, please email the Conference Manager via ANZAHPE2026@eventstudio.com.au and advise which Workshops/Masterclasses you would like to add to your registration.

Please note, capacity is limited for each Workshop and Masterclass, so we encourage you to book now to avoid disappointment!


Workshops


Workshop 1: Being the Human in the Loop: Testing and Evaluating AI-Based Tools in Health Professions Education

Monday 29 June 2026
Location: Bealey B2, Te Pae Christchurch Convention Centre
09:00 - 10:30
Presenters: Prof Adrienne Torda and A/Prof Betty Chan
UNSW

Introduction/Background

Artificial intelligence (AI) is rapidly transforming health professions education, offering tools that personalise learning, automate feedback and support assessment. The concept of “being the human in the loop” underscores educators’ essential role in shaping, monitoring and interpreting AI outputs to ensure technology serves academic and educational goals, not the reverse. Even this role, however, is not without controversy. This interactive pre-conference workshop will equip educators and researchers with practical frameworks for exploring, testing and evaluating AI-based tools in their own programs and institutions. Drawing on examples, participants will learn structured approaches to comparing AI tools across parameters such as purpose, data sources, transparency, learner feedback, academic feedback, utility and integrations, time effectiveness and curriculum alignment. They will learn to apply scholarly approaches to using and testing AI tools in their educational practice.

Methods / Outline of workshop activities

This workshop will combine demonstration, collaborative analysis and design-thinking activities:
1. Framing presentation: overview of the “human in the loop” concept, current examples of AI in education, and approaches to evaluation.

2. Matrix creation: Explore and define parameters that are important to evaluate.

3. Group exploration: participants review AI tool comparison matrices to identify educational benefits, risks and limitations.

4. Design challenge: each group creates a short plan for an “AI testing mini-project,” defining the educational problem, evaluation method, and ethical or governance considerations.

5. Peer feedback and reflection: participants refine their designs and identify next steps for implementation or research.

Discussion

By the end of the workshop, participants will be able to:

1. Explain the “human in the loop” concept and its relevance to responsible AI use.

2. Identify the parameters they want to test and consider how best to approach testing them.

3. Apply structured frameworks to compare and evaluate AI tools.

4. Design a feasible, ethical AI testing project for their local context.

5. Identify institutional supports and collaborations for AI-enabled education.

References

1. Tolsgaard MG, Pusic MV, Sebok-Syer SS, Gin B, Svendsen MB, Syer MD, Brydges R, Cuddy MM, Boscardin CK. The fundamentals of Artificial Intelligence in medical education research: AMEE Guide No. 156. Med Teach. 2023 Jun; 45(6):565-573. doi: 10.1080/0142159X.2023.2180340

2. Ahsan, Z. Integrating artificial intelligence into medical education: a narrative systematic review of current applications, challenges, and future directions. BMC Med Educ 25, 1187 (2025). https://doi.org/10.1186/s12909-025-07744-0


Workshop 2: Strengthening Connections between Curriculum Developers through Critical Team Reflection

Monday 29 June 2026
Location: Bealey B3, Te Pae Christchurch Convention Centre
09:00 - 10:30
Presenters: Dr Julia Paxino, Ms Grecia Alaniz, Dr Victoria Boyd, A/Prof Stella Ng, Prof Joanne Bolton, Ms Carolyn Cracknell
University of Melbourne and University of Toronto

Introduction/Background

Interprofessional education (IPE) is essential for preparing healthcare students for collaborative practice. It is especially important in mental health where care commonly involves diverse professionals, complex psychosocial needs and a high risk of stigma and power imbalances. Much attention has focused on learning outcomes and instructional strategies, but less on how interprofessional educators collaborate to design and implement IPE. This is particularly consequential in the mental health context, where unexamined assumptions within teaching teams can inadvertently reinforce power imbalances, stigma, and unsafe practices. This workshop applies critical team reflection as a framework to support IPE curriculum developers and teaching teams to develop routines for collective reflection so their work models, rather than undermines, reflective practice and collaboration.

Methods

This workshop introduces critical team reflection as a framework to guide curriculum design in IPE. This framework integrates concepts from critical reflection, critical dialogue and team reflexivity to help curriculum teams surface assumptions, address hierarchies and improve safety in IPE. Through short presentations, case-based small-group work and a guided strategy activity, participants will practise routines for collaboratively surfacing assumptions, addressing hierarchies and embedding reflective habits within curriculum design, implementation and evaluation.

Results/Evaluation

By the end of the workshop, participants will be able to:

1. Explain the concept and benefits of critical team reflection in IPE curriculum development.

2. Identify key practices for enacting critical team reflection within interprofessional curriculum teams.

3. Implement strategies to promote critical team reflection in interprofessional curriculum development teams.

Discussion

The learnings generated in this workshop will serve as a foundation for curriculum development teams to engage in critical team reflection. This will help teams improve the design and implementation of complex, multi-stakeholder curricula. In turn, this reflective practice reduces stigma and bias, enhances psychological safety, and supports educators to model collaborative practice for learners.

References

Boyd, V. A., Woods, N. N., Kumagai, A. K., Kawamura, A. A., Orsino, A., & Ng, S. L. (2022). Examining the impact of dialogic learning on critically reflective practice. Academic Medicine, 97(11S), S71-S79.

Schmutz, J. B., & Eppich, W. J. (2017). Promoting learning and patient care through shared reflection: a conceptual framework for team reflexivity in health care. Academic Medicine, 92(11), 1555-1563.

Outline of workshop activities

1. Welcome and Introduction (10mins)

  • Overview of objectives and relevance to healthcare education.
  • Define critical team reflection and its role in curriculum development.

2. Whole-group discussion (10mins)

  • Prompt: “What could critical team reflection look like in practice?”

3. Overview of critical team reflection (20mins)

  • Key concepts and summary of evidence: Critical Reflection, Critical dialogue and Team Reflexivity.

4. Small-group work (20mins)

  • Scenario analysis and discussion
  • Case: Interprofessional educational team developing a mental health simulation scenario.
  • Small groups apply critical team reflection to surface assumptions and identify risks.

5. Strategy Development (20mins)

  • Guided activity to design actionable critical team reflection strategies to address barriers (e.g., biases, hierarchies and time limitations).
  • Groups present one strategy to the larger audience.

6. Reflections (10mins)

  • Individual/partner commitment to one critical team reflection action; signpost resources


Workshop 3: From Principles to Practice: Implementing Universal Design for Learning in Nursing Education

Monday 29 June 2026
Location: Bealey B4, Te Pae Christchurch Convention Centre
09:00 - 10:30
Presenter: Mrs Rebecca Caulfield
Murdoch University

Introduction/Background

Undergraduate nursing cohorts are increasingly diverse in how students learn, engage, and demonstrate capability. While Universal Design for Learning (UDL) is often discussed as a theoretical ideal, educators frequently report uncertainty about how to translate UDL principles into practical, scalable teaching strategies. This session explores how UDL can be intentionally integrated into an undergraduate nursing program to embrace and position learner diversity not as a problem to be managed, but as a resource for learning.

Methods

Drawing on UDL principles, a series of interactive learning strategies were co-designed and embedded across a clinical nursing unit. These strategies focused on multiple means of engagement, representation, and action, while explicitly foregrounding students’ existing strengths, experiences, and ways of knowing. This session mirrors this design approach, engaging participants in interactive and reflective tasks, live decision-making activities, and collaborative problem-solving to surface assumptions about “the average learner” and explore alternative design choices in real time.

Results/Evaluation

Preliminary evaluation data from student feedback, learning artefacts, and educator reflections suggest increased learner engagement, improved confidence in clinical reasoning, and greater accessibility for neurodivergent and non-traditional students. Educators reported enhanced clarity in aligning learning outcomes, assessment, and teaching strategies, alongside reduced reliance on reactive accommodations.

Discussion

This session argues that integrating UDL through a strengths-based lens shifts curriculum design from compliance to care. Participants will leave with practical design principles, transferable activities, and reflective prompts to support immediate implementation within their own teaching contexts, while critically examining how power, norms, and assumptions shape nursing education.

Outline of workshop activities

The 90-minute session will begin with a concise framing of Universal Design for Learning, outlining core principles and clarifying the focus on practical integration within health education (15 minutes). The first major interactive block will be a UDL Retrofit Sprint, where participants work in small groups to redesign a familiar lecture, tutorial, laboratory session, or assessment using UDL concepts. Emphasis will be placed on depth rather than breadth, allowing participants time to discuss design decisions, anticipate barriers, and consider feasibility within institutional constraints (30 minutes).

The session will then transition to a facilitated exploration of Lived-Experience Vignettes, using short, authentic learner scenarios to prompt discussion about accessibility, engagement, and representation. Participants will reflect on how curriculum design choices can unintentionally create barriers, and how UDL-informed strategies can mitigate these while maintaining academic standards (15 minutes).

The final interactive segment, Implementation Planning, supports participants to consolidate learning into a focused, achievable plan. Participants will identify one concrete UDL strategy to implement immediately and one longer-term design goal, with guided prompts to support practice translation (15 minutes). The session will conclude with a brief synthesis and collective reflection, reinforcing key takeaways and opportunities for continued application beyond the workshop (5 minutes).


Workshop 4: Stepping into Health Professions Education Research Supervision

Monday 29 June 2026
Location: Bealey B5, Te Pae Christchurch Convention Centre
09:00 - 10:30
Presenters: Dr Louise Allen, Dr Jess Lees, A/Prof Kelley Graydon, A/Prof Clare McNally
The University of Melbourne and Monash University

Introduction/Background

Research capstone, honours, coursework masters, and higher degrees in health professions education (HPE) play an important role in developing HPE researchers and educators to conduct quality research with the aim of advancing practice. Effective supervision is vital for quality HPE research, yet most training is not tailored to the field. Early-career and novice supervisors can therefore struggle to know how to begin. This workshop addresses this gap with HPE-specific guidance.

Methods

Introduction (5 minutes)

  • Acknowledgement of Country, introduction to presenting team, learning objectives.

Poll and Reflection (15 minutes)

  • A poll to get an overview of who is in the room, and a Padlet to capture why participants have come along, what they want to know, and what they want to get out of the workshop.

Presentation (10 minutes)

  • Overview of what is unique about HPE supervision, building up supervision experience, attracting students, networking, and promoting yourself as an HPE research supervisor.

Series of Small Group Activities (60 minutes)

  • A small group discussion (value and challenges of HPE research supervision), Pair and Share (writing and feedback on an HPE project), and co-developing a supervision plan in small groups.

Results/Evaluation

This workshop aims to equip early-career and novice supervisors with the foundational knowledge, practical tools and initial networks needed to confidently design, promote and begin supervising HPE research projects. The session is designed as a practical, interactive introduction to supervising HPE research, in which participants can learn what makes HPE supervision unique, craft student-facing project blurbs, and build a short-term supervision plan.

Discussion

Participants will leave the workshop with resources, and a tangible plan for how they can best place themselves to engage in HPE research supervision. The information gathered will be used to design a follow up workshop on issues of importance to the attendees.


Workshop 5: Co-creating institutional resilience: practical tools across diverse health professions education contexts

Monday 29 June 2026
Location: Dobson 1, Te Pae Christchurch Convention Centre
09:00 - 10:30
Presenter: Prof Sonia Ijaz Haider
Dow University of Health Sciences

Introduction/Background

Resilience in health professions education is frequently framed as an individual capability, emphasising personal coping skills and self-care strategies. While these approaches may support individuals, growing scholarship highlights the central role of institutional culture, leadership practices, and organisational structures in shaping learner and faculty well-being. Across diverse educational contexts, educators report shared challenges including workload pressures, misalignment between stated values and everyday practices, and limited guidance on how to embed well-being within routine educational systems. Conceptualising resilience as an institutional property rather than an individual responsibility offers a way to integrate well-being and flourishing into everyday educational practice. This pre-conference workshop responds to the need for applied, educator-facing approaches that support participants to examine their own institutional contexts and co-create practical, context-sensitive tools for institutional resilience.

Methods

This workshop is designed as an interactive, participatory learning experience. Brief facilitator inputs introduce key concepts related to institutional resilience, followed by structured small-group activities to promote shared sense-making and contextual application. Participants engage with short case snapshots drawn from contrasting health professions education settings, guided institutional mapping exercises, and collaborative design tasks focused on identifying feasible institutional actions. The workshop prioritises peer learning and practical application rather than didactic instruction.

Results/Evaluation

As this is an educational innovation, evaluation will focus on participant engagement and perceived utility of the tools introduced. Indicators of success include participants’ ability to articulate distinctions between individual and institutional resilience, identify institutional enablers and constraints within their own contexts, and co-create at least one actionable institutional strategy. Participant reflections and action commitments will be used to assess immediate learning outcomes.

Discussion

This workshop supports participants to reframe well-being as a shared institutional responsibility rather than an individual burden. By applying an institutional resilience lens, participants are equipped with practical tools to inform local educational development initiatives, leadership conversations, and ongoing efforts to create sustainable learning cultures in health professions education.

References

Cleland J, Johnston P, Watson V. Reframing wellbeing in medical education: Moving beyond individual resilience. Medical Education. 2023.

van Vendeloo SN, Brand PLP, Verheyen CCPM. Flourishing learning environments in health professions education: A systems perspective. Advances in Health Sciences Education. 2024.

Outline of workshop activities

This 90-minute pre-conference workshop balances concise facilitator input with extended participant engagement. The session begins with an interactive introduction that establishes shared language around institutional resilience and invites participants to reflect on how resilience is currently conceptualised in their own settings. Participants then work in small groups with brief case snapshots from contrasting health professions education contexts to prompt discussion about institutional practices, culture, and structures that influence well-being.

Building on this discussion, participants engage in a guided institutional mapping exercise to identify local enablers and drains of resilience within their own educational environments. Groups then collaborate to design one realistic, context-specific institutional action that could be implemented within existing constraints. The workshop concludes with a whole-group synthesis of key insights and an individual action commitment, ensuring participants leave with a practical framework and a clear next step for application in their own settings.


Workshop 6: Using Large Language Models to Support Qualitative Data Analysis

Monday 29 June 2026
Location: Dobson 2, Te Pae Christchurch Convention Centre
09:00 - 10:30
Presenter: Dr Bruce Lister
The University of Queensland

Introduction/Background

Large language models (LLMs) are increasingly used in health professions education research, including qualitative analysis. Their rapid uptake has outpaced methodological guidance, raising concerns about epistemological misalignment, loss of reflexivity, and ethical governance. In qualitative methodologies such as reflexive thematic analysis, meaning is actively generated by the researcher rather than discovered by tools. This workshop addresses how LLMs can be used in support of, rather than in place of, reflexive analytic work.

Aims

This workshop aims to equip participants with practical, ethically robust, and methodologically defensible approaches to integrating LLMs into reflexive thematic analysis.

Learning Outcomes

By the end of the workshop, participants will be able to:

1. Distinguish appropriate from inappropriate uses of LLMs in reflexive thematic analysis

2. Apply LLMs to support theoretically guided approaches to qualitative analysis

3. Maintain analytic ownership and reflexivity when using AI tools

4. Identify ethical, governance, and transparency considerations in AI-assisted qualitative research

5. Articulate defensible AI use in methods sections and ethics applications

Workshop Structure

· Introduction and framing (10 min): Epistemological alignment of reflexive TA and AI-assisted analysis

· Conceptual foundations (15 min): What LLMs can and cannot do in qualitative research

· Demonstration (15 min): Contrasting problematic vs defensible AI-assisted workflows

· Hands-on small-group work (25 min): Participants apply AI-assisted prompts to their own anonymised data

· Ethics and governance discussion (15 min): Transparency, institutional policies, and publication expectations

· Integration and reflection (10 min): Consolidation and take-home frameworks

Target Audience

Health professions educators, qualitative researchers, doctoral candidates, and education leaders with an interest in qualitative or mixed-methods research.


Workshop 7: Designing gamified learning activities using online tools

Monday 29 June 2026
Location: Bealey B2, Te Pae Christchurch Convention Centre
11:00 - 12:30
Presenter: Dr J Douglas Miles
University of Hawai‘i John A. Burns School of Medicine

Introduction/Background

Gamification is increasingly being adopted in health professions education as a strategy to enhance learner engagement, motivation, and active participation. However, the complexity of aligning specific game mechanics with learning objectives and educational context requires purposeful design. This workshop will explore how to use available online platforms to align game mechanics with pedagogical purpose.

Methods

This workshop adopts an experiential, design-based approach. Participants will first engage as learners with a small selection of online tools that can be used to build games for learners. These platforms will be used to highlight a variety of pedagogical options, such as individual versus team-based interaction, recall versus application, and rapid versus delayed feedback. Participants will then work in small groups to design a brief gamified learning activity aligned with a defined educational objective and learner context. The facilitator will guide discussion and support reflection on design choices throughout the session.

Results/Evaluation

This skills-focused workshop aims to increase participants’ confidence in designing purposeful gamified learning activities, support critical appraisal of gamification tools beyond surface engagement, and enable participants to adapt gamification strategies to their own educational settings.

Discussion

This workshop emphasizes transferable design principles rather than platform-specific solutions. By focusing on alignment between learning objectives, game mechanics, and context, the workshop is intended to help educators make intentional decisions about when and how gamification may add educational value. Participants will be encouraged to reflect on how gamified approaches can be adapted across disciplines, learner levels, and resource environments.


Workshop 8: Mitigating biases in performance-based assessments

Monday 29 June 2026
Location: Bealey B3, Te Pae Christchurch Convention Centre
11:00 - 12:30
Presenters: Dr Elizabeth Kachur, Nobutaro Ban and Francesco Bolstad
Medical Education Development, Global Consulting, Aichi Medical University and Nara Medical University

Introduction/Background

Assessments are critical elements of every training endeavour. Their purpose can range from formative (for learning) to summative (of learning). Of course, we always aim for fair and accurate assessments of knowledge, skills and attitudes/values. But every performance evaluation requires observation and judgements which can easily fall victim to stereotyping and biases. Strategies for identifying and managing them will be the focus of this pre-conference workshop.

Biases are part of the human condition; everybody is wired for them. They help us sort and make meaning of the many stimuli that constantly flood our senses. While such psychological phenomena can be understood, they interfere with key assessment principles: validity and reliability. Thus, educators must control such noxious influences as best as possible, from test development (e.g., OSCE stations) to training those who complete performance evaluations (e.g., rater training).

Methods

This interactive workshop will include a variety of activities to help participants identify bias risks and develop strategies for mitigating their influence (including AI tools). Case scenarios will offer opportunities for analysis, reflection and discussion. The handout will feature references, worksheets and opportunities to record new ideas and take-home-points.

Results/ Evaluation/ Aims

At the end of the workshop participants should be able to:

1) List 3 types of biases that are common in performance-based assessments

2) Identify bias risks when evaluating assessment tools (e.g., OSCE stations, Mini-CEX patient assignments)

3) Strategize interventions to mitigate the influence of biases on performance-based assessments

Discussion

Biases are part of the human condition; Benson and Manoogian identified 188 cognitive biases in their “Bias Wheel.” Nevertheless, biases can derail our efforts to fairly and accurately identify learning successes and learning failures. While they cannot be eliminated completely, educators have a responsibility to identify bias risks and implement strategies that can help mitigate them.

References

Kachur E, Harter T. Wired for stereotyping and biases - Is there a professional way out of prejudicial behaviors and discrimination? J Commun Healthc. 2024 Dec;17(4):355-359. doi: 10.1080/17538068.2024.2427451. Epub 2024 Nov 14. PMID: 39540652.

Wood TJ. Exploring the role of first impressions in rater-based assessments. Adv Health Sci Educ Theory Pract. 2014 Aug;19(3):409-27. doi: 10.1007/s10459-013-9453-9. Epub 2013 Mar 26. PMID: 23529821.

Outline of workshop activities

5 min - Welcome, introductions, orientation (word clouds)

15 min - Key biases in performance-based assessments (brief presentation, polling)

15 min - Examples of bias interference with assessment accuracy (think-pair-share)

10 min - Bias identification and mitigation strategies (brief presentation to include rater selection/training and potential AI tools)

20 min - Case-based intervention strategies (small group discussion of case examples, worksheet to record potential solutions)

15 min - Reporting out of small group discussion results

10 min - Summary and take-home-points


Workshop 9: Flexible Curriculum Design: Enabling Diversity, Inclusion, and Student Success

Monday 29 June 2026
Location: Bealey B4, Te Pae Christchurch Convention Centre
11:00 - 12:30
Presenter: Dr Jennifer Shone
University of Sydney

Introduction/Background

Health professional education programs are increasingly expected to offer flexible course design to meet evolving student needs, societal expectations, and accreditation requirements. Flexibility is particularly important for supporting students with caring responsibilities, disability, chronic illness, financial pressures, or other competing commitments. Balancing these needs with the demands of pre-clinical and clinical learning, while ensuring required competencies, remains a significant challenge. Drawing on survey findings from a Medical Deans Special Interest Group, this workshop explores contemporary approaches to flexibility in Australian medical programs, including attendance and leave policies, flexible timetabling, asynchronous learning, recognition of prior learning, and part-time pathways.

Methods

The workshop will provide an overview of the key drivers shaping flexible course design across Australia and New Zealand. Presenters will share case examples from Australian Medical Schools highlighting both facilitators and barriers encountered when implementing flexible course design.

In small groups, participants will discuss areas of need within their own programs and consider how these could be addressed through intentional, evidence-informed design. The session will then move to a World Café, with groups rotating through themed tables on flexible course design. Participants will explore the benefits, limitations, and relevance of these approaches in their own contexts, building on shared ideas through successive rounds of dialogue.

Aims

· To explore current approaches to flexible course design in Health Professions Education

· To identify flexible design strategies that can be adapted to participants’ local contexts

· To create opportunities for collaboration with colleagues nationally to support the implementation of flexible course design.

Discussion

Participants who attend this workshop will gain knowledge and practical strategies for flexible course design which can be implemented in their own institutions. Sharing of current innovative ideas will drive discussion and action to improve student experience and wellbeing, and support policy advocacy. Participants will have the opportunity to develop their professional networks to provide ongoing support and sharing of ideas following the conference.

References

Medical Deans Australia and New Zealand (2026) Flexible Medical Education Report. Medical Education Collaborative Committee (Unpublished), Sydney, Australia

Barrett A, Woodward-Kron R, Cheshire L (2022) Flexibility in primary medical programs: A scoping review. Focus on Health Professional Education. 23(4):16-34

Outline of workshop activities

The workshop will commence with an initial short plenary, outlining key themes uncovered by Medical Deans Australia and New Zealand’s Flexible Medical Education working group. This will be followed by hands-on small-group activities in which participants will discuss challenges within their own programs and consider how these could be addressed through intentional, evidence-informed design. The session will then move to a World Café, with groups rotating through themed tables on flexible course design:

· Part-time course design

· Flexible attendance processes

· Condensed curriculum delivery with timetabled “private time”

· Flexible delivery for specific cohorts of students

Participants will explore the benefits, limitations, and relevance of these approaches in their own contexts, building on shared ideas through successive rounds of dialogue. World Café groups will feed back to the wider group verbally and via a Padlet, which will be available to participants after the workshop.


Workshop 10: Common Threads in Supervision: Intertwining Senior Expertise with Junior Potential Under Systemic Pressure

Monday 29 June 2026
Location: Bealey B5, Te Pae Christchurch Convention Centre
11:00 - 12:30
Presenter: A/Prof Fiona Moir
University of Auckland and Connect Communications

Introduction/Background

Senior clinicians and academics are under pressure, with fewer resources, staff and time, and increasing demands. Supervision of junior staff and students is frequently squeezed between heavy clinical caseloads, administrative overload and the ongoing demands of teaching and research outputs. Clinical supervision and interactions with struggling students can feel like a drain on time and energy, rather than opportunities for reciprocal learning and fostering growth. Sometimes, this contributes to a sense of disconnection from authenticity and from one’s role as a health educator: a recipe for burnout. Given the impact on students and staff, it is surprising that many supervisors have not had access to practical training in supervision skills to turn this situation around.

Aim

This workshop is designed as a practical "reframe and reset." We aim to highlight the common threads that can make supervision effective and enjoyable in clinical and academic workplaces even when time is at a premium, and to create an opportunity to refresh key supervision skills.

Method / Innovation

The session focuses on the core skills of supervision [1], including strategies that work in a busy schedule [2]. Content will cover models such as Heron’s six styles of supervision [1], including practice applying some relevant communication skills techniques and useful phrases, and experimenting with frameworks for giving feedback and maintaining boundaries. Focussed, skilful engagement can be more valuable than distracted corridor catch-ups or hour-long meetings. The workshop also includes a "train-the-trainer" element, providing participants with a resource to lead similar sessions back in their own workplace.

Results/Outcomes

Participants will leave with a practical toolkit for clinical and academic supervision and a template for professional development for their colleagues.

Conclusion

By advancing how we connect as supervisors with the next generation, we can protect our own wellbeing, improve relationships and productivity, and contribute to the long-term sustainability of the health workforce.

References

1. Hawkins, P., & McMahon, A. (2020). Supervision in the helping professions (5th ed.). Open University Press.

2. Hamersvelt, H., Schoonhoven, L., Hoff, R. G., Ten Cate, O., & Hennus, M. P. (2023). Clinical supervision under pressure: a qualitative study amongst health care professionals working on the ICU during COVID-19. Medical Education Online, 28(1), 2231614.

Workshop Activities

Reflect, Connect, Protect:

Using a real-time anonymous feedback app, we’ll acknowledge and share current supervision challenges in health and academia in Aotearoa and Australia, whilst connecting the group and warming up to the topic. We’ll then identify the non-negotiable boundaries needed to protect the supervisor-supervisee relationship.

View and Do:

Participants will use video prompts for quick, low-stakes practice of specific tactics and phrases in a supportive “communication circuit”. Topics could include efficiently coaching a junior clinician after a challenging encounter and helping a student navigate a "roadblock" in a 10-minute check-in. We will find the "common threads" of providing a psychologically safe and supportive environment in which to provide feedback using a framework.

Boundary Balance:

Choosing to work individually or in small groups, participants will apply a 4-step boundary framework to a written ‘case’. Being tactfully tough, i.e. “kind but firm”, allows you to say yes to what matters, in a sustainable way.

The Ripple Effect:

We’ll end with exploring how participants can take these skills home and disseminate them to other supervisors. We’ll brainstorm how to advocate for these "protected pockets" of supervision time with leadership, ensuring the skills learned here have a lasting impact in their own institutions.


Workshop 11: Unlearning Emotional Detachment in Clinical Practice

Monday 29 June 2026
Location: Dobson 1, Te Pae Christchurch Convention Centre
11:00 - 12:30
Presenter: Dr Bruce Lister
The University of Queensland

Introduction/Background

Within medical culture, emotional composure is valorised as evidence of professionalism. Over time, this expectation may produce “socially acquired alexithymia”: a pattern of habitual emotional suppression and detachment that mirrors trait alexithymia but arises through socialisation and is reinforced through cultures of practice throughout the medical career. When this occurs, clinicians may appear emotionally disengaged despite preserved emotional awareness. Recognising and addressing these belief-based patterns is essential for sustaining empathy, reflection, and well-being. This pre-conference workshop positions socially acquired alexithymia as a modifiable outcome of professional learning rather than a fixed deficit. It bridges emotion science, medical education, and cultural theory to help educators and clinicians identify, understand, and transform belief systems that inadvertently reward emotional restraint.

Methods

The 90-minute interactive session begins with an evidence-informed overview of recent empirical work exploring alexithymia, emotion beliefs, and regulation profiles in physicians. Through analysed vignettes and facilitated dialogue, participants examine how the hidden curriculum shapes emotional display norms and how “composure” becomes conflated with competence. Small-group activities prompt reflection on personal and institutional emotion beliefs, followed by co-design of strategies to normalise authentic emotional expression within professional practice.

Aims

• Increase understanding of socially acquired alexithymia as an educational and organisational phenomenon.

• Explore how professional norms shape emotion beliefs and regulation.

• Develop practical approaches to re-humanising emotional experience in training and practice.

Outline of workshop activities

Welcome, framing, and psychological safety (5 mins)

The workshop opens by establishing psychological safety and shared purpose. Facilitators introduce the premise that emotions in medical practice are constructed rather than automatic, and that patterns of emotional detachment can emerge through professional socialisation. Ground rules are set, and participants briefly reflect on emotionally challenging moments in teaching or clinical work to orient them to the session’s focus.

Conceptual input: constructed emotion and emotional profiles (15 mins)

A concise, interactive overview introduces key ideas from affective science, including emotions as context-dependent constructions and the role of emotion concepts, bodily cues, and social norms. Recent empirical work identifying distinct emotional profiles in physicians is presented, including socially acquired alexithymia as a modifiable educational outcome rather than a fixed trait.

Individual self-reflection: locating emotional profiles (15 mins)

Participants complete a brief reflective self-check based on emotional awareness, emotion beliefs, and regulation tendencies. Using a simple visual map, they locate themselves within simplified emotional profiles and consider how these patterns may shape their teaching, supervision, or clinical interactions.

Small-group analysis: constructing emotion in practice (20 mins)

Working in small interdisciplinary groups, participants analyse short anonymised clinical-education vignettes using a constructed emotion lens. Groups identify how bodily signals, emotion concepts, and professional norms shape emotional experience and expression, and how hidden curricula may narrow emotional repertoires.

Design sprint: developing micro-interventions (25 mins)

Groups identify leverage points where emotional narrowing occurs (e.g. suppression equated with competence) and co-design brief, context-specific micro-interventions such as coaching questions, debrief prompts, or reflective teaching practices. Each group produces a low-fidelity intervention suitable for immediate use in their own educational context.

Synthesis, commitments, and close (10 mins)

Groups briefly share key ideas in a plenary “harvest.” Participants identify one actionable change they will implement in their teaching or supervision. Facilitators summarise key take-home principles and provide resources to support transfer to practice.


Workshop 12: Coalitions for Change: Co-production in HPE to strengthen alignment and connection

Monday 29 June 2026
Location: Dobson 2, Te Pae Christchurch Convention Centre
11:00 - 12:30
Presenter: Dr Julia Paxino
University of Melbourne

Introduction/Background

Co-production in health professions education (HPE) and associated research has emerged as a valuable approach for bringing together diverse professional, lived, and learned knowledge to enrich learning and enhance its relevance, quality, and impact. Despite its promise, practical, cultural, and systemic barriers often limit the implementation of co-production. Coalition building provides a guiding framework to navigate these barriers, emphasizing reflexivity, transversal politics, solidarity, allyship, and transformative learning, all of which support dialogue across roles and disciplines and power sharing toward more equitable and inclusive approaches to knowledge generation and dissemination. This workshop explores practical approaches for applying coalition-building principles to support meaningful co-produced HPE and associated research.

Methods

This workshop is grounded in transformative learning theory, emphasising reflexivity and learning from diverse perspectives. Facilitators will introduce co-production, guide participants in collaboratively exploring its values and barriers, and provide a theoretical foundation for coalition-building and its associated principles. Through interactive discussions and activities, participants will explore communication strategies for coalition-building, including how to establish shared goals and expectations, navigate power dynamics, and maintain momentum across a project’s trajectory.

Results/Evaluation

By the end of the workshop, participants will be able to:

• Explain how coalition-building principles support effective co-production in HPE and associated research.
• Apply communication strategies to establish shared purpose, negotiate priorities, and build alignment across diverse partners.
• Select and justify appropriate communication channels and routines (formal/informal; synchronous/asynchronous) to support trust, momentum, and accountability across a project lifecycle.

Discussion

Participants will gain practical skills and conceptual insights to foster safer, more inclusive, and collaborative HPE environments. Emphasizing coalition-building principles equips HPE teams to navigate barriers, integrate diverse perspectives, and foster transformational change. The workshop supports participants in translating these strategies into real-world contexts, strengthening alignment with co-production values and advancing equity, innovation, and impact.

References

Taylor, L., & Keating, C. (2023). Coalitional Pedagogy: Educating for Intersectional Social Justice.

Ballard, D. I., Mandhana, D. M., & Tesfai, Y. (2025). Groups, teams, and decision-making. In Origins, Traditions, and Trends of Organizational Communication (pp. 293-313). Routledge.

Outline of workshop activities

1. Welcome and Introduction (10 minutes): 

Facilitators will introduce workshop goals and briefly present key concepts in co-production and coalition-building, highlighting their relevance in HPE and associated research. An interactive activity will surface participants’ prior experiences with co-production.

2. Exploring Co-Production (20 minutes): 

Participants will collaboratively explore the values, benefits, and barriers of co-production. Facilitators will provide a theoretical and practical foundation for coalition building, emphasizing reflexivity, transversal politics, solidarity, allyship, and transformative learning principles.

3. Collaborative Problem-Solving and Application (50 minutes):

Participants will engage in small- and large-group activities using real-world cases from HPE and associated research. Through applied reflection and hands-on exercises, they will explore how to apply coalition-building principles to map a communication strategy that establishes shared purpose, negotiates priorities, and builds alignment across diverse partners, including people with lived experience. These activities will support experimentation with strategies to foster innovation, enhance collaboration, and co-create new knowledge across diverse perspectives.

4. Synthesis and Takeaways (10 minutes):

The group will engage in collective sense-making to consolidate key insights and practical communication strategies for applying coalition-building principles in co-produced HPE and associated research, supporting transformational change and inclusive, equitable collaboration.


Workshop 13: Thinking like a clinician: Using tabletop simulation to support practice-ready graduates

Monday 29 June 2026
Location: Bealey B2, Te Pae Christchurch Convention Centre
13:30 - 15:00
Presenter: Dr Dayna Duncan
Flinders University

Introduction/Background

Tabletop role play provides participants with experience solving clinical dilemmas in a team. It has been used in critical care to prepare clinicians for mass casualty scenarios, and in nursing education to foster leadership skills (Cross, 2024; Rashed et al., 2025). At the Flinders Northern Territory Medical Program, this format has been used to simulate the cognitive process of prescribing. The tabletop format uses social learning that mirrors the teamwork of the clinical environment while creating a classroom-based scaffold, free from extraneous stressors. Although the physical setting is not authentic, the materials are designed for high fidelity in the cognitive process of prescribing, enabling learners to gain insight into their current capabilities. This experiential learning activity proved to be low-resource and was associated with improved student self-perceived preparedness for prescribing. This format is adaptable and can be applied across levels of professional training.

Methods

Following a brief introduction of the tabletop simulation format, its pedagogical underpinnings, and its application to prescribing education, participants will work in small groups to design a tabletop simulation relevant to their field. Groups will consider the clinician’s process of information gathering, decision-making, documentation and communication, and develop materials using provided forms and templates. Each group will then pass their task to another table, who will complete the activity and provide feedback in a structured format. Participants will then reconvene to discuss the opportunities and challenges of implementing tabletop simulation in their own educational contexts.

Aims

This workshop aims to improve participants’ awareness of the tabletop simulation as an educational tool, deepen understanding of its pedagogical foundations, and empower attendees to apply this approach in their own settings.

Discussion

Participants will leave with an understanding of tabletop simulation, its theoretical basis, and practical skills to design and implement an activity relevant to their context.

References

Cross, C. R. (2024). Effectiveness of a tabletop simulation activity on senior-level BSN students' clinical reasoning, clinical decision-making and clinical judgement scores [William Carey University]. ProQuest Dissertations & Theses.

Rashed, A. L., Cherukuri, A., Seu, R., Taubman, C., Jamal, J., Guha, D., Ahmed, O., Melgar, J., Kardashian-Sieger, T., Rummaneethorn, N., Restivo, A., Yoon, A., Gartenberg, A., & Singh, M. (2025). Comparing tabletop and high-fidelity simulation for disaster medicine training in emergency medicine residents. Disaster Medicine and Public Health Preparedness, 19, e276. https://doi.org/10.1017/dmp.2025.10206


Workshop 14: Demystifying Assessment Psychometrics: Practical Tools for Interpreting and Using Your Data

Monday 29 June 2026
Location: Bealey B3, Te Pae Christchurch Convention Centre
13:30 - 15:00
Presenter: Dr Vikki O'Neill
Queen's University Belfast

Introduction/Background

Psychometric analysis is often viewed as technical, opaque, or reserved for specialists. However, the principles underpinning reliability, validity, standard setting, and item performance are fundamental to all forms of assessment, from written examinations to clinical and workplace-based formats. A clear understanding of these concepts is essential for defensible decision-making, quality assurance, and meaningful feedback to learners and stakeholders. This workshop aims to make psychometrics accessible, practical, and directly applicable to participants’ own assessment contexts.

Methods

This session will focus on essential post-assessment metrics that support quality assurance and informed decision-making. Participants will examine reliability indices (including internal consistency), item-level statistics such as difficulty and discrimination, and the interpretation of score distributions. The relationship between blueprinting, sampling, and measurement precision will be examined, alongside the implications of standard setting approaches for fairness and defensibility.

Using anonymised datasets and worked examples, participants will interpret psychometric outputs and discuss what constitutes a meaningful concern versus normal statistical variation. Common challenges, including small cohorts, borderline performance, and poorly performing items, will be discussed. The emphasis will be on understanding what the numbers mean, how they relate to assessment design, and how they can inform constructive action rather than on complex calculation.
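The item-level statistics described above can be illustrated with a short, self-contained sketch. This is a minimal example using hypothetical toy data, not the workshop's own materials or any specific psychometrics package: item difficulty is computed as the proportion correct, discrimination as a corrected item-total (point-biserial) correlation, and internal consistency as Cronbach's alpha (equivalent to KR-20 for dichotomously scored items).

```python
import statistics

# Hypothetical toy data: rows are candidates, columns are items
# scored dichotomously (1 = correct, 0 = incorrect).
responses = [
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 0, 0, 1],
    [1, 1, 0, 1, 1],
    [0, 0, 0, 1, 0],
]

n_items = len(responses[0])
totals = [sum(row) for row in responses]  # each candidate's total score

# Item difficulty (facility): proportion of candidates answering correctly.
difficulty = [sum(row[i] for row in responses) / len(responses)
              for i in range(n_items)]

# Item discrimination: correlation between the item score and the
# rest-of-test score (total minus the item, to avoid self-correlation).
def point_biserial(item_index):
    item = [row[item_index] for row in responses]
    rest = [t - s for t, s in zip(totals, item)]
    mean_i, mean_r = statistics.mean(item), statistics.mean(rest)
    cov = sum((a - mean_i) * (b - mean_r) for a, b in zip(item, rest))
    var_i = sum((a - mean_i) ** 2 for a in item)
    var_r = sum((b - mean_r) ** 2 for b in rest)
    return cov / (var_i * var_r) ** 0.5 if var_i and var_r else 0.0

discrimination = [point_biserial(i) for i in range(n_items)]

# Internal consistency: Cronbach's alpha over item and total variances.
item_vars = [statistics.pvariance([row[i] for row in responses])
             for i in range(n_items)]
total_var = statistics.pvariance(totals)
alpha = (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

print(difficulty, discrimination, alpha)
```

On real data the same interpretive questions apply: very low or very high difficulty, near-zero or negative discrimination, and low alpha are prompts for review, in light of cohort size and assessment design, rather than automatic grounds for removing items.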

Results/Evaluation

Through facilitated discussion and shared interpretation, participants will develop a clearer understanding of how psychometric indicators relate to assessment design and cohort characteristics. The workshop prioritises practical interpretation and application. Attendees will leave with increased confidence in reading psychometric reports, recognising key indicators of quality, and responding proportionately to issues such as low reliability and item anomalies, while considering how data can enhance assessment quality and student feedback.

Discussion

The principles discussed are transferable across disciplines and assessment formats, making this workshop relevant to educators involved in designing, delivering, or evaluating high-stakes and programme-level assessment.


Workshop 15: Learning and practising health care in a second language

Monday 29 June 2026
Location: Bealey B4, Te Pae Christchurch Convention Centre
13:30 - 15:00
Presenter: Prof Francesco Bolstad
Nara Medical University

Introduction/Background

English has become firmly established as the lingua franca of science and medicine, and proficiency in English is often taken for granted as a prerequisite for participation in medical education and health care—not only in English-speaking countries, but globally. However, language is far more than vocabulary and grammatical structures. Language shapes how meaning is constructed, how relationships are negotiated, and how sociocultural norms govern professional communications.

This interactive workshop invites participants to develop instructional tools for supporting cross-language communications in patient care and teaching. Drawing on the facilitators’ experience with English communication curricula, OSCEs, global communities of practice events and international debate programmes, this session will examine how verbal and nonverbal language can connect or divide in a professional context.

Methods

Participants will engage in guided discussions designed to strengthen observation skills and to develop educational strategies that enhance learners' cross-lingual competence in educational and clinical contexts. Polling will help identify audience background and expertise. A handout will provide references, worksheets and opportunities to record learning gains.

Results/Evaluation

By the end of the session participants should be able to:

1) Discuss the unifying and dividing potential of language in educational and clinical contexts

2) Describe how linguistic competence and confidence can affect educational and clinical encounters

3) Plan educational interventions to prepare health profession trainees for multi-language encounters

Discussion

The importance of language in educational and clinical encounters cannot be overestimated. It is a critical element for practising in a globalised environment and for advancing any professional field. This workshop aims to enhance awareness of instructional opportunities that embrace the saying: Whakarongo kia mōhio; kōrero kia mārama — listen in order to understand; speak in order to be understood.

References

Hamad AA, Mustaffa DB, Alnajjar AZ, Amro R, Deameh MG, Amin B, Alkhawaldeh IM. Decolonizing medical education: a systematic review of educational language barriers in countries using foreign languages for instruction. BMC Med Educ. 2025 May 13;25(1):701. doi: 10.1186/s12909-025-07251-2. PMID: 40361088; PMCID: PMC12077016.

Zabar S, Hanley K, Kachur E, Stevens D, Schwartz MD, Pearlman E, Adams J, Felix K, Lipkin M Jr, Kalet A. "Oh! She doesn't speak English!" Assessing resident competence in managing linguistic and cultural barriers. J Gen Intern Med. 2006 May;21(5):510-3. doi: 10.1111/j.1525-1497.2006.00439.x. PMID: 16704400; PMCID: PMC1484779.

Outline of workshop activities

10 min - Welcome, introductions, orientation (polling, word clouds)
20 min - Cross-language communication challenges (Think-Pair-Share)
15 min - Sample educational programs to enhance language proficiency and promote communicative flexibility - what works? (brief presentations about English curricula for medical students, OSCE stations for working with interpreters, World Café for health profession educators)
20 min - Preparing learners for cross-language encounters in educational and clinical contexts (small group case discussion to explore intervention options that address knowledge, skills and attitudes/values with the help of worksheets)
15 min - Reporting out of small group discussion results
10 min - Summary and take-home-points


Workshop 16: Weaving actionable feedback into performance assessment: supporting student learning

Monday 29 June 2026
Location: Bealey B5, Te Pae Christchurch Convention Centre
13:30 - 15:00Presenter: Prof Katharine Boursicot
Health Professional Assessment Consultancy

Introduction/Background

Performance assessments provide key information towards determining student competence within health professional education. These assessment episodes can be leveraged to support student learning through a best practice feedback approach.

Weaving actionable feedback activity routinely into performance assessments is important in engaging learners to reflect on their learning trajectory and enhance their skills.

Faculty development for teachers is critical to the success of a good program of feedback; this workshop aims to upskill teachers in best practice feedback based on research evidence.

Methods

This workshop will be structured to support participants with a range of levels of experience by leveraging the expertise in the room. An introductory discussion describing participants’ experiences will be followed by a short presentation on evidence-based feedback, and then individual and group activities to practice the recommended approaches. A final summary will tie together the different perspectives shared and insights gained.

Aim

1. Discuss research evidence for best feedback practice in performance assessment

2. Apply best practice feedback for performance assessment

3. Analyse opportunities and challenges in the implementation of a best practice feedback approach

References

Boursicot K, Kemp S, Wilkinson T, Findyartini A, Canning C, Cilliers F, Fuller R. Performance assessment: Consensus statement and recommendations from the 2020 Ottawa Conference. Med Teach. 2021 Jan;43(1):58-67. doi: 10.1080/0142159X.2020.1830052. Epub 2020 Oct 14. PMID: 33054524.

Hattie, J., & Timperley, H. (2007). The Power of Feedback. Review of Educational Research, 77(1), 81-112. https://doi.org/10.3102/003465430298487 

Outline of workshop activities

1. Commence workshop with introductions/ice breaker and provide safe space for participants to share their experiences of feedback in performance assessment (10 min)
2. Brief presentation describing research evidence for best practice and feedback as dialogue, linking concepts to those raised by participants in (1) (15 min)
3. Facilitated discussion inviting participants to analyse best practice approaches (15 min)
4. Individual and group activities – participants will develop feedback and work in small groups to share, providing metacognitive feedback on feedback, with roaming facilitators contributing (20 min)
5. Facilitators will reconvene the group to hold a discussion about lessons and insights from the activity, and to include the role of technology in operationalisation of a best practice feedback approach (20 min)
6. A final summary will link all elements of the workshop to leverage shared and competing perspectives (10 min)


Workshop 17: Building a culture of restorative practice in medical education: preliminary reflections from a student well-being initiative

Monday 29 June 2026
Location: Dobson 1, Te Pae Christchurch Convention Centre
13:30 - 15:00
Presenter: Miss Kayla Young
University of Otago



Workshop 18: Writing for Publication

Monday 29 June 2026
Location: TBC, Te Pae Christchurch Convention Centre
13:30 - 15:00
Presenters: FoHPE Editor Jodie Copley; and Editorial Board Members Andy Wearn, Tim Wilkinson, Karen Scott, Svetlana King

Introduction/Background

This workshop is part of the regular program at ANZAHPE conferences. The Association is keen to assist its members in developing skills in health professions education research and publication. In particular, it seeks to encourage and upskill early career academics. The workshop is led by Editorial Board members of the Association’s journal, Focus on Health Professional Education (FoHPE).

Aims

• Learn about the reviewing and publishing process, using FoHPE as a case example.

• Develop skills in framing and writing up your research project to optimise its chances of publication.

Activities

A mixture of short presentations and small group work will be used to examine the key features of well-designed research studies and the most effective ways of presenting completed studies in manuscripts submitted for publication. Participants will be asked to bring a draft of a manuscript they are currently writing up or research data they are wanting to write up for publication.

Facilitators: The FoHPE Editors and Associate Editors

Intended participants: This workshop is intended for early/mid-career health professions education researchers who are keen to increase their publication success. This includes those who are currently writing for publication for the first time and those who have had a small number of papers published.


Workshop 19: Seeing the Whole System: Applying Systems Evaluation Theory (SET) in Health Professions Education Research and Evaluation

Monday 29 June 2026
Location: Bealey B2, Te Pae Christchurch Convention Centre
15:30 - 17:00
Presenter: Dr Imogene Rothnie
Royal Australasian College of Physicians

Introduction/Background

Health professions education (HPE) increasingly operates within complex systems shaped by interdependent processes, diverse stakeholders, and overlapping structures. Our experience is that understanding these systems is best served by tools that move beyond linear (e.g., program logic models) or component-level analysis. Systems Evaluation Theory (SET) (Renger, 2015; Renger, 2022) provides a practical, structured framework for analysing system purpose, subsystems, boundaries, interdependence, feedback, adaptation, and emergent behaviours.

This interactive workshop introduces participants to SET and guides them in applying foundational system concepts to their own educational contexts. Short demonstrations and facilitated mapping activities support participants to define a system, identify its subsystems and components, and examine how boundaries and interactions shape performance.

Learning outcomes: By the end of the workshop, participants will be able to: (1) define a system in their own HPE context and identify relevant subsystems, components, and boundaries; (2) explain and apply key system principles—interdependence, boundaries, feedback, adaptation, and emergent properties; (3) distinguish between system efficiency (coherence, minimising duplication) and effectiveness (achievement of purpose, including emergent outcomes); and (4) formulate a SET-informed research, evaluation, or improvement question using a practical mapping template.

By foregrounding how system components interact to produce both intended and unintended outcomes, SET offers HPE scholars a powerful lens for analysing complexity, diagnosing system-level challenges, and identifying leverage points for improvement. Integrating systems thinking strengthens evaluative rigour, supports better decision-making, and enhances responsiveness to dynamic training environments.

We encourage health professions education researchers and academics to attend this workshop to build the systems thinking capabilities urgently needed in today’s complex training environments. As programs grow more interconnected and adaptive, researchers require tools that illuminate interdependence, boundary tensions, and emergent system behaviours. This workshop offers a practical starting point for developing these skills and advancing rigorous, system-aware scholarship in HPE.

References

Renger, R. (2015). System evaluation theory (SET): A practical framework for evaluators to meet the challenges of system evaluation. Evaluation Journal of Australia, 15(4), 16-28.

Renger, R. (2022). System Evaluation Theory: A blueprint for practitioners evaluating complex interventions operating and functioning as systems. Charlotte: Information Age Publishing.

Outline of workshop activities

This workshop provides a concise, practice-oriented introduction to Systems Evaluation Theory (SET), tailored for health professions education researchers, program evaluators, or anyone interested in exploring complex interventions acting as systems. Participants begin by defining a system and its purpose in their own work environment, such as a selection process, learning program or assessment system, and identifying its key subsystems, components, and their boundaries and interactions.

Facilitators then introduce core system principles, including interdependence (how subsystem interactions shape system performance), boundaries (structural, temporal, professional), feedback (information flow and influence), adaptation (responses to internal and external pressures), and emergent properties (outcomes not predictable from individual parts).

Participants work in small groups to map these principles onto their own educational contexts. They will then consider how to frame evaluation or research questions in terms of efficiency (coherence, alignment, reduction of duplication) and effectiveness (achievement of purpose, equity considerations, and emergent system behaviour).

The workshop concludes with each participant drafting a SET-informed research or evaluation question. They also receive a simple, reusable template for applying SET to future HPE projects, ensuring they leave with practical skills for analysing and improving the complex systems in which they work.


Workshop 20: From Search to Synthesis: Mastering Literature Review with AI Tools in Health Professions Education

Monday 29 June 2026
Location: Bealey B3, Te Pae Christchurch Convention Centre
15:30 - 17:00
Presenter: Dr Afreenish Malik
Health Services Academy

Introduction/Background

High-quality research in Health Professions Education (HPE) depends on rigorous literature searching, critical appraisal, and evidence synthesis. However, many educators and researchers struggle with inefficient search strategies, information overload, and difficulty identifying seminal work and research gaps. The rapid emergence of Artificial Intelligence (AI)–powered research tools offers new opportunities to transform how literature reviews are conducted. These tools can support faster discovery of relevant studies, visualization of research networks, identification of influential papers, and rapid evidence synthesis. Despite this potential, most faculty and postgraduate researchers lack practical training in integrating AI tools into scholarly workflows. This workshop addresses this gap by equipping participants with practical skills to combine traditional database searching with AI-enabled tools for efficient, structured, and evidence-based literature reviews in HPE research.

Methods

This interactive workshop uses a blended demonstration and group-based exploration model. The facilitator will first demonstrate principles of effective literature search, including database strategies, keywords, Boolean operators, and screening approaches. Participants will then be introduced to five AI research tools (Elicit, Research Rabbit, Connected Papers, Semantic Scholar, and Consensus). Attendees will be divided into small groups, with each group assigned one AI tool to explore using a common research topic. Guided tasks will enable participants to test the tool’s features for discovering, mapping, summarizing, and synthesizing literature.

Results/Evaluation

Participants will develop hands-on skills in conducting efficient literature searches, identifying seminal papers, mapping research trends, and generating evidence summaries. By the end of the workshop, participants will be able to integrate AI tools into their research workflow and produce a structured approach to literature review.

Discussion

Integrating AI into literature searching enhances research efficiency, reduces cognitive load, and promotes evidence-based scholarship. This workshop prepares educators and researchers to adopt future-ready research practices in HPE.

Outline of workshop activities

The workshop will begin with a facilitator-led session introducing principles of effective literature searching, including database use, keyword strategy, Boolean operators, and screening techniques. A live demonstration will show how traditional searches can be optimized before introducing AI tools.

Participants will then be divided into five groups. Each group will be assigned one AI tool: Elicit, Research Rabbit, Connected Papers, Semantic Scholar, or Consensus. Using a shared research topic, groups will explore how their assigned tool helps in identifying relevant studies, visualizing connections between papers, summarizing findings, and detecting research gaps.

Each group will be provided with a short instructional video for their assigned AI tool. Participants will first watch the video to understand the features and workflow of the tool, then apply this learning hands-on to the shared research topic. Each group will then present their tool to all participants, demonstrating how it can be integrated into literature review workflows.

The session will conclude with a synthesis discussion led by the facilitator, comparing tools and outlining a combined strategy for AI-assisted literature searching in Health Professions Education research.



Workshop 22: A ‘How-to-Guide’ for Interdisciplinary Research Collaborations in post-graduate entry to practice health professional education: Acceptable or Aspirational?

Monday 29 June 2026
Location: Bealey B5, Te Pae Christchurch Convention Centre
15:30 - 17:00
Presenter: Dr Rachel Toovey
The University of Melbourne

Introduction/Background

Interdisciplinary research collaborations in health professional education contexts are known to face disciplinary, practical, structural and institutional barriers. While implementation strategies (e.g., How-to-Guides) can help overcome known challenges, they face barriers of their own, especially user resistance due to low acceptability.

Methods

In 2024, through literature synthesis and co-design with a project advisory group, current and alumni graduate health science students, academic staff and community experts, we developed a Framework for Interdisciplinary Learning in Research and Evaluation. In 2025, we then piloted and evaluated five interdisciplinary research capstone projects involving staff and students across entry-to-practice graduate health sciences courses, including the Audiology, Nursing, Optometry, Physiotherapy, Public Health, Social Work and Speech Pathology disciplines. Data collection and analysis included a mix of surveys, interviews and focus groups with the project advisory group, current and alumni graduate health science students, academic staff and community experts.

Results/Evaluation

The pilot interdisciplinary research projects were valuable and met their key purposes: providing an opportunity to develop interdisciplinary communication and teamwork skills and to apply research theory in practice. A key recommendation for future implementation was to create a 'How-to-Guide'. We have now developed a 'How-to-Guide' to support students and staff involved in interdisciplinary research capstone projects, and to strengthen their capacity for practice-based interdisciplinary collaboration.

Discussion

Future work will need to evaluate the implementability, and especially the acceptability, of interdisciplinary learning initiatives and of specific implementation strategies, including the How-to-Guide. While acceptability (the extent to which users consider interventions to be appropriate) can be assessed before (prospectively), during, or after (retrospectively) intervention delivery, assessing prospective acceptability is increasingly recognised as a critical step in pre-implementation planning of evidence-informed interventions in practice.

References

Visscher-Voerman, I., et al. (2025). Enhancing interdisciplinary education: insights from a comprehensive review. European Journal of Engineering Education, 1-25.

Sekhon, M., et al. (2017). Acceptability of healthcare interventions: an overview of reviews and development of a theoretical framework. BMC Health Services Research, 17(1), 88.

Outline of workshop activities

Workshop Objectives:
  • To create a space to share expertise about interdisciplinary research collaborations in health education contexts
  • To highlight the importance of assessing the prospective acceptability of evidence-informed interventions (e.g., the How-to-Guide)
  • To discuss challenges and opportunities for how to apply prospective acceptability as a research method in implementation planning.

Workshop participants will learn about:

  • Interdisciplinary research collaborations for health science graduate students
  • Co-designing, implementing and evaluating a Framework and Roadmap for practical interdisciplinary learning in research and evaluation
  • A ‘How-to-Guide’ designed to inform, guide and support graduate health science students and staff involved in year-long interdisciplinary research capstones
  • Assessing prospective acceptability using the Theoretical Framework for Acceptability and its seven constructs
  • Assessing the prospective acceptability of evidence-informed interventions in their own work contexts

Workshop Strategies 

A Community Roundtable discussion format will create a fun, interactive session that evolves throughout the 90 minutes, with each discussion round building on the insights of the last. The session will include:

  1. Welcome and introductions (10 mins)
  2. Context setting: interdisciplinary research collaborations, the ‘How-to-Guide’, and the Theoretical Framework for Acceptability (TFA) (15 mins)
  3. Three 15-minute discussion rounds (using adapted TFA constructs) on the acceptability of the ‘How-to-Guide’ (45 mins)
  4. Roundtable report back (10 mins)
  5. Next steps (10 mins)


Workshop 23: Weaving Evidence into Education: Collaborative Program Evaluation in Health Professions Training

Monday 29 June 2026
Location: Dobson 1, Te Pae Christchurch Convention Centre
15:30 - 17:00
Presenter: Dr Kimberly Dahlman
Vanderbilt University Medical Center

Introduction/Background

Program evaluation is the systematic assessment of educational initiatives, programs, and interventions to determine effectiveness, efficiency, relevance, and impact on trainees, educators, and communities. By collecting and analyzing quantitative and qualitative data, evaluators generate evidence-based feedback about a program’s strengths and areas for improvement, supporting decisions about refinement, expansion, or discontinuation. Because educational programs are embedded in complex institutional and community contexts, collaborative approaches to evaluation are essential for sustained success and meaningful use of results.

Methods

In this interactive pre-conference workshop, an experienced program evaluation team will guide participants through foundational elements of collaborative program evaluation and practical tools, such as logic models, which can be applied immediately in their home institutions. Participants will be introduced to core evaluation concepts and will receive adaptable templates and step-by-step strategies for planning and implementing evaluations aligned with intended short- and long-term outcomes. Learning will be reinforced through structured small- and large-group discussions in which participants work through common evaluation obstacles and explore opportunities to strengthen program design and assessment plans. Facilitators will provide feedback and coaching as participants draft evaluation components.

Results/Evaluation

During the session, participants will develop and apply a logic model to their own programs, aligning resources, activities, outputs, outcomes, and intended impacts. Attendees will leave with a draft logic model and supporting templates that can be implemented or adapted for program evaluation and continuous improvement, along with strategies for guiding teams in the development of logic models and program evaluation plans.

Discussion

Collaborative program evaluation supports long-term program success by promoting shared understanding of goals, increasing stakeholder buy-in, and improving the likelihood that findings will be used. Logic models serve as a practical tool that connects program operations to intended outcomes, enabling a holistic evaluation that strengthens both program implementation and demonstrated impact.

References

W.K. Kellogg Foundation. (2004). Logic Model Development Guide (updated ed.).

Van der Vleuten, C., et al. (2018). Programmatic assessment: the process, rationale and evidence for modern evaluation approaches in medical education. Medical Journal of Australia, 209(9).

Outline of workshop activities

  • 15 min: Overview of collaborative program evaluation strategies and an introduction to logic models (with a guided demonstration).
  • 65 min: Participants will develop a logic model for their own programs, supported by coaching from workshop facilitators and by small- and large-group discussions. Strategies for engaging others in this development process at their home institutions will be shared and discussed throughout the session.
  • 10 min: Large group reflective discussion, question and answer, and commitment to change.



Masterclass


Masterclass 1: ANZAHPE-AMEE Essential Skills in Health Professions Education Leadership and Management (ESMELead)

Monday 29 June 2026
Location: Dobson 2, Te Pae Christchurch Convention Centre
13:30 - 17:30

Course Summary

The ESMELead Masterclass introduces key aspects of leadership and management for health professions educators who wish to develop a deeper understanding of leadership and management theory so they can strengthen their leadership skills and approaches and become more effective. The half-day Masterclass will be delivered at the ANZAHPE 2026 Conference for conference delegates attending in person. The course is theory informed, practice driven, context specific, highly interactive, supportive, and fun. The course language is English, but the pace will be suitable for participants whose first language is not English. All participants will receive a Certificate of Attendance.


Who should participate in this course: This course is for anyone (at any level) involved in health professions education who wants to learn more about leadership and management in health professions education (in the academic or clinical setting) and explore the evidence base to help them become more effective leaders, managers, and followers.


Course faculty:

Professor Kirsty Forrest MBChB, BSc (Hons), FRCA, FAcadMEd, MMEd, FANZCA, Professor of Medical Education and Dean of Medicine, Bond University, Gold Coast, Australia. Executive Member and Treasurer of the Medical Deans of Australia and New Zealand (MDANZ) and Chair of the Medical Education Collaborative Committee.

Associate Professor Jo Bishop BSc (Hons), PhD, PGCertEd, FANZAHPE is the current ANZAHPE Global Engagement Chair; Head of Curriculum for the Bond Medical Program within the Faculty of Health Sciences and Medicine, Bond University, Gold Coast, Australia; Co-Chair of the Student Support Network, Medical Deans of Australia and New Zealand (MDANZ); and Deputy Chair of the National Doctors Health & Wellbeing Leadership Alliance.


Learning outcomes

Through participating in the half-day Masterclass, delegates will be able to:

  • Demonstrate understanding of leadership in contemporary health professions education
  • Define key concepts relating to educational leadership, management and followership 
  • Explore strategies for leading and managing change
  • Apply this learning to their own practice and context


Course Format

The curriculum is organised as follows:

Synchronous Masterclass: The course will be delivered as a half-day masterclass for conference delegates attending face to face.


This Masterclass is recognised as part of AMEE's ESME 'Essential Skills' portfolio. Participants wishing to pursue further studies to receive an ESME certificate are eligible for a discount on qualifying courses. Contact the AMEE team on courses@amee.org for more details of current and forthcoming certificate courses and fees.

