Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates


Bibliographic Details
Main Authors: Debra Nestel, Melanie Regan, Priyanga Vijayakumar, Irum Sunderji, Cathy Haigh, Cathy Smith, Alistair Wright
Format: Article
Language: English
Published: Korea Health Insurance Licensing Examination Institute, 2011-12-01
Series: Journal of Educational Evaluation for Health Professions
Subjects:
Online Access:http://www.jeehp.org/upload/jeehp-8-13.pdf
id doaj-8f6b1cc976794ab8b0ed12b12aa8934a
record_format Article
collection DOAJ
language English
format Article
sources DOAJ
author Debra Nestel
Melanie Regan
Priyanga Vijayakumar
Irum Sunderji
Cathy Haigh
Cathy Smith
Alistair Wright
author_sort Debra Nestel
title Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates
title_short Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates
title_full Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates
title_fullStr Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates
title_full_unstemmed Implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates
title_sort implementation of a multi-level evaluation strategy: a case study on a program for international medical graduates
publisher Korea Health Insurance Licensing Examination Institute
series Journal of Educational Evaluation for Health Professions
issn 1975-5937
publishDate 2011-12-01
description Evaluation of educational interventions is often focused on immediate and/or short-term metrics associated with knowledge and/or skills acquisition. We developed an educational intervention to support international medical graduates working in rural Victoria. We wanted an evaluation strategy that included participants’ reactions and considered transfer of learning to the workplace and retention of learning. However, with participants in distributed locations and limited program resources, this was likely to prove challenging. Elsewhere, we have reported the outcomes of this evaluation. In this educational development report, we describe our evaluation strategy as a case study: its underpinning theoretical framework, the strategy itself, and its benefits and challenges. The strategy sought to address issues of program structure, process, and outcomes. We used a modified version of Kirkpatrick’s model as a framework to map our evaluation of participants’ experiences, their acquisition of knowledge and skills, and their application in the workplace. The predominant benefit was that most of the evaluation instruments allowed for personalization of the program. The baseline instruments provided a broad view of participants’ expectations, needs, and current perspectives on their role. Immediate evaluation instruments allowed ongoing tailoring of the program to meet learning needs. Intermediate evaluations facilitated insight into the transfer of learning. The principal challenge related to the resource-intensive nature of the evaluation strategy. A dedicated program administrator was required to manage data collection. Although resource-intensive, we recommend baseline, immediate, and intermediate data collection points, with multi-source feedback being especially illuminating. We believe our experiences may be valuable to faculty involved in program evaluations.
topic Medical education
Educational measurement
Medical students
url http://www.jeehp.org/upload/jeehp-8-13.pdf