Published on July 7th, 2016 | from CAMH Education
CAMH Education Evaluation Team Wins at #CES2016
It was a win for CAMH Education when Erica Downes and Megha Bhavsar attended the Canadian Evaluation Society’s conference in St. John’s, Newfoundland this June. Their poster presentation, “Innovative evaluation methodology: using simulation for curriculum improvement,” won the Social Sciences and Humanities Research Council (SSHRC) Best Poster Session Prize.
We asked Erica, Manager of Evaluation in CAMH Education, and Megha, Evaluation Co-ordinator in CAMH Education—who work with support from Latika Nirula, Director of Simulation and Teaching Excellence—to tell us a little more about the evaluation methodology they proposed and featured on the poster.
We have been working with our colleagues in the Simulation Centre and the Temerty Centre for Therapeutic Brain Intervention to evaluate their annual electroconvulsive therapy (ECT) training workshop for PGY1 residents.
Using didactic lectures, observation, and in-situ simulation with debriefing, this program teaches residents how the procedure is performed, how to read and interpret EEG recordings, and how to obtain informed consent.
Simulation offers an opportunity to evaluate the effectiveness of the program through performance and self-reflection data, so this year the evaluation team proposed a new methodology that incorporates simulation into the ECT program.
In addition to residents completing pre- and post-test surveys to assess their knowledge uptake, the evaluation team proposed that residents complete an immediate self-reflection assessment of their performance and experience with the simulation at each of the three training stations: obtaining informed consent, procedural ECT skills, and reading EEGs.
At the end of the training session, residents will watch a video of an ECT procedure containing multiple errors and will need to identify at least five of them. By triangulating the residents’ level of knowledge reported in the pre- and post-tests, their self-reflections, and the types of errors recognized and missed in the video, we can identify areas of the program that show strong outcomes and areas that require improvement.
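As an aside for readers curious about the analysis, the triangulation step described above could be sketched in code. The sketch below is purely illustrative: the station names, thresholds, and numbers are hypothetical placeholders, not the team’s actual data or criteria. It shows the general idea of flagging a training station for review when multiple data sources (knowledge gain, self-reflection, and video error recognition) point the same way.

```python
# Illustrative sketch of triangulating three evaluation data sources per
# training station. All names, thresholds, and figures are hypothetical.

STATIONS = ["informed_consent", "procedural_skills", "eeg_reading"]

# Hypothetical per-station averages across residents:
#   knowledge_gain - mean post-test minus pre-test score (percentage points)
#   reflection     - mean self-rated confidence (1-5 scale)
#   errors_found   - mean share of planted video errors recognized
results = {
    "informed_consent":  {"knowledge_gain": 18.0, "reflection": 4.2, "errors_found": 0.80},
    "procedural_skills": {"knowledge_gain": 5.0,  "reflection": 2.9, "errors_found": 0.40},
    "eeg_reading":       {"knowledge_gain": 12.0, "reflection": 3.8, "errors_found": 0.70},
}

def flag_for_review(station_data, gain_min=10.0, reflection_min=3.5, errors_min=0.6):
    """Flag a station when at least two of the three data sources fall
    below their (arbitrary, illustrative) thresholds."""
    low_signals = sum([
        station_data["knowledge_gain"] < gain_min,
        station_data["reflection"] < reflection_min,
        station_data["errors_found"] < errors_min,
    ])
    return low_signals >= 2

needs_review = [s for s in STATIONS if flag_for_review(results[s])]
print(needs_review)  # with these sample numbers, only procedural_skills is flagged
```

Requiring agreement across at least two data sources before flagging a station is one simple way to express triangulation: no single measure, on its own, drives the conclusion.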
On May 31, we applied the new methodology to the 2016 ECT training program, and we’re currently analyzing the data to evaluate the relationship between the simulation outcomes and experiences and the educational content of the training program.
With ongoing evaluation of our education programs like the ECT workshop, we can better understand participants’ learning outcomes and identify areas for improvement, ensuring our programs address our learners’ needs and that we deliver high-quality education.