Journal of Surgical Education, 2020
Abstract
OBJECTIVE:
We examined the impact of video editing and rater expertise in surgical resident evaluation on operative performance ratings of surgical trainees.
DESIGN:
Randomized independent review of intraoperative video.
SETTING:
Operative video was captured at a single tertiary hospital in Boston, MA.
PARTICIPANTS:
Six common general surgery procedures performed by 6 attending-trainee dyads were video recorded. Full-length and condensed versions (n = 12 videos) were then reviewed by 13 independent surgeon raters (5 evaluation experts, 8 nonexperts) using a crossed design. Trainee performance was rated using the Operative Performance Rating Scale, the System for Improving and Measuring Procedural Learning (SIMPL) Performance scale, the Zwisch scale, and the ten Cate scale. These ratings were then standardized and compared using Bayesian mixed models, with raters and videos treated as random effects.
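The abstract says the ratings were "standardized" before modeling but does not specify how; a per-scale z-score (mean 0, SD 1) is the usual approach and puts scores from the four instruments on a common scale before they enter the mixed models. A minimal sketch, assuming z-scoring and using hypothetical rating values:

```python
# Hedged sketch: z-score standardization of ratings from one scale.
# The standardization method and the rating values below are
# illustrative assumptions, not taken from the study.

def standardize(scores):
    """Z-score a list of ratings: subtract the mean and divide by the
    (population) standard deviation, yielding mean 0 and SD 1."""
    n = len(scores)
    mean = sum(scores) / n
    sd = (sum((s - mean) ** 2 for s in scores) / n) ** 0.5
    return [(s - mean) / sd for s in scores]

# Hypothetical SIMPL Performance ratings from several raters.
ratings = [3, 4, 2, 5, 3, 4]
z = standardize(ratings)
```

After this transformation, effect estimates from the different scales (as reported in the Results) are expressed in standard-deviation units and can be compared directly.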
RESULTS:
Editing had no significant effect on the Operative Performance Rating Scale Overall Performance (-0.10, p = 0.30), SIMPL Performance (0.13, p = 0.71), Zwisch (-0.12, p = 0.27), or ten Cate (-0.13, p = 0.29) ratings. Additionally, rater expertise (evaluation expert vs. nonexpert) had no significant effect on the same scales: -0.16 (p = 0.32), 0.18 (p = 0.74), 0.25 (p = 0.81), and 0.25 (p = 0.17), respectively.
CONCLUSIONS:
Operative performance assessment scores differ little whether raters view condensed or full-length videos, and whether or not raters are experts in surgical resident evaluation. Future validation studies of operative performance assessment scales may be facilitated by using nonexpert surgeon raters who view videos condensed under a standardized protocol.
Link: https://doi.org/10.1016/j.jsurg.2019.12.016
The Society for Improving Medical Professional Learning (SIMPL) is a 501(c)(3) nonprofit research collaborative led by volunteers who are dedicated to working towards a continuously improving medical education system.