A randomized clinical trial found that reviewing crowdsourced feedback may have improved internal medicine and family medicine residents' communication skills when disclosing medical errors, especially among those without prior experience.
In the study, published in JAMA Network Open, researchers enrolled 146 second-year residents (mean age = 29 years, 41.0% female) from seven U.S. academic medical centers. The participants completed cases in a video-based communication assessment (VCA) tool at baseline and again 4 weeks later, with the intervention group receiving crowdsourced feedback before the second assessment.
At follow-up, the intervention group outperformed controls on a 5-point rating scale (mean score = 3.26 vs 3.14, difference = 0.12 points, 95% confidence interval [CI] = 0.08–0.48, P = .01). The effect was most pronounced among the 47.0% of residents with no prior error disclosure experience (mean score = 3.33 vs 3.09, difference = 0.24 points, 95% CI = 0.01–0.48, P = .007).
Feedback was provided by 405 crowdsourced lay raters (32.0% were excluded for inattentiveness), with each resident response rated by a mean of 9.50 laypeople on measures of accountability, honesty, apology, empathy, and caring. The feedback reports included overall performance ratings, average peer scores, and learning points drawn from the laypeople's comments.
Survey data revealed that just 54.9% of residents in the intervention group reviewed their feedback, with 13.7% spending less than 5 minutes and 2.0% spending 21 to 25 minutes doing so. Residents listened to exemplar peer responses more often than they replayed their own.
Logistic regression showed that each 1-unit increase in baseline VCA score was associated with 2.89-fold higher odds of study completion (95% CI = 1.06–7.84, P = .04).
The researchers noted that the VCA's effectiveness may stem from its incorporation of deliberate practice elements such as learning points that reinforce desired behaviors, exemplars that model ideal performance, and comparative ratings for self-assessment.
Limitations of the study included potential recall and social desirability bias as well as limited diversity among raters. However, the study's size, geographic scope, and rigorous design supported the generalizability of its findings.
The researchers called for future research to identify barriers to feedback use, assess the VCA's potential for evaluating milestone attainment, and examine its impact on patient-reported outcomes.
"The VCA has the potential to solve a widely unmet need for graduate medical education patient safety educators," concluded the study authors.
Conflict of interest disclosures can be found in the study.