Development of a Standardized Kalamazoo Communication Skills Assessment Tool for Radiologists: Validation, Multisource Reliability, and Lessons Learned
Authors
Brown, Stephen D.
Rider, Elizabeth A.
Jamieson, Katherine
Meyer, Elaine C.
Callahan, Michael J.
Debenedectis, Carolynn M.
Bixby, Sarah D.
Walters, Michele
Forman, Sara F.
Varrin, Pamela H.
Forbes, Peter
Roussin, Christopher J.
UMass Chan Affiliations
Department of Radiology
Document Type
Journal Article
Publication Date
2017-08-01
Keywords
communication assessment
communication competency
communication education
radiology
simulation
Medical Education
Radiology
Abstract
OBJECTIVE: The purpose of this study was to develop and test a standardized communication skills assessment instrument for radiology. MATERIALS AND METHODS: The Delphi method was used to validate the Kalamazoo Communication Skills Assessment instrument for radiology by revising and achieving consensus on the 43 items of the preexisting instrument among an interdisciplinary team of experts consisting of five radiologists and four nonradiologists (two men, seven women). Reviewers assessed the applicability of the instrument to the evaluation of conversations between radiology trainees and trained actors portraying concerned parents in video-recorded enactments about bad news, radiation risks, and diagnostic errors during a communication workshop. Interrater reliability was assessed by using the revised instrument to rate a series of enactments between trainees and actors video recorded in a hospital-based simulator center. Eight raters evaluated each of seven different video-recorded interactions between physicians and parent-actors. RESULTS: The final instrument contained 43 items. After three review rounds, 42 of 43 (98%) items had an average rating of relevant or very relevant for bad news conversations. All items were rated as relevant or very relevant for conversations about error disclosure and radiation risk. Reliability and rater agreement measures were moderate. The intraclass correlation coefficient range was 0.07-0.58; mean, 0.30; SD, 0.13; and median, 0.30. The range of weighted kappa values was 0.03-0.47; mean, 0.23; SD, 0.12; and median, 0.22. Ratings varied significantly among conversations (χ²₆ = 1186; p < 0.0001) and also varied significantly by viewing order, rater type, and rater sex. CONCLUSION: The adapted communication skills assessment instrument is highly relevant to radiology and has moderate interrater reliability. These findings have important implications for assessing the relational competencies of radiology trainees.
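For readers wanting to reproduce the kind of reliability statistics reported above, the following is a minimal illustrative sketch, not the authors' analysis code: it generates hypothetical rating data (8 raters scoring 7 conversations on an assumed 1-5 scale) and computes a quadratically weighted Cohen's kappa for one rater pair and two-way intraclass correlation coefficients. The column names, rating scale, and use of scikit-learn and pingouin are assumptions for illustration only.

# Illustrative sketch (Python): weighted kappa and ICC on hypothetical rating data.
import numpy as np
import pandas as pd
import pingouin as pg
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Hypothetical data: 8 raters each score 7 video-recorded conversations on a 1-5 scale
# (a single overall score per conversation is used here for brevity).
raters = [f"rater_{i}" for i in range(1, 9)]
conversations = [f"conv_{j}" for j in range(1, 8)]
records = [
    {"rater": r, "conversation": c, "score": int(rng.integers(1, 6))}
    for r in raters
    for c in conversations
]
df = pd.DataFrame(records)

# Quadratically weighted Cohen's kappa for one pair of raters.
wide = df.pivot(index="conversation", columns="rater", values="score")
kappa = cohen_kappa_score(wide["rater_1"], wide["rater_2"], weights="quadratic")
print(f"Weighted kappa (rater_1 vs rater_2): {kappa:.2f}")

# Two-way intraclass correlation coefficients across all raters.
icc = pg.intraclass_corr(data=df, targets="conversation", raters="rater", ratings="score")
print(icc[["Type", "ICC"]])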
Source
AJR Am J Roentgenol. 2017 Aug;209(2):351-357. doi: 10.2214/AJR.16.17439. Epub 2017 May 24.
DOI
10.2214/AJR.16.17439
Permanent Link to this Item
http://hdl.handle.net/20.500.14038/48201
PubMed ID
28537754