dc.contributor.author	Brown, Stephen D.
dc.contributor.author	Rider, Elizabeth A.
dc.contributor.author	Jamieson, Katherine
dc.contributor.author	Meyer, Elaine C.
dc.contributor.author	Callahan, Michael J.
dc.contributor.author	DeBenedectis, Carolynn M.
dc.contributor.author	Bixby, Sarah D.
dc.contributor.author	Walters, Michele
dc.contributor.author	Forman, Sara F.
dc.contributor.author	Varrin, Pamela H.
dc.contributor.author	Forbes, Peter
dc.contributor.author	Roussin, Christopher J.
dc.date	2022-08-11T08:10:47.000
dc.date.accessioned	2022-08-23T17:19:58Z
dc.date.available	2022-08-23T17:19:58Z
dc.date.issued	2017-08-01
dc.date.submitted	2017-06-19
dc.identifier.citation	AJR Am J Roentgenol. 2017 Aug;209(2):351-357. doi: 10.2214/AJR.16.17439. Epub 2017 May 24. Link to article on publisher's site: https://doi.org/10.2214/AJR.16.17439
dc.identifier.issn	0361-803X (Linking)
dc.identifier.doi	10.2214/AJR.16.17439
dc.identifier.pmid	28537754
dc.identifier.uri	http://hdl.handle.net/20.500.14038/48201
dc.description.abstract	OBJECTIVE: The purpose of this study was to develop and test a standardized communication skills assessment instrument for radiology.
MATERIALS AND METHODS: The Delphi method was used to validate the Kalamazoo Communication Skills Assessment instrument for radiology by revising and achieving consensus on the 43 items of the preexisting instrument among an interdisciplinary team of experts consisting of five radiologists and four nonradiologists (two men, seven women). Reviewers assessed the applicability of the instrument to the evaluation of conversations between radiology trainees and trained actors portraying concerned parents in enactments about bad news, radiation risks, and diagnostic errors that were video recorded during a communication workshop. Interrater reliability was assessed by using the revised instrument to rate a series of enactments between trainees and actors video recorded in a hospital-based simulator center. Eight raters evaluated each of seven different video-recorded interactions between physicians and parent-actors.
RESULTS: The final instrument contained 43 items. After three review rounds, 42 of 43 (98%) items had an average rating of relevant or very relevant for bad news conversations. All items were rated as relevant or very relevant for conversations about error disclosure and radiation risk. Reliability and rater agreement measures were moderate. Intraclass correlation coefficients ranged from 0.07 to 0.58 (mean, 0.30; SD, 0.13; median, 0.30). Weighted kappa values ranged from 0.03 to 0.47 (mean, 0.23; SD, 0.12; median, 0.22). Ratings varied significantly among conversations (χ²₆ = 1186; p < 0.0001) and also varied significantly by viewing order, rater type, and rater sex.
CONCLUSION: The adapted communication skills assessment instrument is highly relevant for radiology but has moderate interrater reliability. These findings have important implications for assessing the relational competencies of radiology trainees.
dc.language.iso	en_US
dc.relation	Link to Article in PubMed: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?db=pubmed&cmd=Retrieve&list_uids=28537754&dopt=Abstract
dc.relation.url	https://doi.org/10.2214/AJR.16.17439
dc.subject	communication assessment
dc.subject	communication competency
dc.subject	communication education
dc.subject	radiology
dc.subject	simulation
dc.subject	Medical Education
dc.subject	Radiology
dc.title	Development of a Standardized Kalamazoo Communication Skills Assessment Tool for Radiologists: Validation, Multisource Reliability, and Lessons Learned
dc.type	Journal Article
dc.source.journaltitle	AJR. American journal of roentgenology
dc.identifier.legacycoverpage	https://escholarship.umassmed.edu/radiology_pubs/316
dc.identifier.contextkey	10320223
dc.identifier.submissionpath	radiology_pubs/316
dc.contributor.department	Department of Radiology
dc.source.pages	1-7

