Agreement between expert thoracic radiologists and the chest radiograph reports provided by consultant radiologists and reporting radiographers in clinical practice: review of a single clinical site
Woznitza, N., Piper, K., Burke, S., Ellis, S. and Bothamley, G. 2018. Agreement between expert thoracic radiologists and the chest radiograph reports provided by consultant radiologists and reporting radiographers in clinical practice: review of a single clinical site. Radiography. https://doi.org/10.1016/j.radi.2018.01.009
|Authors||Woznitza, N., Piper, K., Burke, S., Ellis, S. and Bothamley, G.|
Introduction: To compare the clinical chest radiograph (CXR) reports provided by consultant radiologists and reporting radiographers with expert thoracic radiologists.
Methods: Adult CXRs (n=193) from a single site were included; 83% were randomly selected from CXRs performed over one year, and 17% were selected from the discrepancy meeting. Chest radiographs were independently interpreted by two expert thoracic radiologists (CTR1/2). Clinical history and previous and follow-up imaging were available, but not the original clinical report. Two arbiters independently compared the expert and clinical reports. Kappa (κ), Chi-square (χ²) and McNemar tests were performed to determine inter-observer agreement.
Results: CTR1 interpreted 187 (97%) and CTR2 186 (96%) CXRs, with 180 CXRs interpreted by both experts. Radiologists and radiographers provided 93 and 87 of the original clinical reports, respectively. Agreement between each expert thoracic radiologist and the radiographer clinical report was 70 cases (CTR1; κ=0.59) and 70 cases (CTR2; κ=0.62), comparable to agreement between the expert thoracic radiologists and the radiologist clinical report (CTR1=76, κ=0.60; CTR2=75, κ=0.62). The expert thoracic radiologists agreed with each other in 131 cases (κ=0.48). There was no difference in agreement with either expert thoracic radiologist whether the clinical report was provided by radiographers or radiologists (CTR1 χ²=0.056, p=0.813; CTR2 χ²=0.014, p=0.906), or when stratified by inter-expert agreement (radiographer McNemar p=0.629; radiologist p=0.701).
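The agreement statistics reported above can be illustrated with a short sketch. This is a minimal, self-contained illustration of Cohen's kappa and an exact (binomial) McNemar test on a hypothetical 2×2 agreement table; the counts are invented for demonstration and are not the study data.

```python
from math import comb

def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table [[a, b], [c, d]],
    where a and d are the concordant cells."""
    (a, b), (c, d) = table
    n = a + b + c + d
    po = (a + d) / n                                        # observed agreement
    pe = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2   # chance agreement
    return (po - pe) / (1 - pe)

def mcnemar_exact_p(b, c):
    """Two-sided exact McNemar p-value from the discordant counts b and c:
    double the smaller binomial tail under p=0.5."""
    n = b + c
    k = min(b, c)
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

# Hypothetical table: rows = reader A (abnormal/normal), cols = reader B
table = [[60, 10], [8, 22]]
print(round(cohens_kappa(table), 3))      # moderate agreement on these counts
print(round(mcnemar_exact_p(10, 8), 3))   # no evidence of systematic bias
```

In practice a library implementation (e.g. `sklearn.metrics.cohen_kappa_score` on the paired ratings, or `statsmodels`' McNemar test) would be used; the formulas here are spelled out only to show what the reported κ and p values measure.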
Conclusion: Even when weighted with chest radiographs reviewed at discrepancy meetings, the content of CXR reports from trained radiographers is comparable to that of reports issued by radiologists and expert thoracic radiologists.
|Keywords||Clinical Competence; radiography; thoracic; radiographer reporting; observer performance|
|Digital Object Identifier (DOI)||https://doi.org/10.1016/j.radi.2018.01.009|
|Funder||College of Radiographers|
|Online||18 Feb 2018|
|Publication process dates|
|Deposited||31 Jan 2018|
|Accepted||30 Jan 2018|
|Accepted author manuscript|