Chest X-Ray interpretation: agreement between consultant radiologists and a reporting radiographer in clinical practice in the United Kingdom
Conference poster
Woznitza, N., Piper, K., Burke, S., Patel, K., Amin, S. and Grayson, K. 2013. Chest X-Ray interpretation: agreement between consultant radiologists and a reporting radiographer in clinical practice in the United Kingdom.
Authors | Woznitza, N., Piper, K., Burke, S., Patel, K., Amin, S. and Grayson, K. |
---|---|
Type | Conference poster |
Description | Rationale: Driven by developing technology and an ageing population, radiology has witnessed an unprecedented rise in workload. One response in the United Kingdom has been to train radiographers to undertake clinical reporting. Accurate interpretation of imaging is crucial to allow clinicians to correctly manage and treat patients. Methods: A random sample of cases (n=100) was selected, using a simple computer-generated algorithm, from a consecutive series of 1,000 chest x-ray reports produced by a radiographer in clinical practice. Because of the high level of observer variation apparent in chest x-ray interpretation, three consultant radiologists were included to establish the rate of inter-observer variation between radiologists, which was used as the baseline. Each radiologist interpreted 50 images and examined the radiographer's report for accuracy and agreement, with 50% duplication of cases between radiologists to determine inter-radiologist variation. The radiologists performed their evaluations independently and blinded to the proportion of cases receiving multiple radiologist opinions. Inter-observer agreement was analysed using the kappa statistic to determine consistency among observers. Results: Disagreement between radiologist and radiographer was found in seven cases; in three of these, one radiologist agreed with the radiographer. Inter-observer agreement (kappa statistic) between each of the three radiologists and the reporting radiographer was almost perfect: K=0.91, 95% confidence interval (0.79, 1.0); K=0.91 (0.78, 1.0); and K=0.83 (0.68, 0.99) respectively. Inter-radiologist agreement was also almost perfect: K=0.82 (0.57, 1.0) and K=0.91 (0.75, 1.0). Conclusion: The level of inter-observer agreement between radiologist and reporting radiographer chest x-ray interpretation compares favourably with inter-radiologist variation. (A brief computational sketch of the sampling and kappa analysis is given below the record.) |
Year | 2013 |
Conference | American Thoracic Society Congress |
Related URL | http://www.atsjournals.org/doi/abs/10.1164/ajrccm-conference.2013.187.1_MeetingAbstracts.A2229 |
File | |
Publication process dates | |
Deposited | 16 Sep 2014 |
Output status | Published |
Publication dates | May 2013 |
https://repository.canterbury.ac.uk/item/871q1/chest-x-ray-interpretation-agreement-between-consultant-radiologists-and-a-reporting-radiographer-in-clinical-practice-in-the-united-kingdom
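The Methods section of the abstract refers to a computer-generated random selection of 100 of the 1,000 reports and to a kappa-based agreement analysis. As an illustration only, a minimal Python sketch of those two steps might look like the following; the poster does not specify the software or selection algorithm actually used, and the function names, seed, and example ratings below are hypothetical.

```python
# Minimal illustration (not the authors' code) of the two analysis steps
# described in the abstract: simple random sampling of 100 of 1,000 report
# IDs, and Cohen's kappa for agreement between two observers.
import random
from collections import Counter

def sample_cases(case_ids, n=100, seed=1):
    """Simple random sample of case IDs (assumed sampling approach)."""
    rng = random.Random(seed)          # fixed seed so the draw is reproducible
    return rng.sample(case_ids, n)

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters scoring the same cases."""
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: proportion of cases on which the raters concur.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each rater's marginal category frequencies.
    freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n)
              for c in set(freq_a) | set(freq_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: binary normal/abnormal calls on five sampled films.
sample = sample_cases(list(range(1, 1001)))
radiologist  = ["normal", "abnormal", "normal", "normal", "abnormal"]
radiographer = ["normal", "abnormal", "normal", "abnormal", "abnormal"]
print(round(cohens_kappa(radiologist, radiographer), 2))
```

The 95% confidence intervals quoted in the results would in practice come from a standard-error formula or a bootstrap for kappa; that step is omitted from this sketch.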