Chest X-Ray interpretation: agreement between consultant radiologists and a reporting radiographer in clinical practice in the United Kingdom
Woznitza, N., Piper, K., Burke, S., Patel, K., Amin, S. and Grayson, K. 2013. Chest X-Ray interpretation: agreement between consultant radiologists and a reporting radiographer in clinical practice in the United Kingdom.
|Authors||Woznitza, N., Piper, K., Burke, S., Patel, K., Amin, S. and Grayson, K.|
Rationale: Driven by developing technology and an ageing population, radiology has witnessed an unprecedented rise in workload. One response to this in the United Kingdom has been to train radiographers to undertake clinical reporting. Accurate interpretation of imaging is crucial to allow clinicians to correctly manage and treat patients.
Methods: A random sample of cases (n=100) was selected, using a simple computer-generated algorithm, from a consecutive series of 1,000 chest x-ray reports produced by a radiographer in clinical practice. Because of the high level of observer variation apparent when interpreting chest x-rays, three consultant radiologists were included to establish the rate of inter-observer variation between radiologists, which was then used as the baseline. Each radiologist interpreted 50 images and examined the radiographer's report for accuracy and agreement, with 50% of cases duplicated between radiologists to determine inter-radiologist variation. The radiologists performed their evaluations independently and were blinded to which cases received multiple radiologist opinions. Inter-observer agreement was analysed using the kappa statistic to determine consistency among observers.
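The kappa statistic reported here measures agreement corrected for chance. As a minimal sketch (not the study's actual analysis), Cohen's kappa for two raters can be computed from the observed agreement and the agreement expected from each rater's marginal label frequencies; the labels below are hypothetical, not the study data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same cases."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: proportion of cases where both raters give the same label.
    po = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of the raters' marginal frequencies, summed over labels.
    ca, cb = Counter(rater_a), Counter(rater_b)
    pe = sum((ca[lab] / n) * (cb[lab] / n) for lab in set(rater_a) | set(rater_b))
    return (po - pe) / (1 - pe)

# Hypothetical example: ten reports classified normal/abnormal by two observers.
a = ["abn", "abn", "norm", "norm", "abn", "norm", "abn", "abn", "norm", "norm"]
b = ["abn", "abn", "norm", "norm", "abn", "norm", "abn", "norm", "norm", "norm"]
print(round(cohens_kappa(a, b), 3))  # → 0.8
```

On the conventional Landis and Koch scale, values above 0.8 (as in the Results below) are described as almost perfect agreement.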
Results: Disagreement between radiologist and radiographer was found in 7 cases; in three of these, one radiologist agreed with the radiographer. Inter-observer agreement (kappa statistic) between each of the three radiologists and the reporting radiographer was almost perfect: K=0.91, 95% confidence interval (0.79, 1.0); K=0.91 (0.78, 1.0); and K=0.83 (0.68, 0.99), respectively. Inter-radiologist agreement was also almost perfect: K=0.82 (0.57, 1.0) and K=0.91 (0.75, 1.0).
Conclusion: The level of inter-observer agreement between radiologist and reporting radiographer chest x-ray interpretations compares favourably with inter-radiologist variation.
|Conference||American Thoracic Society Congress|
|Publication process dates|
|Deposited||16 Sep 2014|