Chest X-Ray interpretation: agreement between consultant radiologists and a reporting radiographer in clinical practice in the United Kingdom
Conference poster
Woznitza, N., Piper, K., Burke, S., Patel, K., Amin, S. and Grayson, K. 2013. Chest X-Ray interpretation: agreement between consultant radiologists and a reporting radiographer in clinical practice in the United Kingdom.
Authors | Woznitza, N., Piper, K., Burke, S., Patel, K., Amin, S. and Grayson, K. |
---|---|
Type | Conference poster |
Description | Rationale: Driven by developing technology and an ageing population, radiology has witnessed an unprecedented rise in workload. One response to this in the United Kingdom has been to train radiographers to undertake clinical reporting. Accurate interpretation of imaging is crucial to allow clinicians to correctly manage and treat patients. Methods: A random sample of cases (n=100) was selected, using a simple computer-generated algorithm, from a consecutive series of 1,000 chest x-ray reports produced by a radiographer in clinical practice. Because of the high level of observer variation apparent when interpreting chest x-rays, three consultant radiologists were also included to establish the rate of inter-observer variation between radiologists, which was then used as the baseline. Each radiologist interpreted fifty images and examined the radiographer's report for accuracy and agreement, with 50% duplication of cases between radiologists to determine inter-radiologist variation. The radiologists performed their evaluations independently and were blinded to the proportion of cases receiving multiple radiologist opinions. Inter-observer agreement analysis using Kappa was performed to determine consistency among observers. Results: Disagreement between a radiologist and the radiographer was found in 7 cases; in three of these, one radiologist agreed with the radiographer. Inter-observer agreement (Kappa statistic) between the three radiologists and the reporting radiographer was almost perfect: K=0.91, 95% confidence interval (0.79,1.0), K=0.91 (0.78,1.0) and K=0.83 (0.68,0.99) respectively. Inter-radiologist agreement was also almost perfect: K=0.82 (0.57,1.0) and K=0.91 (0.75,1.0). Conclusion: The level of inter-observer agreement between radiologist and reporting radiographer chest x-ray interpretation compares favourably with inter-radiologist variation. |
Year | 2013 |
Conference | American Thoracic Society Congress |
Related URL | http://www.atsjournals.org/doi/abs/10.1164/ajrccm-conference.2013.187.1_MeetingAbstracts.A2229 |
File | |
Publication process dates | |
Deposited | 16 Sep 2014 |
Output status | Published |
Publication dates | May 2013 |
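The agreement analysis described in the abstract is a standard Cohen's kappa calculation between pairs of readers. The sketch below is illustrative only and is not the authors' analysis code: the binary normal/abnormal labels, the example data, and the percentile-bootstrap method for the confidence interval are assumptions made here for illustration; the poster reports only the resulting kappa values and 95% confidence intervals.

```python
import random

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two readers labelling the same cases."""
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n      # observed agreement
    p_e = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)    # chance agreement
              for c in categories)
    return 1.0 if p_e == 1 else (p_o - p_e) / (1 - p_e)

def bootstrap_ci(rater_a, rater_b, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for kappa (an assumed method;
    the poster does not state how its intervals were derived)."""
    rng = random.Random(seed)
    n = len(rater_a)
    stats = []
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]
        stats.append(cohen_kappa([rater_a[i] for i in idx],
                                 [rater_b[i] for i in idx]))
    stats.sort()
    return stats[int(alpha / 2 * n_boot)], stats[int((1 - alpha / 2) * n_boot) - 1]

# Step 1: random selection of 100 cases from the consecutive series of 1,000 reports.
sampled_case_ids = random.Random(0).sample(range(1000), 100)

# Step 2: pairwise agreement for one radiologist/radiographer pair over 50 cases,
# using hypothetical binary normal/abnormal labels (illustrative data only).
radiologist  = ["normal"] * 30 + ["abnormal"] * 20
radiographer = ["normal"] * 28 + ["abnormal"] * 22
kappa = cohen_kappa(radiologist, radiographer)
low, high = bootstrap_ci(radiologist, radiographer)
print(f"kappa = {kappa:.2f}, 95% CI ({low:.2f}, {high:.2f})")
```

With the hypothetical data above (48 of 50 cases in agreement), the sketch yields approximately K=0.92, which falls in the "almost perfect" range reported in the poster.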
https://repository.canterbury.ac.uk/item/871q1/chest-x-ray-interpretation-agreement-between-consultant-radiologists-and-a-reporting-radiographer-in-clinical-practice-in-the-united-kingdom
Related outputs
An implementation facilitation intervention to improve the musculoskeletal X‑ray reporting by radiographers across London
Lockwood, P., Burton, C., Shaw, T., Woznitza, N., Compton, E., Groombridge, H., Hayes, N., Mane, U., O'Brien, A. and Patterson, S. 2025. An implementation facilitation intervention to improve the musculoskeletal X‑ray reporting by radiographers across London. BMC Health Services Research. 25 (248), p. 1. https://doi.org/10.1186/s12913-025-12356-x
Accuracy of interpretation of nasogastric tube position on chest radiographs by diagnostic radiographers: A multi-case, multi-reader study
Creeden, A., McFadden, S., Ather, S. and Woznitza, N. 2025. Accuracy of interpretation of nasogastric tube position on chest radiographs by diagnostic radiographers: A multi-case, multi-reader study. Radiography. 31 (1), pp. 83-88. https://doi.org/10.1016/j.radi.2024.10.022
Achieving earlier diagnosis of symptomatic lung cancer
Bradley, S., Baldwin, D., Bhartia, B., Black, G., Callister, Matthew Ej, Clayton, Karen, Eccles, Sinan R, Evison, Matthew, Fox, Jesme, Hamilton, W., Konya, J., Lee, Richard W, Bradley, S., Navani, Neal, Noble, Ben, Quaife, Samantha L, Randle, Amelia, Rawlinson, Janette, Richards, Michael, Woznitza, Nick and O'Dowd, Emma 2024. Achieving earlier diagnosis of symptomatic lung cancer. The British Journal of General Practice: The Journal of the Royal College of General Practitioners. 75 (750), pp. 40-43. https://doi.org/10.3399/bjgp25X740493
Artificial intelligence (AI) for paediatric fracture detection: a multireader multicase (MRMC) study protocol
Shelmerdine, S., Pauling, Cato, Allan, Emma, Langan, Dean, Ashworth, Emily, Yung, Ka-Wai, Barber, Joy, Haque, Saira, Rosewarne, David, Woznitza, N., Ather, S., Novak, A., Theivendran, Kanthan and Arthurs, O. 2024. Artificial intelligence (AI) for paediatric fracture detection: a multireader multicase (MRMC) study protocol. BMJ Open. 14 (12), p. e084448. https://doi.org/10.1136/bmjopen-2024-084448
Evaluating the impact of artificial intelligence-assisted image analysis on the diagnostic accuracy of front-line clinicians in detecting fractures on plain X-rays (FRACT-AI): protocol for a prospective observational study
Novak, A., Hollowday, Max, Espinosa Morgado, A., Oke, Jason, Shelmerdine, S., Woznitza, N., Metcalfe, David, Costa, Matthew L, Wilson, S., Kiam, Jian Shen, Vaz, J., Limphaibool, N., Ventre, Jeanne, Jones, Daniel, Greenhalgh, Lois, Gleeson, Fergus, Welch, Nick, Mistry, Alpesh, Devic, Natasa, Teh, James and Ather, S. 2024. Evaluating the impact of artificial intelligence-assisted image analysis on the diagnostic accuracy of front-line clinicians in detecting fractures on plain X-rays (FRACT-AI): protocol for a prospective observational study. BMJ Open. 14 (9), p. e086061. https://doi.org/10.1136/bmjopen-2024-086061
A survey of the NHS reporting radiographer workforce in England
Lockwood, P., Burton, C., Shaw, T. and Woznitza, N. 2024. A survey of the NHS reporting radiographer workforce in England. Radiography Open. 10 (1), pp. 1-18. https://doi.org/10.7577/radopen.5635
Reporting radiographers within the European Federation of Radiographer Society (EFRS) member countries - motivation for becoming a reporting radiographer
Jensen, J, Blackburn, P A, Gale, N, Senior, C, Woznitza, N, Heales, C J and Pedersen, M R V 2024. Reporting radiographers within the European Federation of Radiographer Society (EFRS) member countries - motivation for becoming a reporting radiographer. Radiography. 30 (3), pp. 731-736. https://doi.org/S1078-8174(24)00055-5
AI assisted reader evaluation in acute CT head interpretation (AI-REACT): protocol for a multireader multicase study
Howell Fu, Alex Novak, Dennis Robert, Shamie Kumar, Swetha Tanamala, Jason Oke, Kanika Bhatia, Ruchir Shah, Andrea Romsauerova, Tilak Das, Abdalá Espinosa, Mariusz Tadeusz Grzeda, Mariapaola Narbone, Rahul Dharmadhikari, Mark Harrison, Kavitha Vimalesvaran, Jane Gooch, Nicholas Woznitza, Nabeeha Salik, Alan Campbell, Farhaan Khan, David J Lowe, Haris Shuaib and Sarim Ather 2024. AI assisted reader evaluation in acute CT head interpretation (AI-REACT): protocol for a multireader multicase study. BMJ Open. 14 (2), p. e079824. https://doi.org/10.1136/bmjopen-2023-079824
Reporting radiographers in Europe survey: An overview of the role within the European Federation of Radiographer Society (EFRS) member countries
Pedersen, M. R. V., Jensen, J., Senior, C., Gale, N., Heales, C. J. and Woznitza, N. 2023. Reporting radiographers in Europe survey: An overview of the role within the European Federation of Radiographer Society (EFRS) member countries. Radiography. 29 (6), pp. 1100-1107. https://doi.org/10.1016/j.radi.2023.09.005
Assessing the barriers and enablers to the implementation of the diagnostic radiographer musculoskeletal X‑ray reporting service within the NHS in England: a systematic literature review
Lockwood, P., Burton, C., Woznitza, N. and Shaw, T. 2023. Assessing the barriers and enablers to the implementation of the diagnostic radiographer musculoskeletal X‑ray reporting service within the NHS in England: a systematic literature review. BMC Health Services Research. 23 (1270), pp. 1-41. https://doi.org/10.1186/s12913-023-10161-y