The Development of Expertise in Radiology: In Chest Radiograph Interpretation, "Expert" Search Pattern May Predate "Expert" Levels of Diagnostic Accuracy for Pneumothorax Identification

Radiology. 2016 Jul;280(1):252-60. doi: 10.1148/radiol.2016150409. Epub 2016 Jan 27.

Abstract

Purpose
To investigate the development of chest radiograph interpretation skill through medical training by measuring both diagnostic accuracy and eye movements during visual search.

Materials and Methods
An institutional exemption from full ethical review was granted for the study. Five consultant radiologists were deemed the reference expert group; the three remaining clinician groups comprised four radiology registrars, five senior house officers (SHOs), and six interns. Participants were shown 30 chest radiographs, 14 of which had a pneumothorax, and were asked to give their level of confidence as to whether a pneumothorax was present. Receiver operating characteristic (ROC) curve analysis was carried out on the diagnostic decisions. Eye movements were recorded with a Tobii TX300 eye tracker (Tobii Technology, Stockholm, Sweden), and four eye-tracking metrics were analyzed. Variables were compared to identify any differences between groups, with all data compared by using the Friedman nonparametric method.

Results
The average area under the ROC curve increased with experience (0.947 for consultants, 0.792 for registrars, 0.693 for SHOs, and 0.659 for interns; P = .009). A significant difference in diagnostic accuracy was found between consultants and registrars (P = .046). All four eye-tracking metrics decreased with experience, with significant differences between registrars and SHOs. Total reading time decreased with experience; it was significantly lower for registrars compared with SHOs (P = .046) and for SHOs compared with interns (P = .025).

Conclusion
Chest radiograph interpretation skill increased with experience, in terms of both diagnostic accuracy and visual search. The level of experience at which a significant difference was observed was higher for diagnostic accuracy than for eye-tracking metrics.

© RSNA, 2016. Online supplemental material is available for this article.
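The ROC analysis summarized above treats each reader's graded confidence ratings as scores against the ground truth (pneumothorax present or absent) and reports performance as the area under the ROC curve (AUC), averaged within each experience group. The sketch below illustrates that computation on hypothetical data only; the 5-point confidence scale, the simulated readers, and the use of scikit-learn's roc_auc_score are assumptions for illustration and are not details taken from the study.

```python
# Minimal sketch: per-reader AUC from graded confidence ratings,
# assuming a 5-point confidence scale and hypothetical (simulated) data,
# not the study's actual ratings or analysis software.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Ground truth for 30 radiographs: 14 with pneumothorax, 16 without.
truth = np.array([1] * 14 + [0] * 16)

def simulated_reader(skill: float) -> np.ndarray:
    """Return hypothetical confidence ratings (1-5); higher skill
    pushes ratings toward the correct end of the scale."""
    noise = rng.normal(0.0, 1.5 - skill, size=truth.size)
    scores = truth * 4 + 1 + noise  # an ideal reader rates 5 for positives, 1 for negatives
    return np.clip(np.rint(scores), 1, 5)

# Hypothetical group sizes mirror the abstract (5, 4, 5, 6 readers).
groups = {
    "consultants": [simulated_reader(1.2) for _ in range(5)],
    "registrars":  [simulated_reader(0.9) for _ in range(4)],
    "SHOs":        [simulated_reader(0.6) for _ in range(5)],
    "interns":     [simulated_reader(0.4) for _ in range(6)],
}

# Average AUC per experience group, analogous to the values reported above.
for name, readers in groups.items():
    aucs = [roc_auc_score(truth, ratings) for ratings in readers]
    print(f"{name:12s} mean AUC = {np.mean(aucs):.3f}")
```

In this kind of analysis the confidence rating is used directly as the decision variable, so the AUC reflects how well a reader's ratings rank pneumothorax cases above normal cases, independent of any single decision threshold.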

MeSH terms

  • Clinical Competence / statistics & numerical data*
  • Humans
  • Image Interpretation, Computer-Assisted / standards*
  • Pneumothorax / diagnostic imaging*
  • ROC Curve
  • Radiography, Thoracic / standards*
  • Radiologists / standards*
  • Radiology / standards
  • Reproducibility of Results