AI-clinician collaboration via disagreement prediction: A decision pipeline and retrospective analysis of real-world radiologist-AI interactions

Cell Rep Med. 2023 Oct 17;4(10):101207. doi: 10.1016/j.xcrm.2023.101207. Epub 2023 Sep 27.

Abstract

Clinical decision support tools can improve diagnostic performance or reduce variability, but they are also subject to post-deployment underperformance. Although using AI in an assistive setting mitigates many of the concerns associated with autonomous AI in medicine, systems that present all predictions equivalently fail to protect against key AI safety risks. We design a decision pipeline that supports the diagnostic model with an ecosystem of models, integrating disagreement prediction, clinical significance categorization, and prediction quality modeling to guide prediction presentation. We characterize disagreement using data from a deployed chest X-ray interpretation aid and compare clinician burden under the proposed pipeline with that under the diagnostic model in isolation. The average disagreement rate is 6.5%, and the expected burden reduction is 4.8%, even if 5% of disagreements on urgent findings receive a second read. We conclude that, in our production setting, we can adequately balance risk mitigation with clinician burden if disagreement false positives are reduced.
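
The abstract describes, at a high level, a pipeline in which auxiliary models (disagreement prediction, clinical significance categorization, and prediction quality modeling) determine how each diagnostic prediction is presented. The sketch below is a minimal illustration of such a routing step, not the paper's implementation: all names, thresholds, and presentation modes (choose_presentation, disagreement_threshold, Presentation.FLAG_FOR_REVIEW, and so on) are assumptions introduced here for clarity.

```python
from dataclasses import dataclass
from enum import Enum


class Significance(Enum):
    """Hypothetical clinical-significance categories for a finding."""
    NON_URGENT = "non_urgent"
    URGENT = "urgent"


class Presentation(Enum):
    """Hypothetical presentation modes for an AI prediction."""
    SHOW = "show"              # present the prediction as usual
    FLAG_FOR_REVIEW = "flag"   # present with a warning / request a second read
    SUPPRESS = "suppress"      # withhold a low-quality prediction


@dataclass
class PredictionContext:
    """Illustrative outputs of the ecosystem of models for a single finding."""
    finding: str
    ai_positive: bool          # diagnostic model's prediction
    p_disagreement: float      # predicted probability the radiologist will disagree
    significance: Significance # clinical-significance category of the finding
    quality_score: float       # prediction-quality score in [0, 1]


def choose_presentation(
    ctx: PredictionContext,
    disagreement_threshold: float = 0.5,  # illustrative thresholds, not from the paper
    quality_threshold: float = 0.3,
) -> Presentation:
    """Route a prediction based on disagreement, significance, and quality."""
    if ctx.quality_score < quality_threshold:
        # Withhold predictions the quality model marks as unreliable.
        return Presentation.SUPPRESS
    if (
        ctx.p_disagreement >= disagreement_threshold
        and ctx.significance is Significance.URGENT
    ):
        # Likely disagreement on an urgent finding: escalate to a second read.
        return Presentation.FLAG_FOR_REVIEW
    return Presentation.SHOW


if __name__ == "__main__":
    example = PredictionContext(
        finding="pneumothorax",
        ai_positive=True,
        p_disagreement=0.7,
        significance=Significance.URGENT,
        quality_score=0.9,
    )
    print(choose_presentation(example))  # Presentation.FLAG_FOR_REVIEW
```

Under this kind of routing, only predicted disagreements on urgent findings incur the extra second-read burden, which is one way such a pipeline could trade a small added review load against risk mitigation; the exact decision rule and thresholds used in the study are not specified in the abstract.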

Keywords: AI safety; artificial intelligence; clinical decision support; clinician workload estimation; computer-aided diagnosis; disagreement prediction; human-AI collaboration; machine learning; radiology.

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Artificial Intelligence*
  • Clinical Relevance
  • Humans
  • Medicine
  • Radiologists*
  • Retrospective Studies