[Do artificial intelligence systems reason in the same way as clinicians when making diagnoses?]

Rev Med Interne. 2020 Mar;41(3):192-195. doi: 10.1016/j.revmed.2019.12.014. Epub 2020 Jan 25.
[Article in French]

Abstract

Clinical reasoning is at the heart of physicians' competence, as it allows them to make diagnoses. However, diagnostic errors are common, owing to reasoning biases. It is in this context that artificial intelligence is undergoing unprecedented development. It is increasingly seen as a way to improve physicians' diagnostic performance, or even to perform this task for them, fully autonomously and more efficiently. To understand the challenges associated with the development of artificial intelligence, it is important to understand how the machine arrives at diagnoses, how this resembles and differs from the physician's diagnostic reasoning, and what the consequences are for medical training and practice.

Keywords: Artificial intelligence; Clinical reasoning; Cognitive biases; Deep neural networks; Intuition; Neural networks.

Publication types

  • Review

MeSH terms

  • Artificial Intelligence*
  • Clinical Reasoning*
  • Decision Making / physiology
  • Diagnosis, Computer-Assisted* / psychology
  • Diagnosis, Computer-Assisted* / standards
  • Diagnosis, Computer-Assisted* / statistics & numerical data
  • Diagnostic Errors / psychology
  • Diagnostic Errors / statistics & numerical data
  • Diagnostic Techniques and Procedures* / psychology
  • Diagnostic Techniques and Procedures* / standards
  • Diagnostic Techniques and Procedures* / statistics & numerical data
  • Humans
  • Intuition / physiology
  • Physicians / psychology*
  • Physicians / statistics & numerical data
  • Practice Patterns, Physicians' / standards
  • Practice Patterns, Physicians' / statistics & numerical data
  • Prejudice / psychology