Psychometric validation of the reconstructed version of the Assessment of Reasoning Tool

Med Teach. 2021 Feb;43(2):168-173. doi: 10.1080/0142159X.2020.1830960. Epub 2020 Oct 17.

Abstract

Background: Assessing learners' competence in diagnostic reasoning is challenging and not standardized in medical education. We developed a theory-informed, behaviorally anchored rubric, the Assessment of Reasoning Tool (ART), with evidence of content and response-process validity. This study gathered evidence to support the internal structure of the tool and the interpretation of measurements derived from it.

Methods: We derived a reconstructed version of the ART (ART-R) as a 15-item, 5-point Likert scale based on the ART domains and descriptors, and performed a psychometric evaluation. We created 18 video variations of learner oral presentations portraying different performance levels on the ART-R.

Results: A total of 152 faculty viewed two videos each and rated the learner with a global assessment and then with the ART-R. Confirmatory factor analysis showed favorable fit: comparative fit index (CFI) = 0.99, root mean square error of approximation (RMSEA) = 0.097, and standardized root mean square residual (SRMR) = 0.026. The five domains (hypothesis-directed information gathering, problem representation, prioritized differential diagnosis, diagnostic evaluation, and awareness of cognitive tendencies and emotional factors) had high internal consistency. The total score for each domain was positively associated with the global assessment of diagnostic reasoning.
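
As an illustration of the type of analysis reported above, the sketch below shows how a five-factor confirmatory factor model for a 15-item instrument might be specified and its fit indices computed in Python with the semopy package. The three-items-per-domain assignment, the item and factor names, the placeholder data, and the choice of package are assumptions for illustration only, not the authors' actual analysis.

    # Illustrative sketch: five-factor CFA for a 15-item Likert instrument,
    # assuming three items per ART-R domain (item-to-domain assignment,
    # names, and data are hypothetical placeholders).
    import numpy as np
    import pandas as pd
    import semopy

    # Placeholder ratings: 152 raters x 15 items on a 1-5 Likert scale.
    rng = np.random.default_rng(0)
    data = pd.DataFrame(
        rng.integers(1, 6, size=(152, 15)),
        columns=[f"item{i}" for i in range(1, 16)],
    )

    # Each latent domain is measured by three items (hypothetical assignment),
    # written in semopy's lavaan-style model syntax.
    model_desc = """
    info_gathering         =~ item1 + item2 + item3
    problem_representation =~ item4 + item5 + item6
    differential_diagnosis =~ item7 + item8 + item9
    diagnostic_evaluation  =~ item10 + item11 + item12
    cognitive_awareness    =~ item13 + item14 + item15
    """

    model = semopy.Model(model_desc)
    model.fit(data)

    # Fit indices such as the CFI and RMSEA are reported by calc_stats.
    stats = semopy.calc_stats(model)
    print(stats[["CFI", "RMSEA"]])

With real rating data in place of the random placeholder, the printed indices would correspond to the kind of fit statistics reported in the Results.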

Conclusions: Our findings provide validity evidence for the ART-R as an assessment tool with five theoretical domains, high internal consistency, and a positive association with global assessment of diagnostic reasoning.

Keywords: Clinical reasoning; clinical preceptor; competency-based assessment; diagnostic errors; feedback.

MeSH terms

  • Diagnosis, Differential
  • Education, Medical*
  • Factor Analysis, Statistical
  • Humans
  • Problem Solving*
  • Psychometrics
  • Reproducibility of Results