Competency-based education calls for programmatic assessment: But what does this look like in practice?

J Eval Clin Pract. 2020 Aug;26(4):1087-1095. doi: 10.1111/jep.13328. Epub 2019 Dec 9.

Abstract

Rationale, aims, and objectives: Programmatic assessment has been identified as a system-oriented approach to achieving the multiple purposes of assessment within Competency-Based Medical Education (CBME), i.e., formative, summative, and program improvement. While there are well-established principles for designing and evaluating programs of assessment, few studies illustrate and critically interpret what a system of programmatic assessment looks like in practice. This study aims to use systems thinking and the 'two communities' metaphor to interpret a model of programmatic assessment and to identify challenges and opportunities in operationalizing it.

Method: An interpretive case study was used to investigate how programmatic assessment is being operationalized within one competency-based residency program at a Canadian university. Qualitative data were collected from residents, faculty, and program leadership via semi-structured group and individual interviews conducted nine months after CBME implementation. Data were analyzed using a combination of data-based inductive analysis and theory-derived deductive analysis.

Results: In this model, Academic Advisors played a central role in brokering assessment data between the communities responsible for producing and using residents' performance information for decision making (i.e., formative, summative/evaluative, and program improvement). As system intermediaries, Academic Advisors were in a privileged position to see how the parts of the assessment system contributed to the functioning of the whole and could identify which system components were not functioning as intended. Challenges were identified with the documentation of residents' performance information (i.e., system inputs); the use of low-stakes formative assessments to inform high-stakes evaluative judgments about the achievement of competence standards; and gaps in feedback mechanisms for closing learning loops.

Conclusions: The findings of this research suggest that program stakeholders can benefit from a systems perspective on how their assessment practices contribute to the efficacy of the system as a whole. Academic Advisors are well positioned to support educational development efforts focused on overcoming challenges in operationalizing programmatic assessment.

Keywords: case study; competency-based medical education; programmatic assessment; qualitative; system of assessment; systems thinking.

MeSH terms

  • Canada
  • Clinical Competence
  • Competency-Based Education*
  • Feedback
  • Humans
  • Internship and Residency*
  • Learning