Inter-rater Reliability of a Clinical Documentation Rubric Within Pharmacotherapy Problem-Based Learning Courses

Am J Pharm Educ. 2020 Jul;84(7):ajpe7648. doi: 10.5688/ajpe7648.

Abstract

Objective. To evaluate a clinical documentation rubric for pharmacotherapy problem-based learning (PBL) courses by assessing inter-rater reliability (IRR) among different evaluators.

Methods. A rubric was adapted for grading student pharmacists' clinical documentation in pharmacotherapy PBL courses. Multiple faculty evaluators used the rubric to assess student pharmacists' clinical documentation. The mean rubric score given by the evaluators and the standard deviation were calculated, and intra-class correlation coefficients (ICC) were calculated to determine the IRR of the rubric.

Results. Three hundred seventeen clinical documentation submissions were scored twice by multiple evaluators using the rubric. The mean initial evaluation score was 9.1 (SD=0.9) and the mean second evaluation score was 9.1 (SD=0.9), with no significant difference between the two. The overall ICC across multiple graders was 0.7, indicating good IRR.

Conclusion. The clinical documentation rubric demonstrated overall good IRR between multiple evaluators when used in pharmacotherapy PBL courses. The rubric will undergo additional evaluation and continuous quality improvement to ensure that student pharmacists receive the formative feedback they need.
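For readers unfamiliar with how an ICC-based IRR analysis is typically run, the sketch below shows one way to compute intraclass correlation coefficients from rubric scores using the pingouin Python library. This is a minimal illustration, not the authors' analysis code: the long-format table layout, the column names, and the example scores are assumptions, and the abstract does not specify which ICC form was used.

```python
# Minimal sketch (not the study's analysis code): estimating inter-rater
# reliability of rubric scores with intraclass correlation coefficients.
# The DataFrame layout and values below are hypothetical examples.
import pandas as pd
import pingouin as pg

# Long-format scores: one row per (submission, evaluator) pair.
scores = pd.DataFrame({
    "submission": [1, 1, 2, 2, 3, 3, 4, 4],
    "evaluator":  ["A", "B", "A", "B", "A", "B", "A", "B"],
    "score":      [9.0, 9.5, 8.5, 8.0, 10.0, 9.5, 9.0, 9.0],
})

# pingouin reports the six common ICC forms (ICC1..ICC3k); a two-way,
# absolute-agreement ICC is often reported for rubric-based IRR studies.
icc = pg.intraclass_corr(
    data=scores, targets="submission", raters="evaluator", ratings="score"
)
print(icc[["Type", "Description", "ICC", "CI95%"]])
```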

Keywords: clinical documentation; evaluation; inter-rater reliability; rubric.

MeSH terms

  • Documentation / standards*
  • Education, Medical, Undergraduate / standards*
  • Education, Pharmacy / methods*
  • Educational Measurement / standards*
  • Faculty / standards
  • Formative Feedback
  • Humans
  • Problem-Based Learning / standards*
  • Reproducibility of Results
  • Students, Pharmacy