A New Teaching Tool for Peer Review of Charting and Care in the Emergency Department

Jt Comm J Qual Patient Saf. 2023 Feb;49(2):105-110. doi: 10.1016/j.jcjq.2022.10.007. Epub 2022 Nov 3.

Abstract

Poor documentation, incomplete medical decision-making, missing progress notes, and inappropriate care play a major role in medical malpractice cases. We introduced a new quality improvement (QI) process focused on evaluating and improving documentation and clinical care. We hypothesized that a modified, simplified QI scoring rubric would demonstrate inter-rater reliability among attending physicians and provide a useful new standardized tool for both departmental QI review and peer review. We modified a previously developed rubric template, which had demonstrated high inter-rater reliability, into a more streamlined, simpler, and more broadly applicable form. The new system uses three discrete templated sections, each limited to five response options. Eight experienced attending physicians evaluated the same 10 charts using our scoring rubrics. Consistency among raters was assessed using Shrout-Fleiss (relative, fixed-set) mean kappa scores. Our statistical analysis found excellent consistency among experienced raters for both the documentation (κ = 0.91) and clinical care (κ = 0.84) scoring tools. We conclude that a modified, simplified QI scoring rubric demonstrates inter-rater reliability among experienced attending physicians. We believe this tool can serve as a standardized instrument for departmental review by experienced quality leaders, and can be used by faculty to provide peer review while improving their own charting. We further used the tool for peer review by having attending staff review a specified number of charts against our modified template's explicit criteria, so that they could provide feedback while gaining a better understanding of the elements of a "good" chart and of opportunities for improved care and resource utilization. Using this tool, we provided more than 50 attending physicians with numerical and descriptive summative feedback on their charting from a group of their peers.
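The Shrout-Fleiss fixed-set reliability statistic the abstract cites corresponds to a two-way intraclass correlation with a fixed panel of raters; a common form is ICC(3,k), the consistency of the mean score across k fixed raters. The following is a minimal illustrative sketch of that computation from a charts-by-raters score matrix; the function name and the small example matrix are hypothetical, not the study's data or code.

```python
# Hypothetical sketch: Shrout-Fleiss ICC(3,k) -- consistency of the mean
# rating from a fixed set of k raters. Rows are charts, columns are raters.
def icc3k(scores):
    n = len(scores)       # number of charts (targets)
    k = len(scores[0])    # number of raters (fixed set)
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # between charts
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # between raters
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_err = ss_total - ss_rows - ss_cols                    # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / ms_rows

# Illustrative only: 4 charts each scored 1-5 by 3 raters.
charts = [[4, 5, 4],
          [2, 2, 3],
          [5, 5, 5],
          [3, 3, 4]]
print(round(icc3k(charts), 2))  # high agreement -> value near 1
```

In practice such coefficients are usually obtained from a statistics package (for example, the ICC routines in R's psych package, which use the same Shrout-Fleiss "fixed set" terminology); the hand computation above just makes the mean-squares decomposition explicit.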

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Documentation
  • Emergency Service, Hospital
  • Health Personnel*
  • Humans
  • Peer Review*
  • Reproducibility of Results