The Kappa Paradox Explained

J Hand Surg Am. 2024 May;49(5):482-485. doi: 10.1016/j.jhsa.2024.01.006. Epub 2024 Feb 17.

Abstract

Observer reliability studies for fracture classification systems evaluate agreement using Cohen's κ and absolute agreement as outcome measures. Cohen's κ is a chance-corrected measure of agreement and can range from 0 (agreement no better than chance) to 1 (perfect agreement). Absolute agreement is the percentage of cases in which observers assign the same rating. Some studies report high absolute agreement but a relatively low κ value, which is counterintuitive. This phenomenon is referred to as the Kappa Paradox. The objective of this article was to explain the statistical phenomenon of the Kappa Paradox and to help readers and researchers recognize and prevent it.
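The paradox is easiest to see with a small worked example. The sketch below uses a hypothetical two-observer, two-category rating table (illustrative counts, not data from the article) in which both observers label almost every fracture "simple": absolute agreement is high, yet the skewed marginal totals push the chance-expected agreement up and κ down.

```python
# Minimal sketch of the Kappa Paradox with hypothetical counts.
# Two observers classify 100 fractures as "simple" or "complex".

# Confusion matrix: rows = observer A, columns = observer B
#                 B: simple   B: complex
counts = [
    [90, 5],   # A: simple
    [4,  1],   # A: complex
]

n = sum(sum(row) for row in counts)

# Observed (absolute) agreement: proportion of cases on the diagonal
p_o = sum(counts[i][i] for i in range(len(counts))) / n

# Chance-expected agreement, computed from the marginal totals
row_totals = [sum(row) for row in counts]
col_totals = [sum(col) for col in zip(*counts)]
p_e = sum((row_totals[i] / n) * (col_totals[i] / n) for i in range(len(counts)))

# Cohen's kappa: agreement corrected for chance
kappa = (p_o - p_e) / (1 - p_e)

print(f"Absolute agreement: {p_o:.2%}")    # 91.00%
print(f"Expected by chance: {p_e:.2%}")    # 89.60%
print(f"Cohen's kappa:      {kappa:.2f}")  # ~0.13
```

Because nearly all cases fall into one category, chance-expected agreement is already about 90%, so even 91% observed agreement yields a κ near 0.13: high absolute agreement, low κ.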

Keywords: Cohen; fracture classification; kappa; statistics.

Publication types

  • Review

MeSH terms

  • Fractures, Bone* / classification
  • Humans
  • Observer Variation
  • Reproducibility of Results