Interobserver Reliability in Imaging-Based Fracture Union Assessment: Two Systematic Reviews

J Orthop Trauma. 2020 Jan;34(1):e31-e37. doi: 10.1097/BOT.0000000000001599.

Abstract

Objectives: (A) To investigate the specialty of observers involved in imaging-based assessment of bone fracture union in recent orthopaedic trials and (B) to provide a general overview of observer differences (in terms of interobserver reliability) in radiologic fracture union assessment that have been reported between surgeons and radiologists.

Data sources: Two separate systematic reviews (A, B) of English-, German-, and French-language articles in the MEDLINE and Embase databases were performed following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, with the following time frames: (A) January 2016-August 2017 and (B) through November 2017.

Study selection: (A) Clinical trials of surgical fracture treatment evaluating radiologic (non)union. (B) Interobserver studies reporting kappa values or intraclass correlation coefficients as the reliability coefficient for radiologic fracture union assessment. Inclusion criteria for both reviews were fractures of the appendicular skeleton and the use of radiographs or computed tomography.
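The kappa statistic used by the included interobserver studies quantifies agreement between two observers beyond chance, as kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected from each observer's marginal rating frequencies. A minimal sketch of Cohen's kappa for two observers rating union versus nonunion, with entirely hypothetical ratings for illustration:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two observers' categorical ratings.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the chance agreement from the marginals.
    """
    assert len(ratings_a) == len(ratings_b) and ratings_a
    n = len(ratings_a)
    # Observed agreement: fraction of cases rated identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement from each observer's marginal frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of 10 fractures by two observers (not study data).
surgeon = ["union", "union", "nonunion", "union", "nonunion",
           "union", "union", "nonunion", "union", "union"]
radiologist = ["union", "nonunion", "nonunion", "union", "union",
               "union", "union", "nonunion", "union", "union"]
kappa = cohens_kappa(surgeon, radiologist)
```

In this toy example the observers agree on 8 of 10 cases, yet kappa is only about 0.52, below the 0.7 threshold the review treats as satisfactory, illustrating how chance-corrected agreement can be low despite high raw agreement.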

Data extraction: Data were independently retrieved by 2 reviewers.

Data synthesis: Descriptive statistics and percentages were reported.

Results: (A) Forty-eight trials were included, of which 33 (68%) did not report the observers' specialty. Six trials (13%) reported surgeon observers only, and 6 (13%) reported radiologist observers only. The median number of observers was 1 (interquartile range, 1-2). (B) Thirty-one interobserver studies were included, of which 11 (35%) included at least 1 surgeon and 1 radiologist. Interobserver reliability varied considerably across the fracture types studied and outcome scales used and was often unsatisfactory (kappa or intraclass correlation coefficients of <0.7).

Conclusions: In most trials reporting observer characteristics, radiologic fracture union was rated by either 1 surgeon or 1 radiologist. As interobserver reliability can be unsatisfactory, we recommend that surgeons and radiologists further intensify their collaboration and that trials include at least 2 observers and report the associated reliability statistics.

Publication types

  • Systematic Review

MeSH terms

  • Fractures, Bone* / diagnostic imaging
  • Fractures, Bone* / surgery
  • Humans
  • Observer Variation
  • Radiography
  • Reproducibility of Results
  • Tomography, X-Ray Computed