Evaluating the accuracy of automated cephalometric analysis based on artificial intelligence

BMC Oral Health. 2023 Apr 1;23(1):191. doi: 10.1186/s12903-023-02881-8.

Abstract

Background: The purpose of this study was to evaluate the accuracy of AI-based automatic cephalometric landmark localization and measurement compared with computer-assisted manual analysis.

Methods: Reconstructed lateral cephalograms (RLCs) from cone-beam computed tomography (CBCT) in 85 patients were selected. Computer-assisted manual analysis (Dolphin Imaging 11.9) and AI automatic analysis (Planmeca Romexis 6.2) were used to locate 19 landmarks and obtain 23 measurements. Mean radial error (MRE) and successful detection rate (SDR) values were calculated to assess the accuracy of automatic landmark digitization. Paired t tests and Bland‒Altman plots were used to compare the differences and consistencies in cephalometric measurements between manual and automatic analysis programs.
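The two landmark-accuracy metrics used here, mean radial error (MRE) and successful detection rate (SDR), can be sketched as follows. This is an illustrative implementation only; the function name and the toy coordinates are not taken from the study.

```python
import numpy as np

def mre_and_sdr(auto_pts, manual_pts, thresholds=(1.0, 2.0, 2.5, 3.0, 4.0)):
    """Mean radial error (mm) and successful detection rates.

    auto_pts, manual_pts: (n_landmarks, 2) arrays of coordinates in mm.
    SDR at threshold t = fraction of landmarks whose radial error <= t.
    """
    radial = np.linalg.norm(np.asarray(auto_pts, float) - np.asarray(manual_pts, float), axis=1)
    mre = float(radial.mean())
    sdr = {t: float((radial <= t).mean()) for t in thresholds}
    return mre, sdr

# Toy example with three landmarks (hypothetical coordinates)
auto_xy = [[10.0, 12.0], [20.5, 18.0], [30.0, 25.0]]
manual_xy = [[10.5, 12.0], [22.0, 18.0], [30.0, 28.5]]
mre, sdr = mre_and_sdr(auto_xy, manual_xy)
```

Averaging the per-landmark SDRs over all 19 landmarks yields the summary rates reported in the Results.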

Results: The MRE for the 19 cephalometric landmarks was 2.07 ± 1.35 mm with the automatic program. The average SDRs within 1 mm, 2 mm, 2.5 mm, 3 mm and 4 mm were 18.82%, 58.58%, 71.70%, 82.04% and 91.39%, respectively. Soft tissue landmarks (1.54 ± 0.85 mm) were the most consistent, while dental landmarks (2.37 ± 1.55 mm) showed the greatest variation. In total, 15 of the 23 measurements were within the clinically acceptable level of accuracy (2 mm or 2°). The rates of agreement within the 95% limits of agreement were above 90% for all measurement parameters.
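The Bland-Altman agreement analysis behind these rates can be sketched as below; the 1.96 × SD convention defines the 95% limits of agreement for paired measurements. The function name and the sample values are illustrative assumptions, not data from the study.

```python
import numpy as np

def bland_altman_limits(manual, auto):
    """Bias, 95% limits of agreement, and fraction of paired
    differences falling inside those limits."""
    diff = np.asarray(manual, float) - np.asarray(auto, float)
    bias = float(diff.mean())
    sd = float(diff.std(ddof=1))          # sample standard deviation
    lower, upper = bias - 1.96 * sd, bias + 1.96 * sd
    within = float(((diff >= lower) & (diff <= upper)).mean())
    return bias, (lower, upper), within

# Hypothetical angular measurements (degrees) from the two programs
manual_deg = [80.0, 82.0, 79.0, 81.0]
auto_deg = [81.0, 81.0, 80.0, 80.0]
bias, limits, within = bland_altman_limits(manual_deg, auto_deg)
```

A `within` value above 0.90 corresponds to the ">90% within the 95% limits of agreement" criterion reported for every measurement parameter.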

Conclusion: The automatic analysis software produces cephalometric measurements that are nearly accurate enough for clinical work. Nevertheless, automatic cephalometry cannot yet completely replace manual tracing. Manual supervision and adjustment of automatic programs can increase accuracy and efficiency.

Keywords: Artificial intelligence; Automatic identification; Cephalometric analysis; Cone-beam CT.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Artificial Intelligence*
  • Cephalometry / methods
  • Cone-Beam Computed Tomography / methods
  • Imaging, Three-Dimensional / methods
  • Radiography
  • Reproducibility of Results
  • Software*