The Use of Artificial Intelligence for the Classification of Craniofacial Deformities

J Clin Med. 2023 Nov 14;12(22):7082. doi: 10.3390/jcm12227082.

Abstract

Positional cranial deformities are a common finding in toddlers, yet differentiating them from craniosynostosis can be challenging. The aim of this study was to train convolutional neural networks (CNNs) to classify craniofacial deformities based on 2D images generated using photogrammetry as a radiation-free imaging technique. A total of 487 patients with photogrammetry scans were included in this retrospective cohort study: children with craniosynostosis (n = 227), children with positional deformities (n = 206), and healthy children (n = 54). Three two-dimensional images were extracted from each photogrammetry scan, and the datasets were divided into training, validation, and test sets. Fine-tuned ResNet-152 networks were used for training, and performance was quantified using tenfold cross-validation. For the detection of craniosynostosis, sensitivity was 0.94 with a specificity of 0.85. For the differentiation of the five classes (trigonocephaly, scaphocephaly, positional plagiocephaly left, positional plagiocephaly right, and healthy), sensitivity ranged from 0.45 (positional plagiocephaly left) to 0.95 (scaphocephaly), and specificity ranged from 0.87 (positional plagiocephaly right) to 0.97 (scaphocephaly). We present a CNN-based approach that classifies craniofacial deformities on two-dimensional images with promising results. A larger dataset would be required to identify rarer forms of craniosynostosis as well. The chosen 2D approach enables future applications on digital cameras or smartphones.
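The abstract names the key methodological ingredients: pretrained ResNet-152 backbones fine-tuned on 2D photogrammetry renderings, with five output classes. The sketch below is not the authors' code; it is a minimal PyTorch/torchvision illustration of that kind of fine-tuning, in which the directory layout, class folder names, image size, and hyperparameters are illustrative assumptions.

    # Minimal sketch (not the authors' pipeline): fine-tune a pretrained
    # ResNet-152 for 5-class craniofacial deformity classification.
    # Paths, hyperparameters, and data layout are illustrative assumptions.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, models, transforms

    CLASSES = ["trigonocephaly", "scaphocephaly",
               "plagiocephaly_left", "plagiocephaly_right", "healthy"]

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Standard ImageNet preprocessing applied to the 2D renderings.
    preprocess = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    # Hypothetical directory layout: data/train/<class_name>/*.png
    train_set = datasets.ImageFolder("data/train", transform=preprocess)
    train_loader = DataLoader(train_set, batch_size=16, shuffle=True)

    # Load an ImageNet-pretrained ResNet-152 and replace the classifier head
    # so that it outputs one logit per deformity class.
    model = models.resnet152(weights=models.ResNet152_Weights.IMAGENET1K_V2)
    model.fc = nn.Linear(model.fc.in_features, len(CLASSES))
    model = model.to(device)

    criterion = nn.CrossEntropyLoss()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

    model.train()
    for epoch in range(10):  # illustrative epoch count
        for images, labels in train_loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimizer.step()

From the predictions of such a model, per-class sensitivity and specificity of the kind reported above would typically be computed one-vs-rest on the held-out data of each fold of the tenfold cross-validation.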

Keywords: artificial intelligence; congenital abnormalities; craniosynostoses; deep learning; photogrammetry; trigonocephaly.

Grants and funding

This work was funded by HEiKA, the strategic partnership of Heidelberg University and the Karlsruhe Institute of Technology (KIT), for the period 01/2020–12/2020 as part of the research bridge “Medical Technology for Health” under research grant HEiKA_19-17.