Fusing correspondenceless 3D point distribution models

Med Image Comput Comput Assist Interv. 2013;16(Pt 1):251-8. doi: 10.1007/978-3-642-40811-3_32.

Abstract

This paper presents a framework for the fusion of multiple point distribution models (PDMs) with unknown point correspondences. With this framework, models built from distinct patient groups and imaging modalities can be merged, with the aim of obtaining a PDM that encodes a wider range of anatomical variability. Two technical challenges are addressed. Firstly, the fusion must be carried out directly on the models' means and eigenvectors, as the original data are not always available and cannot be freely exchanged across centers for legal and practical reasons. Secondly, the PDMs must be normalized before fusion because the point correspondence between them is unknown. The proposed framework is validated by integrating statistical models of the left and right ventricles of the heart constructed from different imaging modalities (MRI and CT) and with different landmark representations of the data. The results show that the integration is statistically and anatomically meaningful and that the quality of the resulting model is significantly improved.
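
The idea of fusing PDMs from their summary statistics alone can be illustrated with a minimal sketch. The code below assumes the two models have already been normalized onto corresponding landmarks (the paper's correspondence-handling step is not reproduced here), and all function and variable names are illustrative rather than taken from the paper: each model contributes only its mean, eigenvectors, eigenvalues, and training-set size, from which a pooled covariance is rebuilt and re-decomposed.

```python
import numpy as np

def merge_pdms(mu1, phi1, lam1, n1, mu2, phi2, lam2, n2, n_modes=None):
    """Merge two PCA-based point distribution models given only their
    means, eigenvectors and eigenvalues (no original training shapes).

    mu*:  (d,)    mean shape vectors
    phi*: (d, k*) eigenvector matrices (columns are modes)
    lam*: (k*,)   eigenvalues (mode variances)
    n*:   number of training shapes behind each model
    """
    n = n1 + n2
    mu = (n1 * mu1 + n2 * mu2) / n                      # pooled mean

    # Reconstruct each model's low-rank covariance from its eigensystem.
    c1 = phi1 @ np.diag(lam1) @ phi1.T
    c2 = phi2 @ np.diag(lam2) @ phi2.T

    # Pooled covariance plus the between-model mean-shift term.
    diff = (mu1 - mu2).reshape(-1, 1)
    cov = (n1 * c1 + n2 * c2) / n + (n1 * n2 / n**2) * (diff @ diff.T)

    # Eigendecompose the fused covariance to obtain the merged PDM.
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1]
    eigval, eigvec = eigval[order], eigvec[:, order]
    if n_modes is not None:
        eigval, eigvec = eigval[:n_modes], eigvec[:, :n_modes]
    return mu, eigvec, eigval
```

This reflects the general eigenspace-merging strategy for combining PCA models without raw data; the paper's specific contribution, normalizing PDMs with unknown point correspondences before such a fusion, requires additional steps not shown in this sketch.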

MeSH terms

  • Anatomic Landmarks / diagnostic imaging*
  • Anatomic Landmarks / pathology*
  • Computer Simulation
  • Humans
  • Imaging, Three-Dimensional / methods*
  • Magnetic Resonance Imaging / methods
  • Models, Cardiovascular
  • Models, Statistical
  • Multimodal Imaging / methods*
  • Pattern Recognition, Automated / methods*
  • Statistical Distributions
  • Subtraction Technique*
  • Tomography, X-Ray Computed / methods
  • Ventricular Dysfunction / diagnosis*