Cross-Modal Prostate Cancer Segmentation via Self-Attention Distillation

IEEE J Biomed Health Inform. 2022 Nov;26(11):5298-5309. doi: 10.1109/JBHI.2021.3127688. Epub 2022 Nov 10.

Abstract

Automatic and accurate segmentation of prostate cancer from multi-modal magnetic resonance images is of prime importance for disease assessment and follow-up treatment planning. However, how to exploit multi-modal image features effectively remains a challenging problem in medical image segmentation. In this paper, we develop a cross-modal self-attention distillation network that fully exploits the information encoded in the intermediate layers of different modalities; the attention maps generated from each modality enable the model to transfer significant, discriminative, and detail-rich information across modalities. Moreover, a novel spatially correlated feature fusion module is employed to learn complementary correlation and non-linear information between the modality images. We evaluate our model with five-fold cross-validation on 358 biopsy-confirmed MRI studies. Without bells and whistles, the proposed network achieves state-of-the-art performance in extensive experiments.
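
The abstract does not specify the implementation, so the following is only a minimal sketch of cross-modal attention distillation between two modality branches. It assumes (hypothetically) that a spatial attention map is derived from each intermediate encoder feature by channel-wise squaring and L2 normalization, in the style of attention-transfer distillation; all names (spatial_attention_map, cross_modal_attention_distillation_loss, feats_t2, feats_adc) are illustrative and not the authors' API.

    import torch
    import torch.nn.functional as F

    def spatial_attention_map(feat: torch.Tensor) -> torch.Tensor:
        """Collapse a feature map (B, C, H, W) to a normalized spatial
        attention vector (B, H*W) by summing squared activations over
        channels (attention-transfer-style map; assumed formulation)."""
        att = feat.pow(2).sum(dim=1)          # (B, H, W)
        att = att.flatten(1)                  # (B, H*W)
        return F.normalize(att, p=2, dim=1)   # unit-norm per sample

    def cross_modal_attention_distillation_loss(feats_mod_a, feats_mod_b):
        """Encourage intermediate layers of one modality branch (e.g. T2w)
        to produce attention maps similar to those of another modality
        branch (e.g. ADC/DWI), layer by layer."""
        loss = 0.0
        for fa, fb in zip(feats_mod_a, feats_mod_b):
            loss = loss + F.mse_loss(spatial_attention_map(fa),
                                     spatial_attention_map(fb))
        return loss / len(feats_mod_a)

    # Illustrative usage with random intermediate encoder features.
    if __name__ == "__main__":
        feats_t2  = [torch.randn(2, 64, 32, 32), torch.randn(2, 128, 16, 16)]
        feats_adc = [torch.randn(2, 64, 32, 32), torch.randn(2, 128, 16, 16)]
        print(cross_modal_attention_distillation_loss(feats_t2, feats_adc))

In this sketch the distillation term is symmetric in the two feature lists and uses a simple MSE between attention maps; the paper's actual attention formulation, distillation direction, and fusion module may differ.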

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Attention
  • Distillation*
  • Humans
  • Image Processing, Computer-Assisted / methods
  • Magnetic Resonance Imaging / methods
  • Male
  • Prostatic Neoplasms* / diagnostic imaging