Multi-modality relation attention network for breast tumor classification

Comput Biol Med. 2022 Nov;150:106210. doi: 10.1016/j.compbiomed.2022.106210. Epub 2022 Oct 12.

Abstract

Automatic breast image classification plays an important role in breast cancer diagnosis, and fusing multiple imaging modalities may improve classification performance. However, existing fusion methods concentrate on improving the discriminative ability of single-modality features and ignore the correlated information shared between modalities. To improve classification performance, this paper proposes a multi-modality relation attention network with consistency regularization for breast tumor classification using diffusion-weighted imaging (DWI) and apparent diffusion coefficient (ADC) images. Within the proposed network, a novel multi-modality relation attention module improves the discriminative ability of single-modality features by exploiting the correlation between the two modalities. In addition, a consistency regularization module constrains the ADC and DWI branches to produce consistent classification results, improving robustness to noise. Experimental results on our database demonstrate that the proposed method is effective for breast tumor classification and outperforms existing multi-modality fusion methods, achieving an AUC of 85.1%, accuracy of 86.7%, specificity of 83.3%, and sensitivity of 88.9%.
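
The abstract describes two computational ideas: a relation attention module that refines each modality's features using the correlation between ADC and DWI, and a regularizer that keeps the two branches' predictions consistent. The authors' exact implementation is not given here, so the following is a minimal PyTorch sketch under common conventions; the name CrossModalityRelationAttention, the consistency_loss function, and all shapes are illustrative assumptions, with the consistency term rendered as a symmetric KL divergence.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CrossModalityRelationAttention(nn.Module):
        # Hypothetical cross-attention sketch: queries come from one modality,
        # keys/values from the other, so each feature vector is re-weighted by
        # its correlation with the other modality (not the authors' exact module).
        def __init__(self, dim):
            super().__init__()
            self.query = nn.Linear(dim, dim)
            self.key = nn.Linear(dim, dim)
            self.value = nn.Linear(dim, dim)
            self.scale = dim ** -0.5

        def forward(self, x_a, x_b):
            # x_a, x_b: (batch, tokens, dim) features from the two modalities.
            q, k, v = self.query(x_a), self.key(x_b), self.value(x_b)
            attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
            # Residual keeps the single-modality features; the attention term
            # adds correlation-weighted information from the other modality.
            return x_a + attn @ v

    def consistency_loss(logits_adc, logits_dwi):
        # Symmetric KL divergence between the two branches' class predictions,
        # one common way to realize a prediction-consistency regularizer.
        log_p = F.log_softmax(logits_adc, dim=-1)
        log_q = F.log_softmax(logits_dwi, dim=-1)
        return 0.5 * (F.kl_div(log_p, log_q.exp(), reduction="batchmean")
                      + F.kl_div(log_q, log_p.exp(), reduction="batchmean"))

    # Example: fuse 7x7 feature maps (flattened to 49 tokens) from each branch.
    attn = CrossModalityRelationAttention(dim=256)
    f_adc, f_dwi = torch.randn(2, 49, 256), torch.randn(2, 49, 256)
    fused_adc = attn(f_adc, f_dwi)  # ADC features enriched with DWI relations
    fused_dwi = attn(f_dwi, f_adc)

Under this reading, the total training loss would combine a cross-entropy term on each branch with a weighted consistency term, so that noise corrupting one modality is penalized when it pulls the two predictions apart.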

Keywords: Breast cancer; Deep learning; Medical image classification; Multi-modality fusion; Relation learning.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animals
  • Breast
  • Breast Neoplasms* / diagnostic imaging
  • Diffusion Magnetic Resonance Imaging / methods
  • Female
  • Humans
  • Magnetic Resonance Imaging / methods
  • Mammary Neoplasms, Animal*