Autosegmentation of Prostate Zones and Cancer Regions from Biparametric Magnetic Resonance Images by Using Deep-Learning-Based Neural Networks

Sensors (Basel). 2021 Apr 12;21(8):2709. doi: 10.3390/s21082709.

Abstract

The accuracy of prostate cancer (PCa) diagnosis has increased with the development of multiparametric magnetic resonance imaging (mpMRI). Biparametric magnetic resonance imaging (bpMRI) has been found to have a diagnostic accuracy comparable to that of mpMRI in detecting PCa. However, prostate MRI assessment relies on expert readers with specialized training and exhibits considerable inter-reader variability. Deep learning may offer a more robust approach to prostate MRI assessment. Here we present a method for autosegmenting the prostate zones and cancer region by using SegNet, a deep convolutional neural network (DCNN) model. We used the PROSTATEx dataset to train the model and combined different sequences into the three channels of a single image. For each subject, all slices containing the transition zone (TZ), peripheral zone (PZ), and PCa region were selected. The datasets were produced using different combinations of images, including T2-weighted (T2W) images, diffusion-weighted images (DWI), and apparent diffusion coefficient (ADC) images. Among these groups, the T2W + DWI + ADC combination exhibited the best performance, with a Dice similarity coefficient of 90.45% for the TZ, 70.04% for the PZ, and 52.73% for the PCa region. Image sequence analysis with a DCNN model has the potential to assist PCa diagnosis.
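The Dice similarity coefficient (DSC) reported above is the standard overlap metric for segmentation: DSC = 2|A ∩ B| / (|A| + |B|), where A is the predicted mask and B the ground-truth mask. A minimal sketch of this computation on binary masks (the masks and values here are illustrative, not taken from the paper):

```python
def dice_coefficient(pred, target):
    """Dice similarity coefficient between two binary masks,
    given as flat sequences of 0/1 labels:
        DSC = 2 * |A intersect B| / (|A| + |B|)
    Returns 1.0 when both masks are empty (perfect trivial agreement)."""
    assert len(pred) == len(target), "masks must have the same size"
    intersection = sum(1 for p, t in zip(pred, target) if p and t)
    total = sum(pred) + sum(target)
    return 2.0 * intersection / total if total else 1.0

# Illustrative example: prediction and ground truth each mark 4 pixels,
# of which 3 overlap -> DSC = 2*3 / (4+4) = 0.75
pred  = [1, 1, 1, 1, 0, 0]
truth = [1, 1, 1, 0, 1, 0]
print(dice_coefficient(pred, truth))  # 0.75
```

In practice the masks would be 2-D (or 3-D) arrays per zone (TZ, PZ, PCa), flattened before scoring; the per-zone DSC values in the abstract are averages of this quantity over the test set.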

Keywords: ADC; DCNN; DWI; SegNet; T2W; encoder–decoder architecture; zonal segmentation.

MeSH terms

  • Deep Learning*
  • Diffusion Magnetic Resonance Imaging
  • Humans
  • Magnetic Resonance Imaging
  • Male
  • Multiparametric Magnetic Resonance Imaging*
  • Neural Networks, Computer
  • Prostate / diagnostic imaging
  • Prostatic Neoplasms* / diagnostic imaging