DBU-Net: Dual branch U-Net for tumor segmentation in breast ultrasound images

PLoS One. 2023 Nov 6;18(11):e0293615. doi: 10.1371/journal.pone.0293615. eCollection 2023.

Abstract

Breast ultrasound images often suffer from low imaging quality and unclear target boundaries, making it challenging for physicians to accurately identify and outline tumors during diagnosis. Since precise segmentation is crucial for diagnosis, there is a strong need for an automated method that improves segmentation accuracy and can serve as a technical aid in diagnosis. Recently, U-Net and its variants have achieved great success in medical image segmentation. In this study, drawing inspiration from the U-Net concept, we propose a new variant of the U-Net architecture, called DBU-Net, for tumor segmentation in breast ultrasound images. To enhance the feature extraction capability of the encoder, we introduce a novel approach that uses two distinct encoding paths: the first path takes the original image, while the second takes an edge-highlighted image produced by the Roberts edge filter. This dual-branch encoding strategy extracts semantically rich information through a mutually informative learning process. At each level of the encoder, both branches independently pass through two convolutional layers followed by a pooling layer. To facilitate cross-learning between the branches, a weighted addition scheme is implemented, with the weights learned dynamically from the gradient of the loss function. We evaluate the proposed DBU-Net on two datasets, BUSI and UDIAT, and our experimental results demonstrate superior performance compared to state-of-the-art models.
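The two ingredients of the dual-branch design described above can be sketched briefly. Below is a minimal NumPy illustration of the Roberts cross operator that produces the edge-highlighted input for the second branch, together with a placeholder weighted-addition fusion. Note the assumptions: the function names (`roberts_edges`, `fuse`) and the fixed fusion weights are illustrative only; in the actual DBU-Net the fusion weights are learned via backpropagation, and the branches operate on convolutional feature maps, not raw pixels.

```python
import numpy as np

def roberts_edges(img):
    """Roberts cross gradient magnitude of a 2-D grayscale image.

    The two 2x2 kernels [[1,0],[0,-1]] and [[0,1],[-1,0]] reduce to
    simple diagonal pixel differences, so no explicit convolution is
    needed. The output is one pixel smaller in each dimension.
    """
    img = img.astype(float)
    gx = img[:-1, :-1] - img[1:, 1:]   # response of [[1, 0], [0, -1]]
    gy = img[:-1, 1:] - img[1:, :-1]   # response of [[0, 1], [-1, 0]]
    return np.sqrt(gx ** 2 + gy ** 2)

def fuse(feat_a, feat_b, w_a=0.5, w_b=0.5):
    """Weighted addition of two branch feature maps.

    In DBU-Net these weights are learned from the gradient of the loss;
    here they are fixed placeholders for illustration.
    """
    return w_a * feat_a + w_b * feat_b

# Synthetic "image": a bright square on a dark background.
img = np.zeros((6, 6))
img[2:4, 2:4] = 1.0

edges = roberts_edges(img)             # strong responses along the square's border
fused = fuse(img[:-1, :-1], edges)     # cropped original fused with its edge map
```

The diagonal differencing makes the Roberts operator cheap and well suited to highlighting the faint tumor boundaries that the paper identifies as the main difficulty in breast ultrasound.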

MeSH terms

  • Animals
  • Cognition
  • Female
  • Humans
  • Image Processing, Computer-Assisted
  • Learning
  • Mammary Neoplasms, Animal*
  • Ultrasonography
  • Ultrasonography, Mammary*

Grants and funding

The author(s) received no specific funding for this work.