Stain color translation of multi-domain OSCC histopathology images using attention gated cGAN

Comput Med Imaging Graph. 2023 Jun;106:102202. doi: 10.1016/j.compmedimag.2023.102202. Epub 2023 Feb 24.

Abstract

Oral Squamous Cell Carcinoma (OSCC) is the most prevalent type of oral cancer worldwide. Histopathology examination is the gold standard for OSCC diagnosis, where stained histopathology slides allow cell structures to be studied and analyzed under a microscope to determine the stage and grade of OSCC. Hematoxylin and Eosin (H&E) staining is widely used to produce differential coloration, highlight key tissue features, and improve contrast, which makes cell analysis easier. However, stained H&E histopathology images exhibit inter- and intra-variation due to staining techniques, incubation times, and staining reagents. These variations negatively impact the accuracy and development of computer-aided diagnosis (CAD) and machine learning algorithms. A pre-processing procedure called stain normalization must therefore be employed to reduce the negative impact of stain variance. Numerous state-of-the-art stain normalization methods have been introduced; however, a robust multi-domain stain normalization approach is still required because, in real-world settings, OSCC histopathology images contain more than two color variations spanning several domains. In this paper, a multi-domain stain translation method is proposed. The proposed method is an attention-gated generator based on a Conditional Generative Adversarial Network (cGAN) with a novel objective function that enforces color distribution and perceptual resemblance between the source and target domains. Instead of using WSI scanner images like previous techniques, the proposed method is evaluated on OSCC histopathology images acquired with several conventional microscopes coupled with cameras. In inference mode, the proposed method receives the L* channel from the L*a*b* color space and generates the color-adapted G(a*b*) channels. The proposed technique uses mappings learned during the training phase to translate the source domain to the target domain; these mappings are learned from the whole color distribution of the target domain rather than from a single reference image. The proposed technique outperforms four state-of-the-art methods in multi-domain OSCC histopathology translation, a claim supported by both quantitative and qualitative evaluation.
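The inference procedure described above (keep the structural L* channel, synthesize color-adapted a*b* channels, recombine) can be illustrated with a minimal sketch. This is not the authors' implementation: the `generator` callable and the `stain_translate` helper are hypothetical placeholders for the trained attention-gated cGAN generator; only the L*a*b* channel handling follows the abstract.

```python
# Minimal sketch of L*a*b*-based stain translation at inference time.
# Assumes a trained generator that maps an L* channel to a*b* channels
# of the target stain domain (hypothetical interface).
import numpy as np
from skimage import color  # rgb2lab / lab2rgb conversions


def stain_translate(rgb_image, generator):
    """Translate the stain appearance of an H&E patch to the target domain.

    rgb_image : float array in [0, 1], shape (H, W, 3)
    generator : callable taking the L* channel (H, W) and returning
                predicted a*/b* channels (H, W, 2) -- hypothetical interface.
    """
    lab = color.rgb2lab(rgb_image)      # L* in [0, 100]; a*, b* roughly [-128, 127]
    L = lab[..., 0]                     # luminance carries the tissue structure
    ab_pred = generator(L)              # color-adapted a*b* channels, shape (H, W, 2)
    lab_out = np.dstack([L, ab_pred])   # keep the original L*, replace the chroma
    return np.clip(color.lab2rgb(lab_out), 0.0, 1.0)
```

Because only the chrominance (a*, b*) channels are regenerated while the original L* channel is preserved, tissue morphology is left intact and only the stain color is adapted, which is the stated goal of the translation.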

Keywords: Attention gated generator; Conditional generative adversarial network (cGAN); H&E stain normalization; Histopathology; OSCC; Stain translation.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Carcinoma, Squamous Cell* / diagnostic imaging
  • Color
  • Coloring Agents / chemistry
  • Head and Neck Neoplasms*
  • Humans
  • Image Processing, Computer-Assisted / methods
  • Mouth Neoplasms* / diagnostic imaging
  • Squamous Cell Carcinoma of Head and Neck

Substances

  • Coloring Agents