Pyramid-based self-supervised learning for histopathological image classification

Comput Biol Med. 2023 Oct:165:107336. doi: 10.1016/j.compbiomed.2023.107336. Epub 2023 Aug 11.

Abstract

Large-scale labeled datasets are crucial to the success of supervised learning in medical imaging. However, annotating histopathological images is a time-consuming and labor-intensive task that requires highly trained professionals. To address this challenge, self-supervised learning (SSL) can be used to pre-train models on large amounts of unlabeled data and transfer the learned representations to various downstream tasks. In this study, we propose a self-supervised Pyramid-based Local Wavelet Transformer (PLWT) model for effectively extracting rich image representations. The PLWT model extracts both local and global features and is pre-trained on a large number of unlabeled histopathology images in a self-supervised manner. A wavelet transform replaces average pooling in the downsampling step of the multi-head attention, substantially reducing the information lost as image features are propagated through the network. Additionally, we introduce a Local Squeeze-and-Excitation (Local SE) module, combined with an inverted residual, into the feedforward network to capture local image information. We evaluate PLWT's performance on three histopathological image datasets and demonstrate the impact of pre-training. Our experimental results indicate that PLWT with self-supervised learning is highly competitive with other SSL methods, and that the visual representations learned by SSL on domain-relevant histopathological images transfer better than those of a supervised baseline trained on ImageNet.
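The key intuition behind replacing average pooling with a wavelet can be illustrated with a minimal NumPy sketch (this is an assumption-laden illustration using a single-level 2D Haar transform, not the paper's actual implementation): average pooling keeps only the low-frequency (LL) subband, up to a constant scale, and discards the detail subbands, whereas wavelet-based downsampling retains all four subbands and is therefore invertible, i.e. lossless.

```python
import numpy as np

def haar_downsample(x):
    """Single-level 2D Haar transform on a 2D array with even sides.

    Returns the four subbands (LL, LH, HL, HH), each at half resolution.
    Hypothetical sketch: keeping all four subbands preserves all pixel
    information, unlike average pooling.
    """
    a = x[0::2, 0::2]  # top-left pixel of each 2x2 block
    b = x[0::2, 1::2]  # top-right
    c = x[1::2, 0::2]  # bottom-left
    d = x[1::2, 1::2]  # bottom-right
    ll = (a + b + c + d) / 2.0  # low-frequency approximation
    lh = (a - b + c - d) / 2.0  # horizontal detail
    hl = (a + b - c - d) / 2.0  # vertical detail
    hh = (a - b - c + d) / 2.0  # diagonal detail
    return ll, lh, hl, hh

def avg_pool_2x2(x):
    """2x2 average pooling with stride 2 (the lossy baseline)."""
    return (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4.0

x = np.arange(16, dtype=float).reshape(4, 4)
ll, lh, hl, hh = haar_downsample(x)

# Average pooling is just the LL subband up to a constant scale:
assert np.allclose(ll, 2.0 * avg_pool_2x2(x))

# The Haar transform is invertible: the original top-left pixels of each
# 2x2 block are exactly recoverable from the four subbands.
assert np.allclose((ll + lh + hl + hh) / 2.0, x[0::2, 0::2])
```

In a transformer backbone, the four subbands would typically be concatenated along the channel dimension, so spatial resolution halves (as with pooling) while no feature information is discarded.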

Keywords: Histopathological image; Pyramid-based transformer; Self-supervised learning.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Female
  • Humans
  • Labor, Obstetric*
  • Pregnancy
  • Supervised Machine Learning