Triplet-Net Classification of Contiguous Stem Cell Microscopy Images

IEEE/ACM Trans Comput Biol Bioinform. 2023 May-Jun;20(3):2314-2327. doi: 10.1109/TCBB.2023.3247957. Epub 2023 Jun 5.

Abstract

Cellular microscopy imaging is a common form of data acquisition for biological experimentation. Observation of gray-level morphological features allows for the inference of useful biological information such as cellular health and growth status. Cellular colonies can contain multiple cell types, making colony-level classification very difficult. Additionally, cell types growing in a hierarchical, downstream fashion can often look visually similar despite being biologically distinct. In this paper, it is determined empirically that traditional deep Convolutional Neural Networks (CNNs) and classical object recognition techniques are not sufficient to distinguish between these subtle visual differences, resulting in misclassifications. Instead, Triplet-net CNN learning is employed in a hierarchical classification scheme to improve the model's ability to discern the distinct, fine-grained features of two commonly confused morphological image-patch classes, namely Dense and Spread colonies. The Triplet-net method improves classification accuracy over a four-class deep neural network by ∼3%, an improvement determined to be statistically significant, and also outperforms existing state-of-the-art image-patch classification approaches and standard template matching. These findings enable accurate classification of multi-class cell colonies with contiguous boundaries and increase the reliability and efficiency of automated, high-throughput experimental quantification using non-invasive microscopy.
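
The abstract does not include implementation details; the sketch below is only a generic, hedged illustration of triplet-loss embedding learning for two easily confused patch classes (e.g., Dense vs. Spread), written in PyTorch. The network architecture, margin value, patch size, and data are hypothetical placeholders, not the authors' method.

```python
# Minimal sketch (not the paper's code) of triplet-loss training for
# separating two visually similar image-patch classes in embedding space.
# All hyperparameters and the tiny CNN below are illustrative assumptions.
import torch
import torch.nn as nn

class PatchEmbedder(nn.Module):
    """Small CNN mapping a grayscale image patch to a unit-norm embedding."""
    def __init__(self, embed_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, embed_dim)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return nn.functional.normalize(self.fc(z), dim=1)

model = PatchEmbedder()
triplet_loss = nn.TripletMarginLoss(margin=1.0)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Dummy triplets: anchor and positive patches from one class (e.g., "Dense"),
# negative patches from the commonly confused class (e.g., "Spread").
anchor = torch.randn(8, 1, 64, 64)
positive = torch.randn(8, 1, 64, 64)
negative = torch.randn(8, 1, 64, 64)

# One training step: pull anchor toward positive, push away from negative.
loss = triplet_loss(model(anchor), model(positive), model(negative))
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"triplet loss: {loss.item():.4f}")
```

In practice, the learned embeddings would feed a downstream classifier within the hierarchical scheme, so that patches of the two confusable classes are first separated in embedding space before a class label is assigned.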

MeSH terms

  • Microscopy*
  • Neural Networks, Computer*
  • Reproducibility of Results
  • Stem Cells