Deep multimodal graph-based network for survival prediction from highly multiplexed images and patient variables

Comput Biol Med. 2023 Mar:154:106576. doi: 10.1016/j.compbiomed.2023.106576. Epub 2023 Feb 1.

Abstract

The spatial architecture of the tumour microenvironment and the phenotypic heterogeneity of tumour cells have been shown to be associated with cancer prognosis and clinical outcomes, including survival. Recent advances in highly multiplexed imaging, including imaging mass cytometry (IMC), capture spatially resolved, high-dimensional maps that quantify dozens of disease-relevant biomarkers at single-cell resolution and have the potential to inform patient-specific prognosis. Existing automated methods for predicting survival, however, typically do not leverage spatial phenotype information captured at the single-cell level. Furthermore, there is no end-to-end method designed to leverage the rich information in whole IMC images and all marker channels, and to aggregate this information with clinical data in a complementary manner to predict survival with enhanced accuracy. To that end, we present a deep multimodal graph-based network (DMGN) with two modules: (1) a multimodal graph-based module that adaptively considers relationships between spatial phenotype information in all image regions and all clinical variables, and (2) a clinical embedding module that automatically generates embeddings specialised for each clinical variable to enhance multimodal aggregation. We demonstrate on two public breast cancer datasets that both modules consistently improve survival prediction performance, and that our approach outperforms state-of-the-art methods in survival prediction.
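
The abstract describes the DMGN only at a high level, so the sketch below is a minimal, hypothetical PyTorch illustration of the two-module idea, not the authors' implementation. The class names (RegionGraphConv, ClinicalEmbedding, DMGNSketch), the dense mean-aggregation graph convolution, the layer sizes, and the Cox-style scalar risk head are all assumptions introduced for illustration; the sketch only shows how per-region image features and per-variable clinical embeddings could be joined in a single graph and pooled into a survival risk score.

# Hypothetical sketch of a DMGN-style model; all names and sizes are assumptions.
import torch
import torch.nn as nn


class RegionGraphConv(nn.Module):
    """One dense graph-convolution step over graph nodes (mean-aggregate then transform)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (n_nodes, in_dim), adj: (n_nodes, n_nodes) with self-loops
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return torch.relu(self.lin(adj @ x / deg))


class ClinicalEmbedding(nn.Module):
    """Learns a separate embedding table for each categorical clinical variable."""
    def __init__(self, n_levels_per_var, emb_dim):
        super().__init__()
        self.tables = nn.ModuleList([nn.Embedding(n, emb_dim) for n in n_levels_per_var])

    def forward(self, clin_codes):
        # clin_codes: (n_vars,) integer codes -> (n_vars, emb_dim)
        return torch.stack([t(clin_codes[i]) for i, t in enumerate(self.tables)])


class DMGNSketch(nn.Module):
    """Joins image-region features and clinical-variable embeddings into one graph,
    then pools node features into a single survival risk score."""
    def __init__(self, region_dim, n_levels_per_var, hidden=64):
        super().__init__()
        self.proj = nn.Linear(region_dim, hidden)
        self.clin = ClinicalEmbedding(n_levels_per_var, hidden)
        self.gc1 = RegionGraphConv(hidden, hidden)
        self.gc2 = RegionGraphConv(hidden, hidden)
        self.risk = nn.Linear(hidden, 1)  # log-risk for a Cox-style objective (assumed)

    def forward(self, region_feats, adj_regions, clin_codes):
        img_nodes = torch.relu(self.proj(region_feats))    # (R, hidden)
        clin_nodes = self.clin(clin_codes)                 # (V, hidden)
        nodes = torch.cat([img_nodes, clin_nodes], dim=0)  # (R + V, hidden)
        n, r = nodes.size(0), region_feats.size(0)
        # Fully connect clinical nodes to all nodes so both modalities interact;
        # in the paper these cross-modal relationships are learned adaptively.
        adj = torch.ones(n, n)
        adj[:r, :r] = adj_regions
        h = self.gc2(self.gc1(nodes, adj), adj)
        return self.risk(h.mean(dim=0))                    # scalar risk score

# Toy usage with random data
regions = torch.randn(10, 32)                  # 10 image regions, 32 marker-derived features
adj = (torch.rand(10, 10) > 0.5).float()
adj.fill_diagonal_(1.0)
clin = torch.tensor([1, 0, 2])                 # 3 categorical clinical variables
model = DMGNSketch(region_dim=32, n_levels_per_var=[3, 2, 4])
print(model(regions, adj, clin))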

Keywords: Deep learning; Graph neural network (GNN); Imaging mass cytometry (IMC); Multimodal; Survival analysis.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Humans
  • Neoplasms* / diagnostic imaging
  • Phenotype
  • Tumor Microenvironment*
  • Upper Extremity