Boosting lesion annotation via aggregating explicit relations in external medical knowledge graph

Artif Intell Med. 2022 Oct;132:102376. doi: 10.1016/j.artmed.2022.102376. Epub 2022 Aug 22.

Abstract

Predicting a comprehensive set of relevant labels on chest X-ray images poses great challenges in bridging the visual and textual modalities. Despite the success of Graph Convolutional Networks (GCNs) in modeling label dependencies with a co-occurrence matrix generated from the dataset, they still suffer from the inherent label imbalance in the dataset and ignore the explicit relations among labels present in an external medical knowledge graph (KG). We argue that jointly exploiting both the label co-occurrence matrix from the dataset and the label relations in an external knowledge graph facilitates multi-label lesion annotation. To model relevant lesion labels more comprehensively, we propose a KG-augmented model that Aggregates Explicit Relations for multi-label lesion annotation, called AER-GCN. The KG-augmented model employs a GCN to learn the explicit label relations in the external medical KG and aggregates these explicit relations into a statistical graph built from label co-occurrence information. Specifically, we present three approaches to modeling the explicit label correlations in the external knowledge and two approaches to incorporating the explicit relations into the co-occurrence relations for lesion annotation. We exploit SNOMED CT as the source of external knowledge and evaluate the performance of AER-GCN on the ChestX-ray and IU X-ray datasets. Extensive experiments demonstrate that our model outperforms other state-of-the-art models.
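As a rough illustration of the idea described above (not the paper's actual implementation), the sketch below shows one plausible way to combine a dataset co-occurrence graph with a KG-derived label graph and run a two-layer GCN over label nodes to produce per-label classifiers for image features. The weighted-sum aggregation, the mixing weight `mix`, and all dimensions are assumptions for illustration; the paper reports several aggregation variants whose details are not given in this abstract.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    def normalize_adj(adj):
        """Symmetrically normalize an adjacency matrix: D^-1/2 (A + I) D^-1/2."""
        adj = adj + torch.eye(adj.size(0))
        deg = adj.sum(dim=1)
        d_inv_sqrt = deg.pow(-0.5)
        d_inv_sqrt[torch.isinf(d_inv_sqrt)] = 0.0
        d_mat = torch.diag(d_inv_sqrt)
        return d_mat @ adj @ d_mat


    class LabelGCN(nn.Module):
        """Two-layer GCN over label nodes; the adjacency mixes the dataset
        co-occurrence graph with a KG-derived relation graph (hypothetical
        weighted-sum aggregation, one of several possible choices)."""

        def __init__(self, num_labels, in_dim, hid_dim, out_dim,
                     cooc_adj, kg_adj, mix=0.5):
            super().__init__()
            # Aggregate the two label graphs before normalization (assumption).
            adj = mix * cooc_adj + (1.0 - mix) * kg_adj
            self.register_buffer("adj", normalize_adj(adj))
            self.label_emb = nn.Parameter(torch.randn(num_labels, in_dim))
            self.w1 = nn.Linear(in_dim, hid_dim)
            self.w2 = nn.Linear(hid_dim, out_dim)

        def forward(self, img_feat):
            # img_feat: (batch, out_dim) global image features from a CNN backbone
            h = F.relu(self.adj @ self.w1(self.label_emb))
            classifiers = self.adj @ self.w2(h)    # (num_labels, out_dim)
            return img_feat @ classifiers.t()      # (batch, num_labels) logits


    if __name__ == "__main__":
        num_labels, feat_dim = 14, 512
        cooc = torch.rand(num_labels, num_labels)                 # stand-in co-occurrence graph
        kg = (torch.rand(num_labels, num_labels) > 0.7).float()   # stand-in KG relations
        model = LabelGCN(num_labels, 300, 256, feat_dim, cooc, kg)
        logits = model(torch.randn(8, feat_dim))                  # 8 images
        print(logits.shape)                                       # torch.Size([8, 14])

In this sketch the GCN output acts as a set of label-specific classifiers applied to the image representation, so the multi-label prediction inherits structure from both the statistical and the knowledge-graph relations.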

Keywords: Knowledge graph; Lesion annotation; Multi-label Image Classification.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Pattern Recognition, Automated*