A Collaborative Multimodal Learning-Based Framework for COVID-19 Diagnosis

IEEE Trans Neural Netw Learn Syst. 2023 Jul 4:PP. doi: 10.1109/TNNLS.2023.3290188. Online ahead of print.

Abstract

The coronavirus disease 2019 (COVID-19) pandemic has caused a global public health crisis, with millions of deaths and billions of infections, greatly increasing the pressure on medical resources. With the continuous emergence of viral mutations, automated tools for COVID-19 diagnosis are highly desired to assist clinical diagnosis and reduce the tedious workload of image interpretation. However, medical images at a single site are usually limited in quantity or weakly labeled, while pooling data scattered across different institutions to build effective models is prohibited by data policy restrictions. In this article, we propose a novel privacy-preserving cross-site framework for COVID-19 diagnosis with multimodal data, seeking to effectively leverage heterogeneous data from multiple parties while preserving patients' privacy. Specifically, a Siamese branched network is introduced as the backbone to capture inherent relationships across heterogeneous samples. The redesigned network can handle semisupervised multimodal inputs and conduct task-specific training, improving model performance in various scenarios. Extensive simulations on real-world datasets demonstrate that the framework achieves significant improvement over state-of-the-art methods.
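
To make the backbone idea concrete, the sketch below shows one plausible form of a Siamese branched network with modality-specific encoders feeding a shared embedding space and a contrastive comparison between paired samples. This is a minimal illustration only, not the authors' implementation: the branch architectures, input sizes (a toy image branch and a clinical-feature branch), the loss, and all names are assumptions introduced here for clarity.

```python
# Minimal sketch (assumed, not the paper's code) of a Siamese branched network:
# two modality-specific encoders share a projection head, and paired samples
# are compared with a contrastive-style distance.
import torch
import torch.nn as nn


class SiameseBranchedNet(nn.Module):
    """Two modality-specific branches mapped into a common embedding space."""

    def __init__(self, img_channels=1, clinical_dim=32, embed_dim=128):
        super().__init__()
        # Image branch: a small CNN encoder (placeholder for a deeper backbone).
        self.img_branch = nn.Sequential(
            nn.Conv2d(img_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(8),
            nn.Flatten(),
            nn.Linear(16 * 8 * 8, embed_dim),
        )
        # Clinical/tabular branch: a small MLP encoder.
        self.tab_branch = nn.Sequential(
            nn.Linear(clinical_dim, 64),
            nn.ReLU(),
            nn.Linear(64, embed_dim),
        )
        # Shared head so both modalities land in the same embedding space.
        self.head = nn.Linear(embed_dim, embed_dim)

    def forward(self, image, clinical):
        z_img = self.head(self.img_branch(image))
        z_tab = self.head(self.tab_branch(clinical))
        return z_img, z_tab


def contrastive_loss(z1, z2, same_patient, margin=1.0):
    """Pull embeddings of matched pairs together, push mismatched pairs apart."""
    dist = torch.norm(z1 - z2, dim=1)
    pos = same_patient * dist.pow(2)
    neg = (1 - same_patient) * torch.clamp(margin - dist, min=0).pow(2)
    return (pos + neg).mean()


if __name__ == "__main__":
    model = SiameseBranchedNet()
    images = torch.randn(4, 1, 64, 64)            # toy CT-like slices
    clinical = torch.randn(4, 32)                 # toy clinical features
    labels = torch.tensor([1.0, 1.0, 0.0, 0.0])   # 1 = same-patient pair
    z_img, z_tab = model(images, clinical)
    print(contrastive_loss(z_img, z_tab, labels))
```

In a semisupervised, cross-site setting such as the one described in the abstract, a pairwise objective of this kind is attractive because unlabeled pairs can still supply a training signal, while only model parameters (not raw patient data) would need to leave each site.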