Cross-Modal Clustering With Deep Correlated Information Bottleneck Method

IEEE Trans Neural Netw Learn Syst. 2023 May 23:PP. doi: 10.1109/TNNLS.2023.3269789. Online ahead of print.

Abstract

Cross-modal clustering (CMC) aims to improve clustering accuracy (ACC) by exploiting the correlations across modalities. Although recent research has made impressive advances, it remains challenging to sufficiently capture these correlations because of the high-dimensional nonlinear characteristics of individual modalities and the conflicts among heterogeneous modalities. In addition, the uninformative modality-private information in each modality may become dominant during correlation mining, which further degrades clustering performance. To tackle these challenges, we devise a novel deep correlated information bottleneck (DCIB) method, which explores the correlation information between multiple modalities while eliminating the modality-private information in each modality in an end-to-end manner. Specifically, DCIB treats the CMC task as a two-stage data compression procedure, in which the modality-private information in each modality is eliminated under the guidance of the shared representation of multiple modalities. Meanwhile, the correlations between multiple modalities are preserved in terms of both feature distributions and clustering assignments. Finally, the DCIB objective is formulated as a loss function based on a mutual information measurement, and a variational optimization approach is proposed to guarantee its convergence. Experimental results on four cross-modal datasets validate the superiority of DCIB. The code is released at https://github.com/Xiaoqiang-Yan/DCIB.
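
The sketch below is a minimal, hedged illustration of the kind of objective the abstract describes, not the authors' released DCIB implementation (see the GitHub link above for that). It assumes a PyTorch setting with two Gaussian modality encoders, a shared clustering head, and hypothetical weighting coefficients `beta` and `gamma`: a KL "compression" term discards modality-private information, while feature alignment and a symmetric KL between soft cluster assignments preserve cross-modal correlations at the feature and assignment levels.

```python
# Illustrative sketch of an information-bottleneck-style cross-modal objective.
# All layer sizes, the shared cluster head, and the weights beta/gamma are
# assumptions for demonstration only; they are not taken from the DCIB paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ModalityEncoder(nn.Module):
    """Encodes one modality into the mean/log-variance of a Gaussian latent."""
    def __init__(self, in_dim, z_dim):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU())
        self.mu = nn.Linear(256, z_dim)
        self.logvar = nn.Linear(256, z_dim)

    def forward(self, x):
        h = self.backbone(x)
        return self.mu(h), self.logvar(h)

def reparameterize(mu, logvar):
    # Sample z ~ N(mu, sigma^2) with the reparameterization trick.
    std = torch.exp(0.5 * logvar)
    return mu + std * torch.randn_like(std)

def kl_to_standard_normal(mu, logvar):
    # KL(N(mu, sigma^2) || N(0, I)): the "compression" term of the bottleneck.
    return 0.5 * torch.sum(mu.pow(2) + logvar.exp() - logvar - 1, dim=1).mean()

def correlated_ib_loss(x1, x2, enc1, enc2, cluster_head, beta=1e-3, gamma=1.0):
    mu1, lv1 = enc1(x1)
    mu2, lv2 = enc2(x2)
    z1, z2 = reparameterize(mu1, lv1), reparameterize(mu2, lv2)

    # Soft cluster assignments from a shared clustering head.
    p1 = F.softmax(cluster_head(z1), dim=1)
    p2 = F.softmax(cluster_head(z2), dim=1)

    # Correlation preservation: (i) align latent features, (ii) align assignments.
    feat_align = F.mse_loss(z1, z2)
    assign_align = 0.5 * (F.kl_div(p1.log(), p2, reduction="batchmean")
                          + F.kl_div(p2.log(), p1, reduction="batchmean"))

    # Compression: discard modality-private information in each latent.
    compress = kl_to_standard_normal(mu1, lv1) + kl_to_standard_normal(mu2, lv2)

    return gamma * (feat_align + assign_align) + beta * compress

if __name__ == "__main__":
    enc1, enc2 = ModalityEncoder(784, 32), ModalityEncoder(300, 32)
    head = nn.Linear(32, 10)  # hypothetical 10-cluster head shared by both modalities
    x1, x2 = torch.randn(8, 784), torch.randn(8, 300)
    print(correlated_ib_loss(x1, x2, enc1, enc2, head).item())
```

In this toy formulation, lowering `beta` relaxes the compression of modality-private information, while raising `gamma` emphasizes cross-modal agreement; the actual DCIB trade-off and its variational optimization are defined in the paper and repository.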