ComCo: Complementary supervised contrastive learning for complementary label learning

Neural Netw. 2024 Jan:169:44-56. doi: 10.1016/j.neunet.2023.10.013. Epub 2023 Oct 14.

Abstract

Complementary label learning (CLL) is an important problem that aims to reduce the cost of building large-scale accurately labeled datasets by annotating each training sample only with complementary labels, i.e., classes to which the sample does not belong. Despite its promise, CLL remains a challenging task. Previous methods have proposed new loss functions or introduced deep learning-based models for CLL, but they mostly overlook the semantic information implicit in the complementary labels. In this work, we propose a novel method, ComCo, which leverages a contrastive learning framework to assist CLL. Our method includes two key strategies: a positive selection strategy that identifies reliable positive samples, and a negative selection strategy that integrates the information carried by the complementary labels to construct a negative set. These strategies bring ComCo closer to supervised contrastive learning. Empirically, ComCo learns substantially better representations and outperforms the baseline models and the current state of the art by up to 14.61% on CLL benchmarks.
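
The abstract stops short of a formal objective, so the following is only a minimal PyTorch sketch of how the two selection strategies could feed a supervised-contrastive loss. The function names, the confidence-threshold criterion for "reliable" positives, and the exact negative-selection rule are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F


def build_positive_mask(pred_labels, confidences, threshold=0.9):
    # Assumed stand-in for the positive selection strategy: treat two samples
    # as positives when the model assigns them the same class with confidence
    # above a threshold. The paper's actual criterion may differ.
    reliable = confidences >= threshold
    pos = pred_labels.unsqueeze(1).eq(pred_labels.unsqueeze(0))
    pos &= reliable.unsqueeze(0) & reliable.unsqueeze(1)
    pos.fill_diagonal_(False)  # a sample is never its own positive
    return pos


def build_negative_mask(pred_labels, comp_labels):
    # Negative selection from complementary labels: sample j is a certain
    # negative for anchor i when the class currently predicted for i is the
    # class j's complementary label rules out, so i and j cannot match.
    neg = pred_labels.unsqueeze(1).eq(comp_labels.unsqueeze(0))
    neg.fill_diagonal_(False)
    return neg


def contrastive_loss(embeddings, pos_mask, neg_mask, temperature=0.1):
    # SupCon-style objective restricted to the selected positives and
    # negatives; anchors without positives contribute zero loss.
    z = F.normalize(embeddings, dim=1)
    sim = (z @ z.t()) / temperature
    contrast = (pos_mask | neg_mask).float()
    log_prob = sim - ((sim.exp() * contrast).sum(1, keepdim=True) + 1e-12).log()
    pos_count = pos_mask.sum(1).clamp(min=1)
    return -((log_prob * pos_mask.float()).sum(1) / pos_count).mean()


# Toy usage on random data.
torch.manual_seed(0)
n, num_classes, dim = 8, 10, 32
embeddings = torch.randn(n, dim)
probs = torch.randn(n, num_classes).softmax(dim=1)
confidences, pred_labels = probs.max(dim=1)
comp_labels = torch.randint(0, num_classes, (n,))  # class each sample is NOT
pos = build_positive_mask(pred_labels, confidences, threshold=0.15)
neg = build_negative_mask(pred_labels, comp_labels)
print(contrastive_loss(embeddings, pos, neg))

The negative rule in this sketch exploits the one certainty a complementary label provides: the labeled sample definitely does not belong to that class, so any anchor predicted to be in that class is safe to contrast against it.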

Keywords: Complementary label learning; Contrastive learning; Machine learning; Representation learning; Weakly supervised learning.

MeSH terms

  • Humans
  • Semantics