Negatives Make a Positive: An Embarrassingly Simple Approach to Semi-Supervised Few-Shot Learning

IEEE Trans Pattern Anal Mach Intell. 2024 Apr;46(4):2091-2103. doi: 10.1109/TPAMI.2023.3333528. Epub 2024 Mar 6.

Abstract

Semi-Supervised Few-Shot Learning (SSFSL) aims to train a classifier that can adapt to new tasks using limited labeled data and a fixed amount of unlabeled data. Various sophisticated methods have been proposed to tackle the challenges associated with this problem. In this paper, we present a simple but quite effective approach to predict accurate negative pseudo-labels for unlabeled data from an indirect learning perspective. We leverage these pseudo-labels to augment the support set, which is typically limited in few-shot tasks, e.g., 1-shot classification. In such label-constrained scenarios, our approach can offer highly accurate negative pseudo-labels. By iteratively excluding negative pseudo-labels one by one, we ultimately derive a positive pseudo-label for each unlabeled sample. The integration of negative and positive pseudo-labels complements the limited support set, resulting in significant accuracy improvements for SSFSL. Our approach can be implemented in just a few lines of code using only off-the-shelf operations, yet it outperforms state-of-the-art methods on four benchmark datasets. Furthermore, our approach exhibits good adaptability and generalization capabilities when used as a plug-and-play component alongside existing SSFSL methods and when extended to generalized linear models.
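The core idea of iterative negative-pseudo-label exclusion can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact procedure: it assumes we already have per-class probabilities for each unlabeled sample (e.g., from a prototype-based few-shot classifier), and it repeatedly marks the least-likely remaining class as a negative pseudo-label until a single class survives, which becomes the positive pseudo-label.

```python
import numpy as np

def exclude_negatives(probs):
    """Iteratively assign negative pseudo-labels, then derive positives.

    probs: array of shape (num_unlabeled, num_classes) holding class
    probabilities for unlabeled samples (hypothetical input; the source
    of these scores is an assumption of this sketch).
    Returns (negatives, positive): a list of per-round negative
    pseudo-label arrays, and the final positive pseudo-label per sample.
    """
    probs = np.asarray(probs, dtype=float)
    n, c = probs.shape
    excluded = np.zeros((n, c), dtype=bool)
    negatives = []
    for _ in range(c - 1):
        # Ignore classes already excluded by masking them to +inf.
        masked = np.where(excluded, np.inf, probs)
        neg = masked.argmin(axis=1)        # least-likely remaining class
        negatives.append(neg)              # this round's negative labels
        excluded[np.arange(n), neg] = True
    # Exactly one class per sample was never excluded: the positive label.
    positive = np.where(~excluded)[1]
    return negatives, positive
```

For example, with two unlabeled samples and probabilities `[[0.1, 0.7, 0.2], [0.5, 0.3, 0.2]]`, the first exclusion round removes classes 0 and 2 respectively, and the surviving classes 1 and 0 become the positive pseudo-labels used to augment the support set.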