Prototype Completion for Few-Shot Learning

IEEE Trans Pattern Anal Mach Intell. 2023 Oct;45(10):12250-12268. doi: 10.1109/TPAMI.2023.3277881. Epub 2023 Sep 5.

Abstract

Few-shot learning (FSL) aims to recognize novel classes from only a few examples. Pre-training based methods effectively tackle the problem by pre-training a feature extractor and then fine-tuning it through nearest-centroid based meta-learning. However, results show that the fine-tuning step yields only marginal improvements. In this paper, 1) we identify the reason: in the pre-trained feature space, the base classes already form compact clusters, while novel classes spread as groups with large variances, which implies that fine-tuning the feature extractor is less meaningful; 2) instead of fine-tuning the feature extractor, we focus on estimating more representative prototypes. Consequently, we propose a novel prototype completion based meta-learning framework. The framework first introduces primitive knowledge (i.e., class-level part or attribute annotations) and extracts representative features for seen attributes as priors. Second, a part/attribute transfer network is designed to learn to infer the representative features of unseen attributes as supplementary priors. Finally, a prototype completion network is devised to learn to complete prototypes from these priors. Moreover, to avoid prototype completion errors, we further develop a Gaussian based prototype fusion strategy that fuses the mean-based and completed prototypes by exploiting unlabeled samples. Lastly, we also develop an economical prototype completion version for FSL that does not require collecting primitive knowledge, for a fair comparison with existing FSL methods that do not use external knowledge. Extensive experiments show that our method i) obtains more accurate prototypes and ii) achieves superior performance in both inductive and transductive FSL settings.
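The abstract describes the pipeline only at a high level. As a rough illustration of the nearest-centroid step and of one plausible reading of "Gaussian based prototype fusion" (fusing the mean-based and completed prototypes by inverse-variance, product-of-Gaussians style weighting estimated from unlabeled samples), the NumPy sketch below may help. All function names are hypothetical and the fusion rule is an assumption for illustration; the paper's exact formulation may differ.

```python
import numpy as np

def mean_prototypes(support_feats, support_labels, num_classes):
    """Mean-based prototypes: per-class average of the support features."""
    return np.stack([
        support_feats[support_labels == c].mean(axis=0)
        for c in range(num_classes)
    ])

def gaussian_fuse(proto_mean, proto_completed, unlabeled_feats,
                  pseudo_labels, num_classes, eps=1e-6):
    """Illustrative fusion of two prototype estimates per class.

    Assumption: each estimate is treated as a Gaussian whose variance is
    measured against the unlabeled samples pseudo-labeled to that class,
    and the two estimates are combined by inverse-variance weighting.
    """
    fused = np.empty_like(proto_mean)
    for c in range(num_classes):
        cls_feats = unlabeled_feats[pseudo_labels == c]
        # spread of the unlabeled cluster around each prototype estimate
        var_mean = ((cls_feats - proto_mean[c]) ** 2).mean() + eps
        var_comp = ((cls_feats - proto_completed[c]) ** 2).mean() + eps
        w_mean = (1.0 / var_mean) / (1.0 / var_mean + 1.0 / var_comp)
        fused[c] = w_mean * proto_mean[c] + (1.0 - w_mean) * proto_completed[c]
    return fused

def nearest_prototype_predict(query_feats, prototypes):
    """Nearest-centroid classification: assign each query to the closest prototype."""
    dists = ((query_feats[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    return dists.argmin(axis=1)

# Toy usage on random features (5-way, 1-shot, with unlabeled and query sets)
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 64)); support_y = np.arange(5)
completed = support + rng.normal(scale=0.1, size=(5, 64))  # stand-in for completed prototypes
unlabeled = rng.normal(size=(50, 64)); pseudo_y = rng.integers(0, 5, size=50)
queries = rng.normal(size=(20, 64))

protos = mean_prototypes(support, support_y, num_classes=5)
fused = gaussian_fuse(protos, completed, unlabeled, pseudo_y, num_classes=5)
preds = nearest_prototype_predict(queries, fused)
```

In this reading, the class whose two prototype estimates agree better with the unlabeled cluster dominates the fusion, which is one way completion errors could be kept from corrupting the final prototype.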