SLTRN: Sample-level transformer-based relation network for few-shot classification

Neural Netw. 2024 May 3:176:106344. doi: 10.1016/j.neunet.2024.106344. Online ahead of print.

Abstract

Few-shot classification aims to recognize novel categories from only a few labeled samples. The classic Relation Network (RN) compares support-query sample pairs for few-shot classification, but it overlooks contextual information within the support set, which limits its comparison capability. This work reformulates learning the relationship between a query sample and each support class as a sequence-to-sequence (seq2seq) problem. We introduce a Sample-level Transformer-based Relation Network (SLTRN) that uses sample-level self-attention to strengthen the relation module's comparison ability by mining latent relationships among support classes. SLTRN achieves performance comparable to state-of-the-art methods on standard benchmarks, excelling in particular in the 1-shot setting with 52.11% and 67.55% accuracy on miniImageNet and CUB, respectively. Extensive ablation experiments validate the effectiveness and optimal settings of SLTRN. The experimental code for this work is available at https://github.com/ZitZhengWang/SLTRN.

Keywords: Few-shot learning; SLTRM; SLTRN; Transformer.
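The core idea in the abstract — treating the query and the support-class prototypes as one sequence, refining them jointly with sample-level self-attention, and then scoring query-class relations — can be sketched as follows. This is an illustrative NumPy toy, not the authors' implementation: the single-head attention, random projection weights (`Wq`, `Wk`, `Wv`), dot-product relation score, and function names are all assumptions made for exposition.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention over a sequence X of shape (L, d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    # Numerically stable row-wise softmax over attention scores.
    A = np.exp(scores - scores.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)
    return A @ V

def relation_scores(support, query, rng):
    """Toy sketch: jointly refine support-class prototypes and the query
    with self-attention, then score the query against each class.

    support: (N, d) array of class prototype embeddings.
    query:   (d,) query embedding.
    Returns a length-N probability vector over the support classes.
    """
    seq = np.vstack([support, query[None, :]])          # sequence of N+1 samples
    d = seq.shape[1]
    # Random projections stand in for learned weights (assumption for the sketch).
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    refined = self_attention(seq, Wq, Wk, Wv)
    s, q = refined[:-1], refined[-1]                    # refined prototypes and query
    logits = s @ q                                      # dot-product relation scores
    p = np.exp(logits - logits.max())
    return p / p.sum()

rng = np.random.default_rng(0)
support = rng.standard_normal((5, 16))   # 5-way episode, 16-dim embeddings
query = rng.standard_normal(16)
p = relation_scores(support, query, rng)
print(p.shape, float(p.sum()))
```

In the paper's setting the attention weights are learned end-to-end and the relation score comes from a learned relation module rather than a raw dot product; the sketch only shows why refining prototypes jointly lets each class's representation incorporate context from the other support classes before comparison.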