Dual Distillation Discriminator Networks for Domain Adaptive Few-Shot Learning

Neural Netw. 2023 Aug:165:625-633. doi: 10.1016/j.neunet.2023.06.009. Epub 2023 Jun 15.

Abstract

Domain Adaptive Few-Shot Learning (DA-FSL) aims to solve few-shot classification tasks on a novel domain with the aid of a large number of source-style samples and only a few target-style samples. It is essential for DA-FSL to transfer task knowledge from the source domain to the target domain and to overcome the asymmetric amounts of labeled data in the two domains. To this end, we propose Dual Distillation Discriminator Networks (D3Net), designed around the scarcity of labeled target-style samples in DA-FSL. Specifically, we employ the idea of distillation discrimination to avoid the over-fitting caused by the unequal numbers of samples in the target and source domains: the student discriminator is trained with soft labels produced by the teacher discriminator. Meanwhile, we design a task propagation stage at the feature-space level and a mixed-domain stage at the instance level to generate more target-style samples, exploiting the task distributions and the sample diversity of the source domain to enrich the target domain. Our D3Net aligns the distributions of the source and target domains and constrains the FSL task distribution via prototype distributions on the mixed domain. Extensive experiments on three DA-FSL benchmark datasets, i.e., mini-ImageNet, tiered-ImageNet, and DomainNet, demonstrate that D3Net achieves competitive performance.
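The abstract does not give the exact formulation of the distillation-discrimination objective, so the following is only a minimal sketch of the idea it describes, assuming a standard Hinton-style soft-label distillation between a teacher and a student domain discriminator. The `DomainDiscriminator` class and the knobs `T` (temperature) and `alpha` (hard/soft mixing weight) are illustrative assumptions, not names from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical discriminator: maps a backbone feature vector to a
# 2-way domain logit (source vs. target). Architecture is illustrative.
class DomainDiscriminator(nn.Module):
    def __init__(self, feat_dim=512, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, 2),
        )

    def forward(self, x):
        return self.net(x)

def distillation_discriminator_loss(student_logits, teacher_logits,
                                    domain_labels, T=2.0, alpha=0.5):
    """Train the student discriminator with a mix of hard domain labels
    and the teacher's temperature-softened predictions (assumed
    Hinton-style distillation)."""
    hard = F.cross_entropy(student_logits, domain_labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits.detach() / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * hard + (1.0 - alpha) * soft

# Usage sketch: features from a shared backbone; 0 = source, 1 = target.
feats = torch.randn(8, 512)
labels = torch.randint(0, 2, (8,))
teacher, student = DomainDiscriminator(), DomainDiscriminator()
loss = distillation_discriminator_loss(student(feats), teacher(feats), labels)
```

The softened teacher outputs act as a regularizer: the student does not commit to over-confident domain decisions driven by the under-represented target domain, which is consistent with the abstract's stated motivation for distillation discrimination.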

Keywords: Adversarial training; Domain adaptation; Few-shot learning; Knowledge distillation.

MeSH terms

  • Benchmarking
  • Distillation*
  • Humans
  • Knowledge
  • Learning*
  • Students