Transfer Increment for Generalized Zero-Shot Learning

IEEE Trans Neural Netw Learn Syst. 2021 Jun;32(6):2506-2520. doi: 10.1109/TNNLS.2020.3006322. Epub 2021 Jun 2.

Abstract

Zero-shot learning (ZSL) is a successful paradigm for categorizing objects from previously unseen classes. However, it suffers from severe performance degradation in the generalized ZSL (GZSL) setting, i.e., when recognizing test images drawn from both seen and unseen classes. In this article, we present a simple but effective mechanism, based on a transfer-increment strategy, for GZSL and more open scenarios. On the one hand, a dual-knowledge-source-based generative model is constructed to tackle the missing-data problem for unseen classes. Specifically, the local relational knowledge extracted from the label-embedding space and the global relational knowledge, namely the estimated data center in the feature-embedding space, are considered jointly to synthesize virtual exemplars. On the other hand, we further examine how generative models should be trained under the GZSL setting. Two incremental training modes are designed that learn the unseen classes directly from the synthesized exemplars, instead of training classifiers on the seen and synthesized unseen exemplars together. This not only enables effective unseen-class learning but also requires less computation and storage in practical applications. Comprehensive experiments are conducted on five benchmark data sets. Because the proposed transfer-increment strategy addresses both the generation and the training of virtual exemplars, it yields a significant improvement over state-of-the-art methods on both the conventional ZSL and GZSL tasks.
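
To make the abstract's two components concrete, the following is a minimal NumPy sketch, not the authors' method: the function names, the k-nearest-neighbor attribute weighting, the alpha blend of local and global knowledge, the Gaussian sampling of exemplars, and the nearest-class-mean incremental classifier are all illustrative assumptions, since the abstract does not specify the paper's actual generative model or training modes.

    import numpy as np

    def estimate_unseen_centers(seen_centers, seen_attrs, unseen_attrs,
                                global_center, k=3, alpha=0.7):
        """Estimate a feature-space center for each unseen class.

        Local relational knowledge: each unseen class is expressed through
        its k nearest seen classes in the label-embedding (attribute)
        space. Global relational knowledge: the estimate is pulled toward
        the overall data center of the feature-embedding space.
        (k, alpha, and the inverse-distance weights are assumptions.)
        """
        centers = []
        for a in unseen_attrs:
            d = np.linalg.norm(seen_attrs - a, axis=1)
            nn = np.argsort(d)[:k]                 # nearest seen classes
            w = 1.0 / (d[nn] + 1e-8)
            w /= w.sum()
            local = w @ seen_centers[nn]           # local knowledge
            centers.append(alpha * local + (1.0 - alpha) * global_center)
        return np.stack(centers)

    def synthesize_exemplars(center, scale=0.1, n=100, rng=None):
        """Sample virtual exemplars around an estimated class center
        (a simple Gaussian stand-in for the paper's generative model)."""
        rng = rng if rng is not None else np.random.default_rng(0)
        return center + scale * rng.standard_normal((n, center.shape[0]))

    class IncrementalNCM:
        """Nearest-class-mean classifier that registers unseen classes
        incrementally from synthesized exemplars alone, without revisiting
        the seen-class training data (one plausible reading of the
        incremental training modes)."""
        def __init__(self):
            self.means, self.labels = [], []

        def add_class(self, label, exemplars):
            self.means.append(exemplars.mean(axis=0))
            self.labels.append(label)

        def predict(self, x):
            d = np.linalg.norm(np.stack(self.means) - x, axis=1)
            return self.labels[int(np.argmin(d))]

    # Usage sketch with random stand-in data: seen classes contribute
    # feature centers; unseen classes have only label embeddings.
    rng = np.random.default_rng(0)
    seen_centers = rng.standard_normal((5, 16))
    seen_attrs = rng.standard_normal((5, 8))
    unseen_attrs = rng.standard_normal((2, 8))
    global_center = seen_centers.mean(axis=0)

    clf = IncrementalNCM()
    for c in range(5):
        clf.add_class(f"seen_{c}",
                      synthesize_exemplars(seen_centers[c], rng=rng))
    unseen_centers = estimate_unseen_centers(seen_centers, seen_attrs,
                                             unseen_attrs, global_center)
    for u in range(2):
        clf.add_class(f"unseen_{u}",
                      synthesize_exemplars(unseen_centers[u], rng=rng))
    print(clf.predict(rng.standard_normal(16)))

In this reading, adding an unseen class touches only its own synthesized exemplars, which is what would keep the incremental modes cheap in computation and storage compared with retraining classifiers on the seen and synthesized exemplars together.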