Knowledge distillation based on multi-layer fusion features

PLoS One. 2023 Aug 28;18(8):e0285901. doi: 10.1371/journal.pone.0285901. eCollection 2023.

Abstract

Knowledge distillation improves the performance of a small student network by encouraging it to learn knowledge from a pre-trained, high-performance but bulky teacher network. Most current knowledge distillation methods extract relatively simple features from the middle or bottom layers of the teacher network for knowledge transfer. However, these methods ignore feature fusion, even though fused features contain richer information. We believe that the richer the information contained in the knowledge a teacher delivers to a student, the easier it is for the student to perform well. In this paper, we propose a new method called Multi-feature Fusion Knowledge Distillation (MFKD) to extract and exploit the expressive fusion features of the teacher network. Specifically, we extract feature maps from different positions in the network, i.e., the middle layers, the bottom layers, and even the front layers. To utilize these features properly, the method designs a multi-feature fusion scheme to integrate them. Compared with features extracted from a single location in the teacher network, the final fused feature map contains more meaningful information. Extensive experiments on image classification tasks demonstrate that a student network trained with our MFKD can learn from the fused features, leading to superior performance. The results show that MFKD improves the Top-1 accuracy of ResNet20 and VGG8 by 1.82% and 3.35%, respectively, on the CIFAR-100 dataset, outperforming many state-of-the-art methods.
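As a rough sketch of the idea described above (not the paper's exact MFKD architecture), the following PyTorch code taps feature maps from several depths of a network, aligns them with 1x1 convolutions and bilinear resizing, sums them into one fused map, and matches the student's fused features to the teacher's with an MSE loss. The FeatureFusion module, the channel counts, and the MSE matching objective are illustrative assumptions, not the authors' published design.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureFusion(nn.Module):
    """Fuses feature maps tapped from several depths of a network."""
    def __init__(self, in_channels_list, fused_channels):
        super().__init__()
        # 1x1 convs align each tapped map to a common channel count.
        self.align = nn.ModuleList(
            nn.Conv2d(c, fused_channels, kernel_size=1)
            for c in in_channels_list
        )

    def forward(self, feats):
        # Resize every map to the spatial size of the deepest (smallest)
        # map, then sum the aligned maps into one fused representation.
        target_size = feats[-1].shape[-2:]
        fused = 0
        for f, conv in zip(feats, self.align):
            f = conv(f)
            if f.shape[-2:] != target_size:
                f = F.interpolate(f, size=target_size, mode="bilinear",
                                  align_corners=False)
            fused = fused + f
        return fused

def fusion_distillation_loss(teacher_feats, student_feats,
                             teacher_fusion, student_fusion):
    # Fuse each network's multi-layer features, then match them with MSE.
    # detach() keeps gradients from flowing into the frozen teacher side.
    t_fused = teacher_fusion(teacher_feats).detach()
    s_fused = student_fusion(student_feats)
    return F.mse_loss(s_fused, t_fused)

# Usage with dummy feature maps standing in for front/middle/bottom taps.
if __name__ == "__main__":
    t_feats = [torch.randn(2, 16, 32, 32),   # front layer
               torch.randn(2, 32, 16, 16),   # middle layer
               torch.randn(2, 64, 8, 8)]     # bottom layer
    s_feats = [torch.randn(2, 8, 32, 32),
               torch.randn(2, 16, 16, 16),
               torch.randn(2, 32, 8, 8)]
    t_fusion = FeatureFusion([16, 32, 64], fused_channels=64)
    s_fusion = FeatureFusion([8, 16, 32], fused_channels=64)
    print(fusion_distillation_loss(t_feats, s_feats,
                                   t_fusion, s_fusion).item())

In training, a loss of this form would typically be added to the student's cross-entropy (and any logit-based distillation) term with a weighting coefficient.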

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Distillation
  • Educational Personnel*
  • Humans
  • Knowledge
  • Students

Grants and funding

This work was funded by the Sichuan Science and Technology Program (No. 2022YFG0324) and the National Natural Science Foundation of China (No. 11905153).