Label smoothing and task-adaptive loss function based on prototype network for few-shot learning

Neural Netw. 2022 Dec:156:39-48. doi: 10.1016/j.neunet.2022.09.018. Epub 2022 Sep 23.

Abstract

The prototype network suffers from two problems: its label information is not sufficiently reliable, and the hyperparameters of its loss function cannot follow changes in image feature information. To address both, we propose a method that combines label smoothing with flexible hyperparameters. First, the label information of an image is processed by label smoothing regularization. Then, according to the classification task, the distance matrix of the image features and a logarithmic operation are used to fuse that distance matrix with the hyperparameters of the loss function. Finally, the hyperparameters are associated with the smoothed labels and the distance matrix for predictive classification. The method is validated on the miniImageNet, FC100 and tieredImageNet datasets. Compared with unsmoothed labels and fixed hyperparameters, the flexible loss-function hyperparameters improve few-shot classification accuracy by 2%-3%. These results show that the proposed method suppresses the interference of false labels and that flexible hyperparameters improve classification accuracy.
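The two ingredients named in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the function names (`smooth_labels`, `prototype_logits`, `cross_entropy`), the smoothing factor `eps`, and in particular the rule that derives the temperature `tau` from the log of the mean distance are hypothetical placeholders for the paper's actual fusion of the distance matrix with the loss hyperparameters.

```python
import numpy as np

def smooth_labels(labels, num_classes, eps=0.1):
    """Label smoothing regularization: keep 1-eps mass on the true
    class and spread eps uniformly over all classes."""
    one_hot = np.eye(num_classes)[labels]
    return one_hot * (1.0 - eps) + eps / num_classes

def prototype_logits(queries, prototypes, adaptive=True):
    """Negative squared Euclidean distances to class prototypes,
    scaled by a task-adaptive temperature (assumed form)."""
    # d[i, k] = squared distance from query i to prototype k
    d = ((queries[:, None, :] - prototypes[None, :, :]) ** 2).sum(-1)
    # Hypothetical adaptive scale: a log of the mean distance, so the
    # scale follows the statistics of the current task's features.
    tau = np.log1p(d.mean()) if adaptive else 1.0
    return -d / tau

def cross_entropy(logits, soft_targets):
    """Cross-entropy against smoothed (soft) targets."""
    log_probs = logits - np.log(np.exp(logits).sum(-1, keepdims=True))
    return -(soft_targets * log_probs).sum(-1).mean()
```

In use, a query is still assigned to its nearest prototype; the smoothing and the adaptive temperature change only the gradients the loss produces, not the argmax prediction.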

Keywords: Deep learning; Few-shot learning; Flexible hyperparameters; Image classification; Improved loss function.

MeSH terms

  • Learning*