A Probabilistic Formulation for Meta-Weight-Net

IEEE Trans Neural Netw Learn Syst. 2023 Mar;34(3):1194-1208. doi: 10.1109/TNNLS.2021.3105104. Epub 2023 Feb 28.

Abstract

In the last decade, deep neural networks (DNNs) have become dominant tools for various supervised learning tasks, especially classification. However, it has been demonstrated that they can easily overfit to training-set biases such as label noise and class imbalance. Example reweighting algorithms are simple and effective remedies for this issue, but most of them require manually specifying the weighting function as well as additional hyperparameters. Recently, a meta-learning-based method, Meta-Weight-Net (MW-Net), was proposed to automatically learn a weighting function, parameterized by a multilayer perceptron (MLP), from additional unbiased metadata, significantly improving the robustness of prior art. The method, however, is formulated in a deterministic manner and lacks intrinsic statistical support. In this work, we propose a probabilistic formulation for MW-Net, probabilistic MW-Net (PMW-Net) for short, which treats the weighting function probabilistically and includes the original MW-Net as a special case. This probabilistic formulation introduces additional randomness while allowing the flexibility of the weighting function to be further controlled during learning. Our experimental results on both synthetic and real datasets show that the proposed method improves on the performance of the original MW-Net. Moreover, PMW-Net can be further extended to fully Bayesian models to improve their robustness.
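To make the reweighting idea concrete, the following is a minimal sketch, not the paper's implementation. It assumes PyTorch and follows the abstract's description of MW-Net: an MLP mapping each example's training loss to a weight in (0, 1). The probabilistic variant here, which places a factorized Gaussian over the MLP parameters and samples weights via the reparameterization trick, is one plausible reading of "treating the weighting function probabilistically"; the `BayesLinear` layer, the hidden size, and all names are illustrative assumptions, not the authors' exact construction.

```python
# Hedged sketch of an MW-Net-style weighting function and a probabilistic
# variant. Assumes PyTorch; architecture details are illustrative guesses.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesLinear(nn.Module):
    """Linear layer with a factorized Gaussian over its weights and biases,
    sampled with the reparameterization trick (an assumed construction)."""

    def __init__(self, n_in: int, n_out: int):
        super().__init__()
        self.w_mu = nn.Parameter(torch.randn(n_out, n_in) * 0.1)
        self.w_logstd = nn.Parameter(torch.full((n_out, n_in), -5.0))
        self.b_mu = nn.Parameter(torch.zeros(n_out))
        self.b_logstd = nn.Parameter(torch.full((n_out,), -5.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sample one set of parameters per forward pass.
        w = self.w_mu + torch.exp(self.w_logstd) * torch.randn_like(self.w_mu)
        b = self.b_mu + torch.exp(self.b_logstd) * torch.randn_like(self.b_mu)
        return F.linear(x, w, b)


class ProbabilisticWeightNet(nn.Module):
    """Weighting function mapping a per-example loss to a weight in (0, 1).
    Driving the log-stds toward -inf collapses the Gaussians to point
    masses, recovering a deterministic MW-Net-style MLP; this mirrors the
    abstract's claim that MW-Net is a special case."""

    def __init__(self, hidden: int = 100):
        super().__init__()
        self.fc1 = BayesLinear(1, hidden)
        self.fc2 = BayesLinear(hidden, 1)

    def forward(self, per_example_loss: torch.Tensor) -> torch.Tensor:
        # per_example_loss: shape (batch,) -> weights: shape (batch,)
        h = torch.relu(self.fc1(per_example_loss.unsqueeze(1)))
        return torch.sigmoid(self.fc2(h)).squeeze(1)
```

In use, the sampled weights rescale the per-example training losses before they are averaged, e.g. `losses = F.cross_entropy(logits, targets, reduction="none")`, then `weighted_loss = (weight_net(losses.detach()) * losses).mean()`; in the actual method the weighting network's parameters are updated by a separate meta-objective evaluated on the unbiased metadata, which this sketch does not include.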