Classification and comparison via neural networks

Neural Netw. 2019 Oct;118:65-80. doi: 10.1016/j.neunet.2019.06.004. Epub 2019 Jun 19.

Abstract

We consider learning from comparison labels generated as follows: given two samples in a dataset, a labeler produces a label indicating their relative order. Such comparison labels scale quadratically with the dataset size; most importantly, in practice, they often exhibit lower variance than class labels. We propose a new neural network architecture based on siamese networks that incorporates both class and comparison labels in the same training pipeline, using Bradley-Terry and Thurstone loss functions. Our architecture leads to a significant improvement in predicting both class and comparison labels, increasing classification AUC by as much as 35% and comparison AUC by as much as 6% on several real-life datasets. We further show that incorporating comparisons makes training from few samples possible: a deep neural network with 5.9 million parameters trained on only 80 images attains a 0.92 AUC.
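To make the joint training idea concrete, below is a minimal sketch of a siamese setup that shares one scoring network between a classification head and a Bradley-Terry comparison loss, where P(a ranked above b) = sigmoid(s(a) - s(b)). This is an illustrative PyTorch-style assumption, not the authors' exact architecture: the encoder size, the loss weighting alpha, and the helper names (SiameseClassifier, bradley_terry_loss, joint_loss) are hypothetical.

```python
# Sketch only: illustrates joint class + comparison training with a shared scorer.
import torch
import torch.nn as nn

class SiameseClassifier(nn.Module):
    """Shared encoder producing a scalar score s(x); the same score feeds the
    classification loss (via a sigmoid) and the comparison loss (via score differences)."""
    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.encoder(x).squeeze(-1)  # one scalar score per sample

def bradley_terry_loss(score_a, score_b, comp_label):
    """Bradley-Terry comparison loss: P(a > b) = sigmoid(s(a) - s(b));
    comp_label is 1 if sample a is ranked above sample b, else 0."""
    return nn.functional.binary_cross_entropy_with_logits(score_a - score_b, comp_label)

def joint_loss(model, x_cls, y_cls, x_a, x_b, y_comp, alpha=0.5):
    """Weighted sum of classification and comparison losses (alpha is an assumed knob)."""
    cls = nn.functional.binary_cross_entropy_with_logits(model(x_cls), y_cls)
    cmp_ = bradley_terry_loss(model(x_a), model(x_b), y_comp)
    return alpha * cls + (1 - alpha) * cmp_

# Illustrative usage on random data.
model = SiameseClassifier(in_dim=16)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x, y = torch.randn(32, 16), torch.randint(0, 2, (32,)).float()
xa, xb = torch.randn(32, 16), torch.randn(32, 16)
yc = torch.randint(0, 2, (32,)).float()
loss = joint_loss(model, x, y, xa, xb, yc)
loss.backward()
opt.step()
```

A Thurstone variant would replace the sigmoid link with the Gaussian CDF applied to the score difference; both branches share the same encoder, which is what lets the scarce class labels benefit from the comparison labels.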

Keywords: Classification; Comparison; Joint learning; Neural network; Siamese network.

Publication types

  • Comparative Study

MeSH terms

  • Databases, Factual / classification*
  • Neural Networks, Computer*