Regularization in matrix relevance learning

IEEE Trans Neural Netw. 2010 May;21(5):831-40. doi: 10.1109/TNN.2010.2042729. Epub 2010 Mar 15.

Abstract

In this paper, we present a regularization technique to extend recently proposed matrix learning schemes in learning vector quantization (LVQ). These learning algorithms extend the concept of adaptive distance measures in LVQ to the use of relevance matrices. In general, metric learning can display a tendency towards oversimplification in the course of training. An overly pronounced elimination of dimensions in feature space can degrade performance and may lead to instabilities in training. We focus on matrix learning in generalized LVQ (GLVQ). Extending the cost function by an appropriate regularization term prevents this unfavorable behavior and can help to improve the generalization ability. The approach is first tested and illustrated on artificial model data. Furthermore, we apply the scheme to benchmark classification data sets from the UCI Machine Learning Repository. We demonstrate the usefulness of regularization also in the case of rank-limited relevance matrices, i.e., matrix learning with an implicit, low-dimensional representation of the data.
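The sketch below illustrates the general idea in Python: a GLVQ-style cost with an adaptive distance parametrized by a relevance matrix, extended by a regularization term. The log-determinant penalty, the parameter names (Omega, beta), and the sigmoidal scaling are illustrative assumptions, not the authors' exact formulation, which is given in the full paper.

```python
# Illustrative sketch of matrix relevance learning (GMLVQ-style) with a
# regularizer on the relevance matrix. The exact regularization term here
# (-beta/2 * ln det(Omega Omega^T)) is an assumption for illustration.
import numpy as np

def relevance_distance(x, w, Omega):
    """Adaptive squared distance d_Lambda(x, w) = (x - w)^T Lambda (x - w),
    with Lambda = Omega^T Omega guaranteeing positive semi-definiteness."""
    diff = Omega @ (x - w)
    return diff @ diff

def regularized_glvq_cost(X, y, prototypes, proto_labels, Omega, beta=0.1):
    """GLVQ cost over a data set (X: samples, y: labels, proto_labels: ndarray
    of prototype labels), plus a hypothetical log-determinant regularizer that
    keeps Omega Omega^T away from singularity, counteracting the
    over-pronounced elimination of feature dimensions described above."""
    cost = 0.0
    for x, label in zip(X, y):
        d = np.array([relevance_distance(x, w, Omega) for w in prototypes])
        d_correct = d[proto_labels == label].min()   # closest prototype with the correct label
        d_wrong = d[proto_labels != label].min()     # closest prototype with a wrong label
        mu = (d_correct - d_wrong) / (d_correct + d_wrong)
        cost += 1.0 / (1.0 + np.exp(-mu))            # sigmoidal scaling of the classifier term
    # Regularization term: penalizes (near-)singular Omega Omega^T.
    sign, logdet = np.linalg.slogdet(Omega @ Omega.T)
    return cost - 0.5 * beta * logdet
```

For a rank-limited relevance matrix, Omega can be chosen rectangular (m x n with m < n), so that Omega Omega^T is an m x m matrix and the regularizer remains well defined for the implicit low-dimensional representation mentioned in the abstract.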

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Feedback*
  • Humans
  • Learning / physiology*
  • Neural Networks, Computer*