Bregman divergences for growing hierarchical self-organizing networks

Int J Neural Syst. 2014 Jun;24(4):1450016. doi: 10.1142/S0129065714500166. Epub 2014 Mar 27.

Abstract

Growing hierarchical self-organizing models are characterized by the flexibility of their structure, which can easily accommodate complex input datasets. However, most proposals use the Euclidean distance as the only error measure. Here we propose a way to introduce Bregman divergences into these models, based on stochastic approximation principles, so that more general distortion measures can be employed. A procedure is derived to compare the performance of networks using different divergences. Moreover, a probabilistic interpretation of the model is provided, which enables its use as a Bayesian classifier. Experimental results are presented for classification and data visualization applications, which show the advantages of these divergences over the classical Euclidean distance.
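For readers unfamiliar with the distortion measures the abstract refers to: a Bregman divergence is defined from any strictly convex, differentiable generator φ as D_φ(x, y) = φ(x) − φ(y) − ⟨∇φ(y), x − y⟩, and both the squared Euclidean distance and the generalized Kullback-Leibler divergence arise as special cases. The sketch below is purely illustrative (it is not the paper's implementation); the generator choices and vector values are assumptions for demonstration.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, x, y):
    # D_phi(x, y) = phi(x) - phi(y) - <grad phi(y), x - y>
    return phi(x) - phi(y) - np.dot(grad_phi(y), x - y)

# Generator phi(v) = ||v||^2 recovers the squared Euclidean distance.
sq_norm = lambda v: np.dot(v, v)
sq_norm_grad = lambda v: 2.0 * v

# Generator phi(v) = sum(v_i log v_i - v_i) recovers the generalized
# Kullback-Leibler divergence (illustrative choice, positive vectors only).
neg_entropy = lambda v: np.sum(v * np.log(v) - v)
neg_entropy_grad = lambda v: np.log(v)

# Hypothetical example vectors.
x = np.array([0.2, 0.5, 0.3])
y = np.array([0.3, 0.4, 0.3])

d_euc = bregman_divergence(sq_norm, sq_norm_grad, x, y)
d_kl = bregman_divergence(neg_entropy, neg_entropy_grad, x, y)
```

Here `d_euc` equals the familiar squared Euclidean distance between `x` and `y`, while `d_kl` equals the generalized KL divergence, showing how a single update rule parameterized by φ can cover the classical case and more general measures alike.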

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Humans
  • Models, Neurological*
  • Neural Networks, Computer*
  • Weights and Measures