Entropy-based incremental variational Bayes learning of Gaussian mixtures

IEEE Trans Neural Netw Learn Syst. 2012 Mar;23(3):534-40. doi: 10.1109/TNNLS.2011.2177670.

Abstract

Variational approaches to density estimation and pattern recognition with Gaussian mixture models make it possible to learn the model parameters and optimize its complexity simultaneously. In this brief, we develop an incremental entropy-based variational learning scheme that does not require any kind of initialization. The key element of the proposal is to exploit the incremental approach to perform model selection through efficient iteration over the variational Bayes optimization step, so that the number of splits is minimized. The method starts with a single component and iteratively adds new components by splitting the worst-fitted kernel, as measured by its entropy. Experimental results on synthetic and real data sets show the effectiveness of the approach, which outperforms other state-of-the-art incremental component learners.
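To make the split-and-refit loop described above concrete, the following is a minimal sketch of an entropy-guided incremental mixture learner. It is not the authors' algorithm: it substitutes scikit-learn's EM-based GaussianMixture for the variational Bayes optimization step, and the function names (entropy_gap, fit_incremental), the split heuristic (displacing the worst component's mean along its principal axis), and the stopping tolerance are illustrative assumptions rather than details from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def entropy_gap(X, w, mean, cov):
    """Responsibility-weighted empirical cross-entropy minus the closed-form
    Gaussian entropy; a large gap flags a poorly fitted component."""
    h_emp = -(w * multivariate_normal(mean, cov,
                                      allow_singular=True).logpdf(X)).sum() / w.sum()
    h_gauss = 0.5 * (len(mean) * (1.0 + np.log(2.0 * np.pi))
                     + np.linalg.slogdet(cov)[1])
    return h_emp - h_gauss

def fit_incremental(X, max_components=10, tol=1e-3, seed=0):
    """Start with one component; repeatedly split the worst-fitted one."""
    gmm = GaussianMixture(n_components=1, random_state=seed).fit(X)
    best_ll = gmm.score(X)  # mean log-likelihood per sample
    for k in range(1, max_components):
        resp = gmm.predict_proba(X)  # soft assignments, shape (n, k)
        gaps = [entropy_gap(X, resp[:, j], gmm.means_[j], gmm.covariances_[j])
                for j in range(k)]
        worst = int(np.argmax(gaps))
        # Split the worst component along its principal axis.
        evals, evecs = np.linalg.eigh(gmm.covariances_[worst])
        delta = np.sqrt(evals[-1]) * evecs[:, -1]
        means = np.vstack([gmm.means_, gmm.means_[worst] + delta])
        means[worst] = means[worst] - delta
        cand = GaussianMixture(n_components=k + 1, means_init=means,
                               random_state=seed).fit(X)
        ll = cand.score(X)
        if ll - best_ll < tol:  # the split did not help enough: stop growing
            break
        gmm, best_ll = cand, ll
    return gmm

# Toy usage: two well-separated clusters should yield two components.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-3, 1, size=(200, 2)),
               rng.normal(+3, 1, size=(200, 2))])
model = fit_incremental(X)
print("components:", model.n_components)
```

The sketch mirrors the structure of the proposal: no multi-component initialization is needed, growth is driven by the entropy of each kernel's fit, and growth stops when an additional split no longer improves the objective.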

Publication types

  • Research Support, Non-U.S. Gov't