Stochastic complexities of general mixture models in variational Bayesian learning

Neural Netw. 2007 Mar;20(2):210-9. doi: 10.1016/j.neunet.2006.05.030. Epub 2006 Aug 10.

Abstract

In this paper, we focus on variational Bayesian learning of general mixture models. Variational Bayesian learning was proposed as an approximation to Bayesian learning. While it has provided computational tractability and good generalization performance in many applications, little work has investigated its theoretical properties. The main contribution of this paper is the asymptotic form of the stochastic complexity, or the free energy, in variational Bayesian learning of mixtures of exponential-family distributions. We show that these stochastic complexities become smaller than those of regular statistical models, which implies that the advantages of Bayesian learning are retained in variational Bayesian learning. Moreover, the derived bounds indicate how the hyperparameters influence the learning process and how accurately variational Bayesian learning approximates true Bayesian learning.
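For context, the quantities discussed above can be written out using standard definitions; the following is a minimal sketch in our own notation (the prior φ(w), likelihood p(x|w), component count d, and coefficient λ are not taken from the abstract itself):

    % Bayesian stochastic complexity (free energy) for data X^n = (x_1, ..., x_n):
    F(X^n) = -\log \int \varphi(w) \prod_{i=1}^{n} p(x_i \mid w) \, dw

    % Variational Bayesian learning minimizes an upper bound on F(X^n)
    % over a restricted (e.g. factorized) family of trial posteriors q(w):
    \bar{F}(X^n) = \min_{q} \left[ \mathbb{E}_{q}\!\left[ \log \frac{q(w)}{\varphi(w)} \right]
      - \sum_{i=1}^{n} \mathbb{E}_{q}\!\left[ \log p(x_i \mid w) \right] \right] \ge F(X^n)

    % Asymptotically the (average) stochastic complexity behaves as
    %   \bar{F}(n) = \lambda \log n + O(1),
    % where for a regular statistical model with d parameters the BIC value
    % \lambda = d/2 applies; a smaller coefficient \lambda, as obtained for
    % mixture models, is what "smaller stochastic complexity" refers to.

Note that the gap \bar{F}(X^n) - F(X^n) equals the Kullback-Leibler divergence from the optimal trial posterior to the true Bayesian posterior, which is one way to read the abstract's remark on the accuracy of variational Bayes as an approximation of true Bayesian learning.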

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animals
  • Artificial Intelligence*
  • Bayes Theorem*
  • Humans
  • Learning*
  • Neural Networks, Computer*
  • Stochastic Processes*