Divergence measures and a general framework for local variational approximation

Neural Netw. 2011 Dec;24(10):1102-9. doi: 10.1016/j.neunet.2011.06.004. Epub 2011 Jun 15.

Abstract

The local variational method is a technique for approximating an intractable posterior distribution in Bayesian learning. This article formulates a general framework for local variational approximation and shows that its objective function decomposes into the sum of the Kullback information and the expected Bregman divergence from the approximating posterior distribution to the Bayesian posterior distribution. Based on a geometrical argument in the space of approximating posteriors, we propose an efficient method to evaluate an upper bound on the marginal likelihood. Moreover, we demonstrate that the variational Bayesian approach for latent variable models can be viewed as a special case of this general framework.
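
The stated decomposition can be written schematically as below. This is a hedged sketch in notation of our own choosing (F, \bar{F}(\xi), q_\xi, and D_\phi are not the paper's symbols, and the exact arguments of the Bregman divergence are specified in the paper itself):

    % Schematic form of the decomposition; notation ours, not the paper's.
    % y: data, w: parameter, p(w|y): Bayesian posterior,
    % F = -\log p(y): free energy (negative log marginal likelihood),
    % \bar{F}(\xi): the bound produced by the local variational method,
    % q_\xi(w): the approximating posterior induced by the local bound.
    \bar{F}(\xi) - F
      = \mathrm{KL}\bigl( q_\xi(w) \,\big\|\, p(w \mid y) \bigr)
        + \mathbb{E}_{q_\xi}\bigl[ D_\phi \bigr],
    \qquad
    D_\phi(a, b) = \phi(a) - \phi(b) - \nabla \phi(b)^{\top} (a - b)
    % D_\phi is the Bregman divergence generated by a convex function \phi.

Since both terms are nonnegative, \bar{F}(\xi) never falls below F, and minimizing the gap trades off posterior approximation error (the Kullback term) against the looseness of the local bound (the Bregman term).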

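A standard concrete instance of the method the abstract refers to is the Jaakkola-Jordan bound for Bayesian logistic regression, which matches the logistic-model and normal-distribution MeSH terms below. The following is a minimal NumPy sketch, not code from the paper: the Gaussian prior w ~ N(0, alpha^{-1} I), all function names, and the synthetic data are our illustrative assumptions, and only the usual lower bound on the log marginal likelihood is evaluated (the paper's upper-bound method is not reproduced here).

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lam(xi):
        # lambda(xi) = tanh(xi / 2) / (4 * xi), with limit 1/8 as xi -> 0
        xi = np.asarray(xi, dtype=float)
        out = np.full(xi.shape, 0.125)
        nz = np.abs(xi) > 1e-8
        out[nz] = np.tanh(xi[nz] / 2.0) / (4.0 * xi[nz])
        return out

    def jaakkola_jordan(X, t, alpha=1.0, n_iter=50):
        # Local variational approximation for Bayesian logistic regression
        # with prior w ~ N(0, alpha^{-1} I) and labels t in {0, 1}.
        # Returns the Gaussian approximate posterior N(m, S), the variational
        # parameters xi, and a lower bound on log p(t | X).
        n, d = X.shape
        xi = np.ones(n)
        for _ in range(n_iter):
            # The quadratic local bound on each log-sigmoid term makes the
            # approximate posterior an exact Gaussian N(m, S).
            L = lam(xi)
            S_inv = alpha * np.eye(d) + 2.0 * (X.T * L) @ X
            S = np.linalg.inv(S_inv)
            m = S @ (X.T @ (t - 0.5))
            # Closed-form update of the variational parameters:
            # xi_n^2 = x_n^T (S + m m^T) x_n
            A = S + np.outer(m, m)
            xi = np.sqrt(np.einsum("nd,de,ne->n", X, A, X))
        # Lower bound on the log marginal likelihood; its gap to the true
        # value is what the decomposition above analyzes.
        bound = (0.5 * np.linalg.slogdet(S)[1]
                 + 0.5 * d * np.log(alpha)
                 + 0.5 * m @ S_inv @ m
                 + np.sum(np.log(sigmoid(xi)) - xi / 2.0 + lam(xi) * xi**2))
        return m, S, xi, bound

    # Minimal usage on synthetic data
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))
    t = (rng.uniform(size=200) < sigmoid(X @ np.array([1.5, -2.0, 0.5]))).astype(float)
    m, S, xi, bound = jaakkola_jordan(X, t)
    print("posterior mean:", m)
    print("lower bound on log marginal likelihood:", bound)

The loop alternates exact Gaussian updates of q with bound-maximizing updates of xi, the EM-like coordinate scheme that tightens the bound at each sweep.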
MeSH terms

  • Algorithms*
  • Artificial Intelligence*
  • Bayes Theorem*
  • Logistic Models
  • Neural Networks, Computer*
  • Normal Distribution
  • Regression Analysis