Bayesian model search for mixture models based on optimizing variational bounds

Neural Netw. 2002 Dec;15(10):1223-41. doi: 10.1016/s0893-6080(02)00040-0.

Abstract

When learning a mixture model, we face two problems: local optima and determination of the model structure. In this paper, we present a method for simultaneously solving these problems based on the variational Bayesian (VB) framework. First, within the VB framework, we derive an objective function that can simultaneously optimize both the model parameter distributions and the model structure. Next, focusing on mixture models, we present a deterministic algorithm that approximately optimizes the objective function by using the split and merge operations we previously proposed within the maximum likelihood framework. Finally, we apply the method to mixture of experts (MoE) models and show experimentally that the proposed method can find the optimal number of experts of an MoE while avoiding local maxima.
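The abstract's central idea is to use an optimizable lower bound both to fit a mixture and to score candidate model structures produced by split/merge moves. The sketch below is not the paper's algorithm: it fits a one-dimensional Gaussian mixture (unit variances, free weights) with plain EM and uses a BIC-style penalized log-likelihood as a crude stand-in for the true variational free energy; the greedy split search, the offset of 0.5 used to initialize split components, and the toy data are all illustrative assumptions.

```python
import math
import random

random.seed(0)

# Toy 1-D data drawn from three well-separated unit-variance Gaussians.
data = ([random.gauss(-5, 1) for _ in range(60)]
        + [random.gauss(0, 1) for _ in range(60)]
        + [random.gauss(5, 1) for _ in range(60)])


def em_fit(xs, means, iters=50):
    """Plain EM for a 1-D Gaussian mixture with fixed unit variances.

    Returns (means, weights, bound), where 'bound' is a penalized
    log-likelihood standing in for the variational lower bound.
    """
    k = len(means)
    weights = [1.0 / k] * k
    for _ in range(iters):
        # E-step: responsibilities of each component for each point.
        resp = []
        for x in xs:
            ps = [w * math.exp(-0.5 * (x - m) ** 2) / math.sqrt(2 * math.pi)
                  for w, m in zip(weights, means)]
            s = sum(ps)
            resp.append([p / s for p in ps])
        # M-step: re-estimate means and mixing weights.
        nk = [sum(r[j] for r in resp) for j in range(k)]
        means = [sum(r[j] * x for r, x in zip(resp, xs)) / nk[j]
                 for j in range(k)]
        weights = [n / len(xs) for n in nk]
    # Log-likelihood minus a BIC-style complexity penalty (2 free
    # parameters per component: mean and weight) -- an illustrative
    # surrogate for the VB objective, not the paper's free energy.
    ll = sum(math.log(sum(w * math.exp(-0.5 * (x - m) ** 2)
                          / math.sqrt(2 * math.pi)
                          for w, m in zip(weights, means)))
             for x in xs)
    bound = ll - 0.5 * (2 * k) * math.log(len(xs))
    return means, weights, bound


# Greedy structure search: start from one component and keep splitting
# as long as the penalized bound improves. (The paper also uses merge
# moves and a deterministic candidate ordering; this sketch only splits.)
means, weights, bound = em_fit(data, [0.0])
while True:
    # Split candidate: replace the heaviest component by two copies
    # perturbed by +/- 0.5 (an arbitrary illustrative offset).
    j = max(range(len(means)), key=lambda i: weights[i])
    cand = means[:j] + [means[j] - 0.5, means[j] + 0.5] + means[j + 1:]
    new_means, new_weights, new_bound = em_fit(data, cand)
    if new_bound <= bound:
        break  # no split improves the bound; stop the search
    means, weights, bound = new_means, new_weights, new_bound

print("selected number of components:", len(means))
print("means:", sorted(round(m, 2) for m in means))
```

The accept/reject test on the bound is what lets a single criterion drive both parameter fitting and structure selection; in the paper that criterion is the VB lower bound itself, which (unlike the penalty used here) arises directly from the posterior approximation.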

MeSH terms

  • Algorithms*
  • Bayes Theorem*
  • Neural Networks, Computer*