A new constant memory recursion for hidden Markov models

J Comput Biol. 2014 Feb;21(2):99-117. doi: 10.1089/cmb.2013.0096. Epub 2013 Oct 25.

Abstract

We develop the recursion for hidden Markov (HM) models proposed by Bartolucci and Besag (2002), and we show how it may be used to implement an estimation algorithm for these models that requires an amount of memory that does not depend on the length of the observed series of data. This recursion yields the conditional distribution of the latent state at each time occasion, given the previous state and the observed data. Compared with the estimation algorithm based on the well-known Baum-Welch recursions, whose memory requirement grows with the sample size, the proposed algorithm also has the advantage of not requiring ad hoc renormalizations to avoid numerical underflow. Moreover, it directly allows us to perform global decoding of the latent sequence of states, without the need for the Viterbi algorithm and with a considerable reduction in memory requirement compared with the latter. The proposed approach is compared, in terms of computing time and memory requirement, with the algorithm based on the Baum-Welch recursions and with the so-called linear memory algorithm of Churbanov and Winters-Hilt. The comparison also relies on a series of simulations involving an HM model for continuous time-series data.
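To make the target quantities concrete, the following is a minimal Python sketch for a hypothetical two-state Gaussian HM model (the parameters, the function name conditional_recursions, and the use of scipy.stats.norm for the emission density are all illustrative assumptions, not taken from the paper). It computes, via standard scaled backward probabilities, the conditional distributions p(u_t | u_{t-1}, y_{1:T}) that the abstract describes; note that this textbook implementation stores the backward probabilities and therefore uses memory growing with T, which is precisely the overhead the paper's constant memory recursion is designed to avoid.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# --- Toy 2-state Gaussian HM model (all parameters are illustrative) ---
pi = np.array([0.5, 0.5])             # initial state probabilities
Pi = np.array([[0.9, 0.1],            # transition matrix: Pi[u, v] = p(u_{t+1}=v | u_t=u)
               [0.2, 0.8]])
mu = np.array([0.0, 3.0])             # state-dependent means
sd = np.array([1.0, 1.0])             # state-dependent standard deviations

def conditional_recursions(y):
    """Return q[0] = p(u_1 | y_{1:T}) and, for t >= 2,
    q[t-1][u, v] = p(u_t = v | u_{t-1} = u, y_{1:T}),
    computed from scaled backward probabilities (O(T) memory)."""
    T, k = len(y), len(pi)
    f = norm.pdf(y[:, None], mu, sd)       # f[t, u] = p(y_t | u_t = u)
    beta = np.ones((T, k))                 # scaled backward probabilities
    for t in range(T - 2, -1, -1):
        b = Pi @ (f[t + 1] * beta[t + 1])
        beta[t] = b / b.sum()              # rescale: only ratios matter below
    q = [None] * T
    q0 = pi * f[0] * beta[0]
    q[0] = q0 / q0.sum()                   # p(u_1 | y_{1:T})
    for t in range(1, T):
        num = Pi * (f[t] * beta[t])        # num[u, v] ∝ p(u_t=v | u_{t-1}=u, y_{1:T})
        q[t] = num / num.sum(axis=1, keepdims=True)
    return q

# Simulate data from the model, then draw one state path from the posterior
# p(u_1, ..., u_T | y_{1:T}) by sampling the conditionals forward in time.
T = 200
states = np.zeros(T, dtype=int)
states[0] = rng.choice(2, p=pi)
for t in range(1, T):
    states[t] = rng.choice(2, p=Pi[states[t - 1]])
y = rng.normal(mu[states], sd[states])

q = conditional_recursions(y)
path = np.zeros(T, dtype=int)
path[0] = rng.choice(2, p=q[0])
for t in range(1, T):
    path[t] = rng.choice(2, p=q[t][path[t - 1]])
print("agreement with true path:", np.mean(path == states))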

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Linear Models
  • Markov Chains*
  • Models, Statistical