Phase transition for parameter learning of hidden Markov models

Phys Rev E. 2021 Oct;104(4-1):044105. doi: 10.1103/PhysRevE.104.044105.

Abstract

We study a phase transition in parameter learning of hidden Markov models (HMMs). We generate sequences of observed symbols from given discrete HMMs with uniformly distributed transition probabilities and a noise level encoded in the output probabilities. We then apply the Baum-Welch (BW) algorithm, an expectation-maximization algorithm from machine learning, to estimate the parameters of each investigated HMM realization. We study HMMs with n = 4, 8, and 16 states. By varying the amount of accessible learning data and the noise level, we observe a phase-transition-like change in the performance of the learning algorithm. For larger HMMs and more learning data, learning improves dramatically below a certain threshold in the noise strength; above the threshold, learning is not possible. Furthermore, we apply an overlap parameter to the results of a maximum a posteriori (Viterbi) algorithm to investigate the accuracy of the hidden-state estimation around the phase transition.
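As a minimal sketch of the kind of numerical experiment the abstract describes (not the authors' code), the Python snippet below builds one HMM realization with uniformly drawn transition probabilities and a noise-parameterized emission matrix, fits a fresh model with Baum-Welch, and compares the Viterbi-decoded states with the true hidden states. The specific emission-noise parameterization, the use of hmmlearn's CategoricalHMM, and the permutation-maximized overlap are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from itertools import permutations
from hmmlearn import hmm  # assumes hmmlearn >= 0.3 with CategoricalHMM

rng = np.random.default_rng(0)

n_states = 4       # paper studies n = 4, 8, 16
noise = 0.1        # assumed noise level in the output probabilities
seq_len = 10_000   # amount of learning data

# --- Build a "true" HMM realization ---
# Transition probabilities drawn uniformly and normalized row-wise.
A = rng.random((n_states, n_states))
A /= A.sum(axis=1, keepdims=True)
# Assumed emission parameterization: state i emits symbol i with
# probability 1 - noise, and any other symbol with noise / (n - 1).
B = np.full((n_states, n_states), noise / (n_states - 1))
np.fill_diagonal(B, 1.0 - noise)
pi = np.full(n_states, 1.0 / n_states)

true_model = hmm.CategoricalHMM(n_components=n_states)
true_model.startprob_ = pi
true_model.transmat_ = A
true_model.emissionprob_ = B

# --- Generate an observation sequence and its hidden states ---
obs, hidden = true_model.sample(seq_len, random_state=0)

# --- Baum-Welch (EM) fit of a fresh model to the observations ---
learner = hmm.CategoricalHMM(n_components=n_states, n_iter=200,
                             tol=1e-4, random_state=1)
learner.fit(obs)

# --- Viterbi decoding and a simple overlap measure ---
# Overlap here: fraction of time steps where the decoded state matches
# the true one, maximized over relabelings of the learned states
# (a naive stand-in for the paper's overlap parameter; for large n a
# Hungarian assignment would replace the brute-force permutation scan).
_, decoded = learner.decode(obs, algorithm="viterbi")
overlap = max(
    np.mean(np.array(perm)[decoded] == hidden)
    for perm in permutations(range(n_states))
)
print(f"state overlap after relabeling: {overlap:.3f}")
```

Sweeping `noise` and `seq_len` and averaging such an overlap (and the parameter-estimation error) over many independent HMM realizations would produce the type of scan in which the abstract reports the phase-transition-like behavior.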