Incremental learning by message passing in hierarchical temporal memory

Neural Comput. 2014 Aug;26(8):1763-809. doi: 10.1162/NECO_a_00617. Epub 2014 May 30.

Abstract

Hierarchical temporal memory (HTM) is a biologically inspired framework that can be used to learn invariant representations of patterns in a wide range of applications. Classical HTM learning is mainly unsupervised, and once training is completed, the network structure is frozen, which makes further training (i.e., incremental learning) problematic. In this letter, we develop a novel technique for HTM (incremental) supervised learning based on gradient descent error minimization. We prove that error backpropagation can be naturally and elegantly implemented through native HTM message passing based on belief propagation. Our experimental results demonstrate that a two-stage training approach, composed of unsupervised pretraining followed by supervised refinement, is very effective (both accurate and efficient). This is in line with recent findings on other deep architectures.
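
The two-stage scheme the abstract describes (unsupervised pretraining of the hierarchy, then supervised refinement by gradient descent) can be illustrated with a deliberately generic sketch. The code below is not the paper's HTM/belief-propagation algorithm: k-means prototypes stand in for the unsupervised lower-level learning, a Gaussian-similarity feature map stands in for bottom-up messages, and a softmax readout trained by gradient descent stands in for the supervised refinement stage. All names and parameters (k, beta, lr, etc.) are illustrative assumptions.

```python
# Minimal sketch of a two-stage scheme: unsupervised pretraining followed by
# supervised gradient-descent refinement. This is a generic stand-in, not the
# HTM message-passing algorithm from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two Gaussian blobs with labels 0/1.
X = np.vstack([rng.normal(-1.0, 0.5, size=(100, 2)),
               rng.normal(+1.0, 0.5, size=(100, 2))])
y = np.array([0] * 100 + [1] * 100)

# --- Stage 1: unsupervised pretraining (plain k-means prototypes) ---
def kmeans(X, k, iters=50):
    centers = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        assign = d.argmin(axis=1)
        for j in range(k):
            pts = X[assign == j]
            if len(pts):
                centers[j] = pts.mean(axis=0)
    return centers

centers = kmeans(X, k=8)

def features(X, centers, beta=2.0):
    # Soft, normalized similarity to each prototype (a rough analogue of a
    # bottom-up "evidence" message).
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    e = np.exp(-beta * d ** 2)
    return e / e.sum(axis=1, keepdims=True)

Phi = features(X, centers)          # (n_samples, n_prototypes)

# --- Stage 2: supervised refinement by gradient descent ---
# A softmax readout over the pretrained features; new labeled data could be
# folded in later with further gradient steps (incremental refinement).
W = np.zeros((Phi.shape[1], 2))
Y = np.eye(2)[y]                    # one-hot labels
lr = 0.5
for epoch in range(200):
    logits = Phi @ W
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    grad = Phi.T @ (P - Y) / len(X) # cross-entropy gradient
    W -= lr * grad

acc = (np.argmax(Phi @ W, axis=1) == y).mean()
print(f"training accuracy after refinement: {acc:.2f}")
```

In the paper itself, the supervised gradients are not computed by an external backpropagation pass: the key result is that the required error signals can be propagated with the same belief-propagation message-passing machinery HTM already uses, which is what makes incremental supervised refinement fit naturally into the framework.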

MeSH terms

  • Algorithms
  • Information Theory
  • Memory*
  • Neural Networks, Computer*
  • Pattern Recognition, Automated