Segmented-memory recurrent neural networks

IEEE Trans Neural Netw. 2009 Aug;20(8):1267-80. doi: 10.1109/TNN.2009.2022980. Epub 2009 Jul 14.

Abstract

Conventional recurrent neural networks (RNNs) have difficulties in learning long-term dependencies. To tackle this problem, we propose an architecture called segmented-memory recurrent neural network (SMRNN). A symbolic sequence is broken into segments, and its symbols are presented to the SMRNN one per cycle. The SMRNN uses separate internal states to store symbol-level and segment-level context. The symbol-level context is updated with each input symbol, while the segment-level context is updated only at the end of each segment. The SMRNN is trained using an extended real-time recurrent learning algorithm. We test the performance of the SMRNN on the information latching problem, the "two-sequence problem," and the problem of protein secondary structure (PSS) prediction. Our experimental results indicate that the SMRNN performs better on long-term dependency problems than conventional RNNs. In addition, we analyze theoretically how the segmented memory of the SMRNN helps in learning long-term temporal dependencies, and we study the impact of the segment length.
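
To make the two-timescale update concrete, here is a minimal NumPy sketch of the segmented-memory idea as described in the abstract. It is not the paper's exact formulation: the weight names, layer sizes, tanh update equations, the segment length `d`, and the per-segment reset of the symbol-level state are all illustrative assumptions, and the extended real-time recurrent learning training procedure is omitted.

```python
# Illustrative sketch of a segmented-memory forward pass (assumptions noted
# above); weights are random, and training (e.g., extended RTRL) is omitted.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_sym, n_seg = 8, 16, 16   # input, symbol-level, segment-level sizes
d = 5                            # assumed segment length (symbols per segment)

W_xs = rng.normal(0, 0.1, (n_sym, n_in))    # input -> symbol-level state
W_ss = rng.normal(0, 0.1, (n_sym, n_sym))   # symbol-level recurrence
W_sg = rng.normal(0, 0.1, (n_seg, n_sym))   # symbol-level -> segment-level
W_gg = rng.normal(0, 0.1, (n_seg, n_seg))   # segment-level recurrence

def smrnn_forward(sequence):
    """Process one symbol per cycle; return the final segment-level state."""
    s = np.zeros(n_sym)   # symbol-level context, updated every symbol
    g = np.zeros(n_seg)   # segment-level context, updated every d symbols
    for t, x in enumerate(sequence, start=1):
        s = np.tanh(W_xs @ x + W_ss @ s)         # per-symbol update
        if t % d == 0:                           # segment boundary reached
            g = np.tanh(W_sg @ s + W_gg @ g)     # per-segment update
            s = np.zeros(n_sym)                  # assumed: reset symbol context
    return g

# Example: 20 one-hot "symbols", i.e., 4 segments of length d = 5.
seq = [np.eye(n_in)[rng.integers(n_in)] for _ in range(20)]
print(smrnn_forward(seq))
```

Because the segment-level state receives only one update per segment rather than one per symbol, the gradient path across a long sequence is shortened by roughly a factor of the segment length, which is the intuition behind the improved handling of long-term dependencies.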

MeSH terms

  • Algorithms
  • Artificial Intelligence
  • Forecasting / methods
  • Humans
  • Memory*
  • Neural Networks, Computer*
  • Protein Structure, Secondary
  • Proteins / chemistry
  • Proteins / genetics
  • Sequence Analysis, Protein

Substances

  • Proteins