Recurrent transform learning

Neural Netw. 2019 Oct:118:271-279. doi: 10.1016/j.neunet.2019.07.003. Epub 2019 Jul 15.

Abstract

Recurrent neural networks (RNN) model time series by feeding back the representation from the previous time instant as an input for the current instant, along with exogenous inputs. RNNs have two main shortcomings: (1) the problem of vanishing gradients when backpropagating through time, and (2) the inability to learn in an unsupervised manner. Variants such as the long short-term memory (LSTM) network and gated recurrent units (GRU) have partially circumvented the first issue; the second remains. In this work we propose a new variant of RNN based on the transform learning model, named recurrent transform learning (RTL). It can learn in an unsupervised, supervised and semi-supervised fashion; it does not require backpropagation and hence does not suffer from the pitfalls of vanishing gradients. The proposed model is applied to a real-life problem of short-term load forecasting, where we show that RTL improves over existing variants of RNN as well as over a state-of-the-art load forecasting technique based on sparse coding.
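To make the idea concrete, below is a minimal, hypothetical sketch of how a recurrent transform-learning model could be fit by alternating minimization rather than backpropagation through time. The specific coupling z_t ≈ T x_t + R z_{t-1}, the ridge-regularized transform updates, and the hard-thresholding sparsity step are illustrative assumptions, not the paper's exact formulation (transform learning typically uses a closed-form, log-determinant-regularized transform update, which is simplified here).

```python
# Hypothetical sketch of recurrent transform learning (RTL) via alternating
# minimization. All modelling choices below are assumptions for illustration.
import numpy as np

def hard_threshold(Z, k):
    """Keep the k largest-magnitude entries in each column of Z, zero the rest."""
    out = np.zeros_like(Z)
    keep = np.argsort(-np.abs(Z), axis=0)[:k, :]
    cols = np.arange(Z.shape[1])
    out[keep, cols] = Z[keep, cols]
    return out

def rtl_fit(X, latent_dim=16, sparsity=4, lam=0.1, n_iter=50, seed=0):
    """Unsupervised fit. X is (n_inputs, n_timesteps); column t holds the
    exogenous input x_t (e.g., past load and weather readings)."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    T = 0.1 * rng.standard_normal((latent_dim, d))           # analysis transform
    R = 0.1 * rng.standard_normal((latent_dim, latent_dim))  # recurrent transform
    Z = np.zeros((latent_dim, n))                            # sparse codes z_1..z_n
    for _ in range(n_iter):
        # 1) Coefficient update: z_t = H_k(T x_t + R z_{t-1}), computed sequentially,
        #    so the previous representation is fed back as in an RNN.
        z_prev = np.zeros(latent_dim)
        for t in range(n):
            z_t = T @ X[:, t] + R @ z_prev
            Z[:, t] = hard_threshold(z_t[:, None], sparsity).ravel()
            z_prev = Z[:, t]
        # 2) Transform updates: ridge least squares (a simplification of the
        #    closed-form transform-learning update).
        Zprev = np.hstack([np.zeros((latent_dim, 1)), Z[:, :-1]])
        T = (Z - R @ Zprev) @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(d))
        R = (Z - T @ X) @ Zprev.T @ np.linalg.inv(Zprev @ Zprev.T + lam * np.eye(latent_dim))
    return T, R, Z

# Example usage on synthetic data (8 features, 200 time steps):
# T, R, Z = rtl_fit(np.random.randn(8, 200))
```

Because each step is a thresholding or least-squares update rather than gradient descent through an unrolled sequence, this style of training sidesteps backpropagation through time, which is the property the abstract highlights for RTL.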

Keywords: Demand forecasting; Dynamical model; Load forecasting; Transform learning.

MeSH terms

  • Forecasting
  • Machine Learning*
  • Neural Networks, Computer*