DRRNets: Dynamic Recurrent Routing via Low-Rank Regularization in Recurrent Neural Networks

IEEE Trans Neural Netw Learn Syst. 2023 Apr;34(4):2057-2067. doi: 10.1109/TNNLS.2021.3105818. Epub 2023 Apr 4.

Abstract

Recurrent neural networks (RNNs) continue to show outstanding performance in sequence learning tasks such as language modeling, but training RNNs on long sequences remains difficult. The main challenges lie in complex dependencies, vanishing or exploding gradients, and the demand for low resource consumption in model deployment. To address these challenges, we propose dynamic recurrent routing neural networks (DRRNets), which: 1) shorten recurrent path lengths by dynamically allocating recurrent routes for different dependencies and 2) significantly reduce the number of parameters by imposing low-rank constraints on the fully connected layers. A novel optimization algorithm based on low-rank constraints and sparsity projection is developed to train the network. We verify the effectiveness of the proposed method by comparing it with multiple competitive approaches on several popular sequence learning tasks, such as language modeling and speaker recognition. Results under multiple evaluation criteria demonstrate the superiority of the proposed method.
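
For illustration only, below is a minimal sketch of how a low-rank constraint on a recurrent weight matrix can reduce the parameter count, assuming a plain vanilla RNN cell implemented in PyTorch. The class name, rank choice, and initialization are illustrative assumptions and do not reproduce the DRRNet routing mechanism or the optimization algorithm described in the paper.

import torch
import torch.nn as nn

class LowRankRNNCell(nn.Module):
    # Vanilla RNN cell whose hidden-to-hidden matrix is parameterized as a
    # product of two thin factors, W_hh ~= U @ V with rank r << hidden_size.
    # Illustrative sketch of low-rank parameterization, not the DRRNet model.
    def __init__(self, input_size, hidden_size, rank):
        super().__init__()
        self.W_ih = nn.Linear(input_size, hidden_size, bias=True)
        # Low-rank factors replace the dense hidden-to-hidden matrix:
        # parameters drop from hidden_size**2 to 2 * hidden_size * rank.
        self.U = nn.Parameter(torch.randn(hidden_size, rank) * 0.01)
        self.V = nn.Parameter(torch.randn(rank, hidden_size) * 0.01)

    def forward(self, x, h):
        # h @ V.T projects the hidden state to rank r; multiplying by U.T
        # lifts it back to the full hidden dimension.
        recurrent = (h @ self.V.t()) @ self.U.t()
        return torch.tanh(self.W_ih(x) + recurrent)

# Usage: unroll the cell over a sequence of shape (seq_len, batch, input_size).
cell = LowRankRNNCell(input_size=64, hidden_size=256, rank=16)
x = torch.randn(100, 8, 64)
h = torch.zeros(8, 256)
for t in range(x.size(0)):
    h = cell(x[t], h)

With hidden_size = 256 and rank = 16, the recurrent weights shrink from 256 x 256 = 65,536 parameters to 2 x 256 x 16 = 8,192, which conveys the kind of compression a low-rank constraint on fully connected layers makes possible.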