Online continual learning with declarative memory

Neural Netw. 2023 Jun;163:146-155. doi: 10.1016/j.neunet.2023.03.025. Epub 2023 Mar 27.

Abstract

Deep neural networks have enjoyed unprecedented attention and success in recent years. However, catastrophic forgetting undermines the performance of deep models when training data arrive sequentially in an online multi-task learning fashion. To address this issue, we propose a novel method named continual learning with declarative memory (CLDM). Our idea is inspired by the structure of human memory: declarative memory is a major component of long-term memory that helps human beings memorize past experiences and facts. In this paper, we formulate declarative memory as a task memory and an instance memory in neural networks to overcome catastrophic forgetting. Intuitively, the instance memory recalls the input-output relations (facts) of previous tasks; it is implemented by jointly rehearsing previous samples and learning the current task, as replay-based methods do. The task memory, in turn, captures long-term task correlation information across the task sequence to regularize the learning of the current task, thereby preserving task-specific weight realizations (experience) in the higher, task-specific layers. We implement a concrete instantiation of the proposed task memory using a recurrent unit. Extensive experiments on seven continual learning benchmarks verify that, by retaining information about both samples and tasks, the proposed method substantially outperforms previous approaches.
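The abstract only sketches the two memories, so the following is a minimal, hypothetical PyTorch sketch of how such components could be wired together: a reservoir-style replay buffer standing in for the instance memory, and a GRU cell whose hidden state predicts the task-specific head weights standing in for the task memory. The class names (InstanceMemory, TaskMemory), the weight-prediction coupling, the MSE regularizer, and all hyperparameters are illustrative assumptions, not the paper's exact formulation.

import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class InstanceMemory:
    """Replay buffer: recalls input-output pairs ("facts") from earlier tasks."""
    def __init__(self, capacity=500):
        self.capacity, self.buffer, self.seen = capacity, [], 0

    def add(self, x, y):
        self.seen += 1
        if len(self.buffer) < self.capacity:          # reservoir sampling
            self.buffer.append((x, y))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = (x, y)

    def sample(self, k):
        xs, ys = zip(*random.sample(self.buffer, min(k, len(self.buffer))))
        return torch.stack(xs), torch.stack(ys)

class TaskMemory(nn.Module):
    """Recurrent unit whose hidden state accumulates task-level information
    ("experience") and is used to regularize the task-specific head."""
    def __init__(self, summary_dim, hidden_dim, head_numel):
        # head_numel must equal the total number of parameters in the head.
        super().__init__()
        self.cell = nn.GRUCell(summary_dim, hidden_dim)
        self.to_head = nn.Linear(hidden_dim, head_numel)   # predicts head weights
        self.register_buffer("h", torch.zeros(1, hidden_dim))

    def step(self, task_summary):
        # Called once per finished task with a (1, summary_dim) summary vector,
        # e.g. the mean feature of that task's data.
        self.h = self.cell(task_summary, self.h).detach()

    def head_regularizer(self, head):
        # Pull the current head towards weights predicted from the task-memory state.
        target = self.to_head(self.h).view(-1)
        flat = torch.cat([p.reshape(-1) for p in head.parameters()])
        return F.mse_loss(flat, target)

def train_step(backbone, head, task_mem, inst_mem, opt, x, y, lam=0.1, replay_k=32):
    """One online step: rehearse stored samples jointly with the current batch
    and add the task-memory regularizer to the classification loss."""
    x_all, y_all = x, y
    if inst_mem.buffer:
        xr, yr = inst_mem.sample(replay_k)
        x_all, y_all = torch.cat([x, xr]), torch.cat([y, yr])
    logits = head(backbone(x_all))
    loss = F.cross_entropy(logits, y_all) + lam * task_mem.head_regularizer(head)
    opt.zero_grad()
    loss.backward()
    opt.step()
    for xi, yi in zip(x, y):                              # store current samples
        inst_mem.add(xi.detach(), yi.detach())
    return loss.item()

In this sketch, train_step would be called on each incoming online batch, and task_mem.step would be called once at the end of each task with a summary vector of that task's data; both the summary choice and the replay schedule are placeholders for whatever the paper actually specifies.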

Keywords: Catastrophic forgetting; Continual learning; Declarative memory; Deep neural networks; Long-term memory.

MeSH terms

  • Cognition
  • Humans
  • Learning*
  • Machine Learning
  • Mental Recall
  • Neural Networks, Computer*