Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting

IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2010-2022. doi: 10.1109/TNNLS.2021.3099700. Epub 2022 May 2.

Abstract

It is widely acknowledged that biological intelligence can learn continually without forgetting previously acquired skills. Unfortunately, it has been widely observed that many artificial intelligence techniques, especially (deep) neural network (NN)-based ones, suffer from the catastrophic forgetting problem: previously learned tasks are severely forgotten when a new one is learned. How to train NNs without catastrophic forgetting, termed continual learning, is emerging as a frontier topic and attracting considerable research interest. Inspired by the memory replay and synaptic consolidation mechanisms in the brain, in this article, we propose a novel and simple framework termed memory recall (MeRec) for continual learning with deep NNs. In particular, we first analyze feature stability across tasks in NNs and show that NNs can yield task-stable features in certain layers. Based on this observation, we use a memory module to keep the feature statistics (mean and standard deviation) for each learned task. Using these stored statistics, we show that a simple replay strategy with Gaussian distribution-based feature regeneration can recall and recover the knowledge of previous tasks. Together with weight regularization, MeRec preserves the weights learned from previous tasks. Based on this simple framework, MeRec achieves leading performance with an extremely small memory budget (only two feature vectors per class) for continual learning on the CIFAR-10 and CIFAR-100 datasets, reducing the accuracy drop after several tasks by at least 50% compared with previous state-of-the-art approaches.
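The abstract's core mechanism (keep only per-class feature means and standard deviations, then regenerate pseudo-features from a Gaussian for replay when training on a new task) can be illustrated with a minimal sketch. The snippet below is not the authors' code; the class and method names (FeatureMemory, replay_batch) and the use of PyTorch are assumptions made for illustration only.

```python
# Minimal sketch (assumption, not the authors' implementation) of Gaussian
# feature replay as described in the abstract: store one (mean, std) pair per
# class from the task-stable feature layer, and later sample pseudo-features
# from N(mean, std^2) to mix with new-task features.

import torch

class FeatureMemory:
    """Keeps one (mean, std) feature-statistics pair per class seen so far."""
    def __init__(self):
        self.stats = {}  # class_id -> (mean_vector, std_vector)

    def update(self, features: torch.Tensor, labels: torch.Tensor):
        # features: (N, D) features taken from an intermediate, task-stable layer
        for c in labels.unique().tolist():
            feats_c = features[labels == c]
            self.stats[c] = (feats_c.mean(dim=0), feats_c.std(dim=0))

    def replay_batch(self, samples_per_class: int):
        # Regenerate pseudo-features for every remembered class by Gaussian sampling.
        xs, ys = [], []
        for c, (mu, sigma) in self.stats.items():
            xs.append(mu + sigma * torch.randn(samples_per_class, mu.numel()))
            ys.append(torch.full((samples_per_class,), c, dtype=torch.long))
        return torch.cat(xs), torch.cat(ys)


# Usage sketch: regenerated old-task features would be mixed with new-task
# features when training the layers above the chosen feature layer.
if __name__ == "__main__":
    memory = FeatureMemory()
    old_feats = torch.randn(100, 64) + 2.0     # stand-in for task-1 features
    old_labels = torch.randint(0, 2, (100,))
    memory.update(old_feats, old_labels)

    replay_x, replay_y = memory.replay_batch(samples_per_class=32)
    print(replay_x.shape, replay_y.shape)      # (64, 64) pseudo-features, 64 labels
```

Storing only the two statistics vectors per class matches the abstract's stated memory budget of two feature vectors per class; the weight-regularization component mentioned in the abstract is not shown here.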

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Artificial Intelligence*
  • Brain
  • Learning
  • Memory
  • Neural Networks, Computer*