Adaptive Dropout Method Based on Biological Principles

IEEE Trans Neural Netw Learn Syst. 2021 Sep;32(9):4267-4276. doi: 10.1109/TNNLS.2021.3070895. Epub 2021 Aug 31.

Abstract

Dropout is one of the most widely used methods for avoiding overfitting in neural networks. However, it rigidly and randomly drops neurons according to a fixed probability, which is inconsistent with the activation mode of neurons in the human cerebral cortex. Inspired by gene theory and the activation mechanism of brain neurons, we propose a more intelligent adaptive dropout, in which a variational autoencoder (VAE) is overlaid on an existing neural network to regularize its hidden neurons by adaptively setting their activities to zero. Through alternating iterative training, the dropout probability of each hidden neuron can be learned from the weights, effectively avoiding the shortcomings of the standard dropout method. Experimental results on multiple datasets show that this method suppresses overfitting in various neural networks better than standard dropout does. Additionally, this adaptive dropout technique can reduce the number of neurons and improve training efficiency.
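The contrast between fixed-probability dropout and per-neuron adaptive dropout can be sketched as follows. This is an illustrative sketch only: the paper's VAE component, which learns the per-neuron probabilities from the weights, is omitted here, and the `keep_probs` vector is supplied as a hypothetical input.

```python
import numpy as np

rng = np.random.default_rng(0)

def standard_dropout(h, p=0.5):
    # Standard dropout: every neuron is kept with the same fixed
    # probability 1 - p, regardless of its importance.
    mask = rng.random(h.shape) >= p
    return h * mask / (1.0 - p)  # inverted scaling keeps E[output] = h

def adaptive_dropout(h, keep_probs):
    # Adaptive dropout (sketch): each hidden neuron i has its own
    # keep probability keep_probs[i]. In the paper these are learned
    # via an auxiliary VAE during alternating training; here they
    # are simply given as a hypothetical per-neuron vector.
    mask = rng.random(h.shape) < keep_probs
    return h * mask / keep_probs

h = np.ones(8)                    # hidden-layer activations
keep = np.linspace(0.2, 0.9, 8)  # hypothetical learned keep probabilities
out = adaptive_dropout(h, keep)  # neurons with low keep_probs drop more often
```

Because each kept activation is rescaled by its own keep probability, the expected output still matches the undropped activation, while low-importance neurons are silenced more frequently.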

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Cerebral Cortex / physiology
  • Deep Learning
  • Humans
  • Models, Neurological
  • Neural Networks, Computer*
  • Neurons