Normalization Techniques in Training DNNs: Methodology, Analysis and Application

IEEE Trans Pattern Anal Mach Intell. 2023 Aug;45(8):10173-10196. doi: 10.1109/TPAMI.2023.3250241. Epub 2023 Jun 30.

Abstract

Normalization techniques are essential for accelerating the training and improving the generalization of deep neural networks (DNNs), and have been successfully used in various applications. This paper reviews and comments on the past, present and future of normalization methods in the context of DNN training. We provide a unified picture of the main motivation behind different approaches from the perspective of optimization, and present a taxonomy for understanding the similarities and differences between them. Specifically, we decompose the pipeline of the most representative normalizing activation methods into three components: normalization area partitioning, the normalization operation, and normalization representation recovery. In doing so, we provide insights for designing new normalization techniques. Finally, we discuss the current progress in understanding normalization methods, and provide a comprehensive review of the applications of normalization for particular tasks, in which it can effectively solve key issues.
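To make the three-component decomposition mentioned in the abstract concrete, the following is a minimal NumPy sketch using batch normalization as a representative example; the function name and shapes are illustrative assumptions, not code from the paper.

    import numpy as np

    def batch_norm_decomposed(x, gamma, beta, eps=1e-5):
        """Illustrative decomposition of batch normalization into the three
        components named in the abstract (hypothetical helper, not from the paper)."""
        # 1. Normalization area partitioning: batch normalization pools statistics
        #    over the batch and spatial axes, keeping one group per channel.
        axes = tuple(i for i in range(x.ndim) if i != 1)  # all axes except channel

        # 2. Normalization operation: standardize each channel to zero mean and
        #    unit variance using the pooled statistics.
        mean = x.mean(axis=axes, keepdims=True)
        var = x.var(axis=axes, keepdims=True)
        x_hat = (x - mean) / np.sqrt(var + eps)

        # 3. Normalization representation recovery: a learnable per-channel affine
        #    transform restores representational capacity.
        shape = [1, -1] + [1] * (x.ndim - 2)
        return gamma.reshape(shape) * x_hat + beta.reshape(shape)

    # Usage: a batch of 8 feature maps with 16 channels and 4x4 spatial size.
    x = np.random.randn(8, 16, 4, 4).astype(np.float32)
    gamma = np.ones(16, dtype=np.float32)
    beta = np.zeros(16, dtype=np.float32)
    y = batch_norm_decomposed(x, gamma, beta)

Under this framing, variants such as layer, instance, or group normalization would differ mainly in the first component (which axes the statistics are pooled over), while the standardization and affine-recovery steps stay largely the same.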

Publication types

  • Review

MeSH terms

  • Algorithms
  • Deep Learning*
  • Humans
  • Neural Networks, Computer*