Block-cyclic stochastic coordinate descent for deep neural networks

Neural Netw. 2021 Jul;139:348-357. doi: 10.1016/j.neunet.2021.04.001. Epub 2021 Apr 19.

Abstract

We present a stochastic first-order optimization algorithm, named block-cyclic stochastic coordinate descent (BCSC), that adds a cyclic constraint to stochastic block-coordinate descent in the selection of both data and parameters. It uses different subsets of the data to update different subsets of the parameters, thus limiting the detrimental effect of outliers in the training set. Empirical tests on image classification benchmark datasets show that BCSC outperforms state-of-the-art optimization methods in generalization, leading to higher accuracy within the same number of update iterations. The improvements are consistent across different architectures and datasets, and BCSC can be combined with other training techniques and regularizations.
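To make the scheme in the abstract concrete, the sketch below gives one plausible reading of the method in PyTorch: the trainable parameter tensors are split into blocks, the data into an equal number of subsets, and a cyclic shift pairs each parameter block with a different data subset so that, over one full cycle, each subset updates each block exactly once. All names (bcsc_cycle, num_blocks, data_subsets, lr) are illustrative and not taken from the paper; this is an assumed minimal sketch, not the authors' implementation.

    import torch
    import torch.nn as nn

    def bcsc_cycle(model, loss_fn, data_subsets, num_blocks, lr=0.1):
        # Split the trainable parameter tensors into num_blocks coordinate blocks.
        params = [p for p in model.parameters() if p.requires_grad]
        blocks = [params[i::num_blocks] for i in range(num_blocks)]
        for shift in range(num_blocks):      # one full cycle over pairings
            for b in range(num_blocks):
                # Cyclic constraint (as assumed here): block b is updated with
                # data subset (b + shift) % num_blocks, so every subset updates
                # every block exactly once per cycle.
                x, y = data_subsets[(b + shift) % num_blocks]
                model.zero_grad()
                loss_fn(model(x), y).backward()
                with torch.no_grad():
                    for p in blocks[b]:      # plain SGD step on block b only
                        p -= lr * p.grad

    # Toy usage: two parameter blocks, two data subsets of random data.
    model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
    subsets = [(torch.randn(32, 8), torch.randint(0, 2, (32,))) for _ in range(2)]
    bcsc_cycle(model, nn.CrossEntropyLoss(), subsets, num_blocks=2)

Because only one parameter block moves per mini-batch, a data subset containing outliers can corrupt at most that block's update in a given step, which is one way to read the abstract's claim about limiting the effect of outliers.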

Keywords: Coordinate descent; Deep neural network; Energy optimization; Stochastic gradient descent.

MeSH terms

  • Benchmarking
  • Classification / methods
  • Datasets as Topic
  • Image Processing, Computer-Assisted / methods*
  • Image Processing, Computer-Assisted / standards
  • Neural Networks, Computer*
  • Pattern Recognition, Automated / methods*
  • Pattern Recognition, Automated / standards
  • Stochastic Processes