Data classification based on fractional order gradient descent with momentum for RBF neural network

Network. 2020 Feb-Nov;31(1-4):166-185. doi: 10.1080/0954898X.2020.1849842. Epub 2020 Dec 6.

Abstract

Weight-updating methods play an important role in improving the performance of neural networks. To ameliorate the oscillation that arises when training a radial basis function (RBF) neural network, a fractional order gradient descent with momentum method for updating the weights of an RBF neural network (FOGDM-RBF) is proposed for data classification, and its convergence is proved. To speed up convergence, an adaptive learning rate is used to adjust the training process. The Iris and MNIST data sets are used to test the proposed algorithm. The results confirm its theoretical properties, such as monotonicity and convergence. Non-parametric statistical tests, including the Friedman test and the Quade test, are applied to compare the proposed algorithm with other algorithms. The influence of the fractional order, learning rate and batch size is analysed and compared. Error analysis shows that the algorithm effectively accelerates the convergence of gradient descent and improves its performance with high accuracy and validity.
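The abstract describes a fractional order gradient descent with momentum update. As a rough illustration of the general idea (not the paper's exact formulation), the sketch below uses a common Caputo-type approximation in which the ordinary gradient is scaled by |w - c|^(1-α) / Γ(2-α), where c is a reference point (here the previous iterate), and combines it with a classical momentum term. All function names, defaults, and the toy quadratic objective are illustrative assumptions.

```python
import math

def fogdm_step(w, v, grad, w_ref, alpha=0.9, lr=0.05, momentum=0.8, eps=1e-8):
    """One fractional-order gradient descent with momentum step (illustrative).

    The ordinary gradient is scaled by |w - w_ref|^(1 - alpha) / Gamma(2 - alpha),
    a common approximation of the Caputo fractional derivative of order alpha.
    With alpha = 1 this reduces to plain gradient descent with momentum.
    This is a sketch of the general technique, not the paper's algorithm.
    """
    scale = (abs(w - w_ref) + eps) ** (1.0 - alpha) / math.gamma(2.0 - alpha)
    v = momentum * v + lr * grad * scale   # momentum accumulates the scaled gradient
    return w - v, v

# Toy stand-in for RBF weight training: minimise f(w) = (w - 3)^2 from w = 0.
w, v, w_ref = 0.0, 0.0, 0.0
for _ in range(200):
    grad = 2.0 * (w - 3.0)
    w_prev = w
    w, v = fogdm_step(w, v, grad, w_ref, alpha=0.9)
    w_ref = w_prev  # lower terminal of the fractional derivative tracks the previous iterate
print(round(w, 4))
```

Setting `alpha` closer to 1 recovers ordinary momentum gradient descent, while smaller values damp the step as the iterates cluster, which is one intuition for the reduced oscillation the abstract reports.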

Keywords: Neural network; data classification; fractional order; gradient descent; momentum.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Databases, Factual / classification*
  • Humans
  • Neural Networks, Computer*