Improved GWO and its application in parameter optimization of Elman neural network

PLoS One. 2023 Jul 7;18(7):e0288071. doi: 10.1371/journal.pone.0288071. eCollection 2023.

Abstract

Traditional neural networks use gradient descent methods to train the network structure, which cannot handle complex optimization problems. We proposed an improved grey wolf optimizer (SGWO) to explore a better network structure. GWO was improved with circle population initialization, an information interaction mechanism, and adaptive position updating to enhance the search performance of the algorithm. SGWO was then applied to optimize the Elman network structure, yielding a new prediction method (SGWO-Elman). The convergence of SGWO was analyzed by mathematical theory, and the optimization ability of SGWO and the prediction performance of SGWO-Elman were examined in comparative experiments. The results show that: (1) the global convergence probability of SGWO is 1, and its search process is a finite homogeneous Markov chain with an absorbing state; (2) SGWO not only achieves better optimization performance when solving complex functions of different dimensions, but also, when applied to Elman parameter optimization, significantly improves the network structure, so that SGWO-Elman delivers accurate predictions.
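The abstract names three modifications to GWO but gives no formulas. The sketch below shows the standard GWO position update (each wolf moves toward the average of positions proposed by the alpha, beta, and delta leaders) seeded with a circle chaotic-map initialization, one common way to realize "circle population initialization". The map parameters a=0.5, b=0.2, the test function, and all hyperparameters are assumptions; the paper's information interaction mechanism and adaptive position update are not reproduced here.

```python
import numpy as np

def circle_init(pop_size, dim, lb, ub, a=0.5, b=0.2, seed=0):
    """Circle chaotic-map initialization (a common variant; the paper's
    exact map and parameters are not given in the abstract, so a=0.5 and
    b=0.2 are assumptions). Maps chaotic values in [0, 1) to [lb, ub]."""
    rng = np.random.default_rng(seed)
    x = rng.random(pop_size)          # independent chaotic seed per wolf
    pop = np.empty((pop_size, dim))
    for j in range(dim):
        # circle map: x <- (x + b - (a / 2*pi) * sin(2*pi*x)) mod 1
        x = np.mod(x + b - (a / (2 * np.pi)) * np.sin(2 * np.pi * x), 1.0)
        pop[:, j] = lb + x * (ub - lb)
    return pop

def gwo(f, dim=5, pop_size=30, iters=100, lb=-10.0, ub=10.0, seed=0):
    """Standard GWO minimizing f, with circle-map initialization."""
    rng = np.random.default_rng(seed)
    wolves = circle_init(pop_size, dim, lb, ub, seed=seed)
    fit = np.apply_along_axis(f, 1, wolves)
    for t in range(iters):
        order = np.argsort(fit)
        # copy the three best wolves so in-place updates don't alias them
        leaders = [wolves[order[k]].copy() for k in range(3)]
        a = 2.0 * (1 - t / iters)     # control parameter decays 2 -> 0
        for i in range(pop_size):
            X = np.zeros(dim)
            for leader in leaders:
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                # candidate position proposed by this leader
                X += leader - A * np.abs(C * leader - wolves[i])
            wolves[i] = np.clip(X / 3.0, lb, ub)  # average of the three
            fit[i] = f(wolves[i])
    best = int(np.argmin(fit))
    return wolves[best], fit[best]
```

Usage: on a smooth test function such as the sphere function, `gwo(lambda x: np.sum(x**2))` drives the best fitness close to zero within a few dozen iterations, illustrating the exploration-exploitation balance that the paper's modifications aim to improve further.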

MeSH terms

  • Algorithms*
  • Neural Networks, Computer*

Grants and funding

This work was supported by the National Natural Science Foundation of China from Guangwei Liu under Grant number 51974144, the Liaoning Provincial Department of Education Project from Wei Liu under Grant number LJKZ0340, and the discipline innovation team of Liaoning Technical University from Guangwei Liu and Wei Liu under Grant numbers LNTU20TD-01 and LNTU20TD-07. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.