Adaptive Neural Network Structure Optimization Algorithm Based on Dynamic Nodes

Curr Issues Mol Biol. 2022 Feb 7;44(2):817-832. doi: 10.3390/cimb44020056.

Abstract

Large-scale artificial neural networks often contain many redundant structures, which can trap the network in local optima and prolong training. Moreover, existing neural network topology optimization algorithms suffer from heavy computation and complex network structure modeling. To address these issues, we propose a Dynamic Node-based neural network Structure optimization algorithm (DNS). DNS consists of two steps: a generation step and a pruning step. In the generation step, the network adds hidden layers one at a time until its accuracy reaches a threshold. In the pruning step, the network then adapts its structure with a pruning algorithm based on Hebb's rule or the Pearson correlation coefficient. In addition, we combine DNS with a genetic algorithm (GA-DNS). Experimental results show that, compared with traditional neural network topology optimization algorithms, GA-DNS generates neural networks with higher construction efficiency, lower structural complexity, and higher classification accuracy.
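The abstract does not give the pruning rule in detail, but one plausible reading of Pearson-correlation-based pruning can be sketched as follows: treat two hidden units as redundant when their activations over a batch are highly correlated, and keep only one of each such pair. The function name `pearson_prune`, the threshold value, and the greedy keep-first policy below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def pearson_prune(activations, threshold=0.95):
    """Sketch of one Pearson-correlation pruning pass (assumed criterion:
    drop a hidden unit whose activations correlate above `threshold` with
    an earlier, retained unit, since the two are largely redundant)."""
    n_units = activations.shape[1]
    # Unit-by-unit Pearson correlation matrix over the batch dimension.
    corr = np.corrcoef(activations, rowvar=False)
    keep = []
    for j in range(n_units):
        # Keep unit j only if it is not strongly correlated with any kept unit.
        if all(abs(corr[j, k]) < threshold for k in keep):
            keep.append(j)
    return keep

# Toy example: unit 1 is a near-copy of unit 0; unit 2 is independent noise.
rng = np.random.default_rng(0)
a0 = rng.normal(size=200)
acts = np.column_stack([
    a0,
    a0 * 2.0 + 0.01 * rng.normal(size=200),  # redundant with unit 0
    rng.normal(size=200),                     # independent unit
])
print(pearson_prune(acts))  # unit 1 is pruned as redundant with unit 0
```

Under this assumed rule, the redundant copy (unit 1) is removed while the independent unit survives, mirroring the abstract's goal of lowering structural complexity without losing accuracy.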

Keywords: Adaptive Neural Network Structure; Hebb’s rule; Pearson correlation coefficient; genetic algorithm.