A new adaptive backpropagation algorithm based on Lyapunov stability theory for neural networks

IEEE Trans Neural Netw. 2006 Nov;17(6):1580-91. doi: 10.1109/TNN.2006.880360.

Abstract

A new adaptive backpropagation (BP) algorithm based on Lyapunov stability theory for neural networks is developed in this paper. A candidate Lyapunov function V(k) of the tracking error between the output of the neural network and the desired reference signal is chosen first, and the weights of the network are then updated, from the output layer to the input layer, so that ΔV(k) = V(k) - V(k-1) < 0. The output tracking error then converges asymptotically to zero according to Lyapunov stability theory. Unlike gradient-based BP training algorithms, the new Lyapunov adaptive BP algorithm does not search for the global minimum along the cost-function surface in weight space; instead, it constructs an energy surface with a single global minimum through adaptive adjustment of the weights as time goes to infinity. Even when the input of the neural network is corrupted by bounded disturbances, the effects of the disturbances are eliminated and asymptotic error convergence is still obtained. The new Lyapunov adaptive BP algorithm is then applied to the design of an adaptive filter in a simulation example, which demonstrates fast error convergence and strong robustness to large bounded input disturbances.
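To make the core idea concrete, the following is a minimal sketch of the Lyapunov-based update for the simplest case, a single-layer linear adaptive filter, under assumptions not taken from the paper: the function name lyapunov_adaptive_filter, the contraction factor kappa, and the regularizer eps are all illustrative, and the paper's actual algorithm extends this layer-by-layer through a multilayer network. The weight correction is chosen so that the a-posteriori error contracts as e(k) = kappa * e(k-1) with 0 < kappa < 1, which makes V(k) = e(k)^2 satisfy ΔV(k) < 0 by construction rather than by gradient descent.

```python
import numpy as np

def lyapunov_adaptive_filter(x_seq, d_seq, kappa=0.5, eps=1e-8):
    """Sketch of a Lyapunov-stability-based adaptive linear filter.

    The weight update is chosen so that the a-posteriori error
    satisfies e(k) = kappa * e(k-1) with 0 < kappa < 1. Then
    V(k) = e(k)^2 is a Lyapunov function of the tracking error:
    V(k) - V(k-1) = (kappa^2 - 1) * e(k-1)^2 < 0 whenever
    e(k-1) != 0, so the error converges asymptotically to zero.
    (Names and defaults here are illustrative, not from the paper.)
    """
    n = x_seq.shape[1]
    w = np.zeros(n)            # filter weights
    e_prev = 0.0               # previous (a-posteriori) tracking error
    errors = []
    for x, d in zip(x_seq, d_seq):
        # A-priori error with the current weights.
        e_pri = d - x @ w
        # Correction forcing the new error toward kappa * e_prev,
        # which guarantees V(k) - V(k-1) < 0; eps avoids division
        # by zero for an all-zero input vector.
        w = w + x / (x @ x + eps) * (e_pri - kappa * e_prev)
        e_prev = d - x @ w     # equals kappa * e_prev up to eps
        errors.append(e_prev)
    return w, np.array(errors)

# Usage: identify an unknown 4-tap filter from noisy measurements.
rng = np.random.default_rng(0)
w_true = np.array([0.8, -0.4, 0.2, 0.1])
X = rng.normal(size=(500, 4))
d = X @ w_true + 0.01 * rng.normal(size=500)   # bounded disturbance
w_est, errs = lyapunov_adaptive_filter(X, d)
print(w_est, abs(errs[-1]))
```

Note the contrast with gradient-based BP: no step size is tuned against a cost surface; the update solves directly for a weight change that enforces the error contraction, which is what yields the fast convergence and disturbance robustness claimed in the abstract.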

MeSH terms

  • Algorithms*
  • Information Storage and Retrieval / methods*
  • Information Theory*
  • Neural Networks, Computer*
  • Pattern Recognition, Automated / methods*
  • Signal Processing, Computer-Assisted*
  • Systems Theory