Global Asymptotic Stability and Stabilization of Neural Networks With General Noise

IEEE Trans Neural Netw Learn Syst. 2018 Mar;29(3):597-607. doi: 10.1109/TNNLS.2016.2637567. Epub 2016 Dec 29.

Abstract

In the existing literature, neural networks (NNs) in a stochastic environment have been widely modeled as stochastic differential equations driven by white noise, such as Brownian motion or a Wiener process. However, these are not necessarily the best models for describing the dynamic behavior of NNs disturbed by nonwhite noise in some specific situations. In this paper, a general noise disturbance, which may be nonwhite, is introduced to NNs. Since NNs with nonwhite noise cannot be described by an Itô integral equation, a novel modeling method for stochastic NNs is utilized. Within a framework based on the random field approach and Lyapunov theory, the global asymptotic stability and stabilization of NNs with general noise are analyzed in probability and in the mean square, respectively. Criteria based on linear matrix inequalities are proposed for the systems under consideration. Examples are given to illustrate the effectiveness of the obtained results.
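
For context, the conventional white-noise model that the abstract contrasts against can be sketched as an Itô stochastic differential equation driven by a Wiener process w(t); the matrices C and A, the activation f, and the noise intensity σ below are generic placeholders, not the specific system studied in this paper:

    \begin{equation*}
      % Conventional stochastic NN model: an Itô SDE driven by a Wiener process w(t).
      % C (self-feedback), A (connection weights), f (activation), and \sigma (noise
      % intensity) are generic placeholders rather than this paper's model.
      \mathrm{d}x(t) = \bigl[ -C\,x(t) + A\,f\bigl(x(t)\bigr) \bigr]\,\mathrm{d}t
                       + \sigma\bigl(t, x(t)\bigr)\,\mathrm{d}w(t).
    \end{equation*}

In the general-noise setting considered here, the Wiener increment dw(t) is replaced by a possibly nonwhite disturbance, so the system can no longer be written as an Itô integral equation and is instead treated through the random field approach mentioned above.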

Publication types

  • Research Support, Non-U.S. Gov't