On Wang $k$ WTA With Input Noise, Output Node Stochastic, and Recurrent State Noise

IEEE Trans Neural Netw Learn Syst. 2018 Sep;29(9):4212-4222. doi: 10.1109/TNNLS.2017.2759905. Epub 2017 Oct 27.

Abstract

In this paper, the effects of input noise, output node stochasticity, and recurrent state noise on the Wang $k$ WTA network are analyzed. We assume that noise exists at the recurrent state $y(t)$ and that it can be either additive or multiplicative; in addition, the dynamical change of the state (i.e., $dy/dt$) is corrupted by noise as well. We therefore model the dynamics of $y(t)$ as a stochastic differential equation and show that the stochastic behavior of $y(t)$ is equivalent to an Ito diffusion. Its stationary distribution is a Gibbs distribution, whose modality depends on the noise conditions. With moderate input noise and very small recurrent state noise, the distribution is unimodal, and hence $y(\infty)$ lies, with high probability, between the input values of the $k$th and $(k+1)$th winners (i.e., correct output). With small input noise and large recurrent state noise, the distribution can be multimodal, and hence $y(\infty)$ may, with nonnegligible probability, fall outside the input values of the $k$th and $(k+1)$th winners (i.e., incorrect output). In this regard, we further derive conditions under which the $k$ WTA gives the correct output with high probability. Our results reveal that recurrent state noise can have a severe effect on the Wang $k$ WTA, whereas input noise and output node stochasticity can alleviate this effect.
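To make the setting concrete, the following is a minimal Euler–Maruyama sketch of the commonly cited form of Wang's $k$ WTA state dynamics, $dy/dt = \sum_i H(x_i - y) - k$ with $H$ the Heaviside step, augmented with an additive Brownian term standing in for the paper's recurrent state noise. The function name, step size, iteration count, and noise level are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def wang_kwta_sde(x, k, sigma_state=0.005, dt=0.01, steps=20000, seed=0):
    """Euler-Maruyama simulation of the recurrent state y(t).

    Assumed deterministic model: dy/dt = sum_i H(x_i - y) - k, whose
    equilibrium y lies between the k-th and (k+1)-th largest inputs.
    sigma_state adds additive recurrent state noise as a Brownian
    increment, turning the ODE into an SDE (an Ito diffusion).
    """
    rng = np.random.default_rng(seed)
    y = 0.0
    for _ in range(steps):
        drift = np.sum(x > y) - k  # Heaviside outputs minus k
        y += drift * dt + sigma_state * np.sqrt(dt) * rng.standard_normal()
    winners = x > y  # inputs whose output nodes are "on" at the end
    return y, winners

x = np.array([0.9, 0.7, 0.5, 0.3, 0.1])
y_inf, winners = wang_kwta_sde(x, k=2)
```

With this small state noise, $y(\infty)$ settles between the 2nd and 3rd largest inputs (here, between 0.5 and 0.7), matching the "correct output" regime of the abstract; raising `sigma_state` lets $y$ wander outside that interval, illustrating the incorrect-output regime.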

Publication types

  • Research Support, Non-U.S. Gov't