Delay-based reservoir computing: noise effects in a combined analog and digital implementation

IEEE Trans Neural Netw Learn Syst. 2015 Feb;26(2):388-93. doi: 10.1109/TNNLS.2014.2311855.

Abstract

Reservoir computing is a paradigm in machine learning whose processing capabilities rely on the dynamical behavior of recurrent neural networks. We present a mixed analog and digital implementation of this concept with a nonlinear analog electronic circuit as the main computational unit. In our approach, the reservoir network can be replaced by a single nonlinear element with delay via time-multiplexing. We analyze the influence of noise on the performance of the system for two benchmark tasks: 1) a classification problem and 2) a chaotic time-series prediction task. Special attention is given to the role of quantization noise, which is studied by varying the resolution in the conversion interface between the analog and digital worlds.
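The delay-based scheme described above can be sketched in simulation: a single nonlinear node driven by a time-multiplexed (masked) input, with its own delayed state providing the recurrent coupling, and a uniform quantizer standing in for the analog-to-digital conversion interface. This is a minimal illustrative sketch, not the authors' circuit model; the node nonlinearity (`tanh`), the number of virtual nodes, the gain parameters, and the quantizer range are all assumptions made for the example.

```python
import numpy as np

N_VIRTUAL = 50   # virtual nodes along the delay line (assumed)
N_BITS = 8       # quantizer resolution in bits (assumed)

def quantize(x, bits, lo=-1.0, hi=1.0):
    """Uniform quantizer modeling the analog/digital conversion interface."""
    levels = 2 ** bits
    x = np.clip(x, lo, hi)
    q = np.round((x - lo) / (hi - lo) * (levels - 1))
    return q / (levels - 1) * (hi - lo) + lo

def delay_reservoir(u, mask, eta=0.4, gamma=0.8, bits=N_BITS):
    """Time-multiplexed reservoir: one nonlinear node with delayed feedback.

    Each scalar input u[k] is stretched into len(mask) masked samples;
    each virtual node is updated from the masked input plus the state it
    held one delay period earlier, then passed through the quantizer.
    """
    n = len(mask)
    x = np.zeros(n)                    # virtual-node states (delay line)
    states = np.empty((len(u), n))
    for k, uk in enumerate(u):
        J = mask * uk                  # time-multiplexed input sample
        for i in range(n):
            # nonlinear node: input drive + delayed feedback,
            # followed by quantization noise at the interface
            x[i] = np.tanh(eta * J[i] + gamma * x[i])
            x[i] = quantize(x[i], bits)
        states[k] = x
    return states

# Example usage: drive the reservoir with a random input sequence.
rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, size=200)
mask = rng.choice([-0.5, 0.5], size=N_VIRTUAL)   # random binary input mask
S = delay_reservoir(u, mask)                     # shape (200, N_VIRTUAL)
```

In practice a linear readout (e.g. ridge regression) would be trained on the rows of `S`; lowering `bits` coarsens the states and lets one probe how quantization noise degrades task performance, which is the effect studied in the paper.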

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Artificial Intelligence*
  • Computer Simulation*
  • Data Interpretation, Statistical
  • Neural Networks, Computer*
  • Nonlinear Dynamics