Relaxed conditions for radial-basis function networks to be universal approximators

Neural Netw. 2003 Sep;16(7):1019-28. doi: 10.1016/S0893-6080(02)00227-7.

Abstract

In this paper, we investigate the universal approximation property of Radial Basis Function (RBF) networks. We show that RBFs are not required to be integrable for RBF networks to be universal approximators. Instead, RBF networks can uniformly approximate any continuous function on a compact set provided that the radial basis activation function is continuous almost everywhere, locally essentially bounded, and not a polynomial. The approximation in L^p(μ) space (1 ≤ p < ∞) is also discussed. Some experimental results are reported to illustrate our findings.
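A minimal numerical sketch of the claim, not taken from the paper: it fits an RBF network whose basis function is the multiquadric φ(r) = sqrt(1 + r²), which is not integrable but is continuous, locally bounded, and not a polynomial, to a continuous target on the compact set [-1, 1]. The centers, width, target function, and least-squares fitting procedure are all illustrative assumptions, not the authors' construction.

```python
import numpy as np

def multiquadric(r):
    # Non-integrable radial basis function: continuous, locally essentially
    # bounded, and not a polynomial (the paper's relaxed conditions).
    return np.sqrt(1.0 + r**2)

def rbf_design_matrix(x, centers, width):
    # Phi[i, j] = phi(|x_i - c_j| / width)
    return multiquadric(np.abs(x[:, None] - centers[None, :]) / width)

# Target: a continuous (but not smooth) function on the compact set [-1, 1].
target = lambda x: np.sin(3.0 * x) + np.abs(x)

x_train = np.linspace(-1.0, 1.0, 200)
centers = np.linspace(-1.0, 1.0, 30)   # fixed grid of centers (illustrative)
width = 0.2                            # common width (illustrative)

# Fit the output weights by linear least squares.
Phi = rbf_design_matrix(x_train, centers, width)
weights, *_ = np.linalg.lstsq(Phi, target(x_train), rcond=None)

# Evaluate the uniform (sup-norm) error on a finer grid of the compact set.
x_test = np.linspace(-1.0, 1.0, 1000)
approx = rbf_design_matrix(x_test, centers, width) @ weights
print("max |f - f_hat| on [-1, 1]:", np.max(np.abs(approx - target(x_test))))
```

Increasing the number of centers drives the printed sup-norm error toward zero, which is the uniform-approximation behavior the abstract describes.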

MeSH terms

  • Neural Networks, Computer*