LS-SVR as a Bayesian RBF Network

IEEE Trans Neural Netw Learn Syst. 2020 Oct;31(10):4389-4393. doi: 10.1109/TNNLS.2019.2952000. Epub 2019 Dec 11.

Abstract

We show theoretical similarities between the least squares support vector regression (LS-SVR) model with a radial basis function (RBF) kernel and maximum a posteriori (MAP) inference on a Bayesian RBF network with a specific Gaussian prior on the regression weights. Although previous articles have pointed out similarities between the expressions arising in these two learning approaches, we explicitly and formally state the correspondence. We empirically demonstrate our result with computational experiments on standard regression benchmarks. Our findings open a range of possibilities to improve LS-SVR by borrowing strength from well-established developments in Bayesian methodology.
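A minimal sketch of the kind of correspondence described above, not necessarily in the authors' notation: the bias term is omitted for brevity, the RBF centers are placed at the training inputs, and the "specific" Gaussian prior is assumed here to have covariance equal to the inverse of the kernel Gram matrix K. Under these assumptions the LS-SVR dual coefficients and the MAP weights of the Bayesian RBF network coincide.

```latex
% LS-SVR primal for data (x_i, y_i), i = 1,...,n (bias omitted):
\min_{w,\,e}\; \tfrac{1}{2}\, w^\top w + \tfrac{\gamma}{2}\sum_{i=1}^{n} e_i^2
\quad \text{s.t.} \quad y_i = w^\top \varphi(x_i) + e_i .
% With RBF kernel k(x,x') = \varphi(x)^\top \varphi(x') and Gram matrix K,
% the dual solution is
\hat{\alpha} = \bigl(K + \gamma^{-1} I\bigr)^{-1} y,
\qquad f(x) = \sum_{i=1}^{n} \hat{\alpha}_i\, k(x, x_i).

% Bayesian RBF network with centers at the training inputs,
% Gaussian likelihood y \mid w \sim \mathcal{N}(K w,\, \gamma^{-1} I),
% and (assumed) Gaussian prior w \sim \mathcal{N}(0,\, K^{-1}):
\hat{w}_{\mathrm{MAP}}
  = \arg\min_{w}\; \tfrac{\gamma}{2}\,\lVert y - K w \rVert^2
    + \tfrac{1}{2}\, w^\top K w
  = \bigl(K + \gamma^{-1} I\bigr)^{-1} y
  = \hat{\alpha}.
```

Setting the gradient of the MAP objective to zero gives $K(\gamma K + I)w = \gamma K y$, so for invertible $K$ the MAP weights equal the LS-SVR dual coefficients; the exact form of the prior treated in the paper may differ (e.g., in how the bias and hyperparameters are handled).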