Regularized correntropy criterion based semi-supervised ELM

Neural Netw. 2020 Feb;122:117-129. doi: 10.1016/j.neunet.2019.09.030. Epub 2019 Oct 9.

Abstract

Along with the explosive growth of data, semi-supervised learning has attracted increasing attention in recent years owing to its powerful capability of labeling unlabeled data and mining knowledge. As an emerging method built on the extreme learning machine (ELM), the semi-supervised ELM (SSELM) has been developed for data classification and has shown advantages in learning efficiency and accuracy. However, the optimization of SSELM, like that of most other ELMs, is generally based on the mean square error (MSE) criterion, which has been shown to be less effective in dealing with non-Gaussian noise. In this paper, a robust regularized correntropy criterion based SSELM (RC-SSELM) is developed. The output weight matrix of RC-SSELM is optimized by a fixed-point iteration approach. A convergence analysis of the proposed RC-SSELM is presented based on the half-quadratic optimization technique. Experimental results on 4 synthetic datasets and 13 benchmark UCI datasets demonstrate the superiority of the proposed RC-SSELM over SSELM and other state-of-the-art methods.
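
To illustrate the idea described in the abstract, the following is a minimal sketch (not the authors' code) of a correntropy-weighted, regularized ELM whose output weights are obtained by a fixed-point (half-quadratic) iteration. All names and parameters (hidden_dim, sigma, lam, n_iter) are illustrative assumptions, and the graph-Laplacian term that makes RC-SSELM semi-supervised is omitted for brevity.

```python
import numpy as np

def rbf_hidden_layer(X, W, b):
    """Random-feature hidden layer with a sigmoid activation."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def correntropy_elm_fit(X, T, hidden_dim=50, sigma=1.0, lam=1e-2,
                        n_iter=20, rng=None):
    """Sketch of a correntropy-criterion ELM fit via fixed-point iteration."""
    rng = np.random.default_rng(rng)
    n_features = X.shape[1]
    # ELM: input weights and biases are random and stay fixed.
    W = rng.standard_normal((n_features, hidden_dim))
    b = rng.standard_normal(hidden_dim)
    H = rbf_hidden_layer(X, W, b)                  # N x L hidden output matrix

    beta = np.linalg.lstsq(H, T, rcond=None)[0]    # MSE solution as a start
    for _ in range(n_iter):
        # Half-quadratic / fixed-point step: Gaussian (correntropy) weights
        # down-weight samples with large residuals, unlike the MSE criterion.
        residual = np.sum((T - H @ beta) ** 2, axis=1)
        d = np.exp(-residual / (2.0 * sigma ** 2))           # per-sample weights
        Hd = H * d[:, None]
        beta = np.linalg.solve(H.T @ Hd + lam * np.eye(hidden_dim), Hd.T @ T)
    return W, b, beta

# Usage: binary classification with one-hot targets and injected label noise.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    T = np.eye(2)[y]                               # one-hot targets
    T[:10] = 1 - T[:10]                            # flip some labels (outliers)
    W, b, beta = correntropy_elm_fit(X, T, rng=1)
    pred = np.argmax(rbf_hidden_layer(X, W, b) @ beta, axis=1)
    print("training accuracy:", (pred == y).mean())
```

The Gaussian weighting step is what distinguishes the correntropy criterion from MSE: mislabeled or heavy-tailed samples receive exponentially small weights, so they contribute little to the regularized least-squares update of the output weights.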

Keywords: Extreme learning machine; Mean square error; Regularized correntropy criterion; Semi-supervised learning.

MeSH terms

  • Benchmarking
  • Supervised Machine Learning*