Incremental learning algorithm for large-scale semi-supervised ordinal regression

Neural Netw. 2022 May:149:124-136. doi: 10.1016/j.neunet.2022.02.004. Epub 2022 Feb 11.

Abstract

As a special case of multi-class classification, ordinal regression (also known as ordinal classification) is a popular method for tackling multi-class problems in which the samples are labeled with a set of ranks. Semi-supervised ordinal regression (SSOR) is especially important for data-mining applications because semi-supervised learning can exploit unlabeled samples to train a high-quality model. However, to the best of our knowledge, training SSOR at large scale remains an open problem due to its complicated formulation and non-convexity. To address this challenging problem, we propose an incremental learning algorithm for SSOR (IL-SSOR), which directly updates the solution of SSOR based on the Karush-Kuhn-Tucker (KKT) conditions. More importantly, we analyze the finite convergence of IL-SSOR, which guarantees that SSOR converges to a local minimum under the framework of the concave-convex procedure (CCCP). To the best of our knowledge, the proposed algorithm is the first efficient online learning algorithm for SSOR with a local-minimum convergence guarantee. Experimental results show that IL-SSOR achieves better generalization than other semi-supervised multi-class algorithms, and that it matches the generalization of other semi-supervised ordinal regression algorithms with less running time.
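To make the convergence argument concrete, below is a minimal, self-contained sketch of the concave-convex procedure that the abstract refers to. It is not the paper's IL-SSOR solver: the decomposition of the objective into a convex part J_vex and a concave part J_cav, the toy quartic objective, and the use of a generic SciPy minimizer in place of the incremental KKT-based update are all illustrative assumptions. Each outer iteration linearizes the concave part at the current iterate and minimizes the resulting convex upper bound, so the objective is monotonically non-increasing, which is the basis of finite-convergence analyses of CCCP-type methods.

```python
# Minimal CCCP sketch (illustrative only, not the IL-SSOR algorithm).
# Assumes J(w) = j_vex(w) + j_cav(w) with j_vex convex and j_cav concave.
import numpy as np
from scipy.optimize import minimize

def cccp(j_vex, grad_cav, w0, max_iter=50, tol=1e-6):
    """Iterate w_{t+1} = argmin_v j_vex(v) + <grad_cav(w_t), v>."""
    w = np.asarray(w0, dtype=float)
    for _ in range(max_iter):
        g = grad_cav(w)  # tangent of the concave part at the current iterate
        # Convex subproblem: minimize the convex upper bound (up to a constant).
        w_new = minimize(lambda v: j_vex(v) + g @ v, w).x
        converged = np.linalg.norm(w_new - w) < tol
        w = w_new
        if converged:
            break
    return w

# Toy objective J(w) = (w.w)^2 + (1 - w.w): a convex quartic plus a
# concave quadratic. Its non-zero stationary points satisfy ||w||^2 = 1/2.
j_vex = lambda w: float(np.dot(w, w)) ** 2
grad_cav = lambda w: -2.0 * w  # gradient of the concave part 1 - ||w||^2
w_star = cccp(j_vex, grad_cav, w0=[2.0, -1.0])
print(np.dot(w_star, w_star))  # squared norm is approximately 0.5
```

In IL-SSOR, per the abstract, the inner convex subproblem would not be re-solved from scratch; instead, the solution would be updated incrementally via the KKT conditions as samples arrive, which is where the claimed efficiency gain comes from.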

Keywords: Concave–convex procedure algorithm; Incremental learning; Path following algorithm; Semi-supervised ordinal regression.

MeSH terms

  • Algorithms*
  • Data Mining
  • Supervised Machine Learning*