Merging weighted SVMs for parallel incremental learning

Neural Netw. 2018 Apr:100:25-38. doi: 10.1016/j.neunet.2018.01.001. Epub 2018 Feb 2.

Abstract

Parallel incremental learning is an effective approach for rapidly processing large-scale data streams, where parallel and incremental learning are often treated as two separate problems and solved one after another. Incremental learning can be implemented by merging knowledge from incoming data, and parallel learning can be performed by merging knowledge from simultaneous learners. We propose to solve the two learning problems simultaneously with a single process of knowledge merging, realized as parallel incremental wESVM (weighted Extreme Support Vector Machine). Here, wESVM is reformulated such that knowledge from subsets of training data can be merged via simple matrix addition. As such, the proposed algorithm can conduct parallel incremental learning by merging knowledge over data slices arriving at each incremental stage. Both theoretical and experimental studies show the equivalence of the proposed algorithm to batch wESVM in terms of learning effectiveness. In particular, the algorithm demonstrates the desired scalability and clear speed advantages over batch retraining.
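The "knowledge merging via simple matrix addition" idea can be sketched for a least-squares-style weighted model: the per-chunk weighted Gram matrix and moment vector are additive across data slices, so accumulating them and solving the regularized system once recovers the batch solution. This is a minimal illustrative sketch under that assumption, not the paper's actual wESVM implementation; all names and the regularization form here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def chunk_stats(H, y, w):
    # Per-chunk "knowledge": weighted Gram matrix and moment vector.
    # These are the quantities that merge by plain matrix addition.
    A = H.T @ (w[:, None] * H)
    b = H.T @ (w * y)
    return A, b

# Toy data split into 3 incremental slices (illustrative sizes).
H = rng.standard_normal((90, 5))      # feature / hidden-layer outputs
y = np.sign(rng.standard_normal(90))  # +/-1 class labels
w = rng.uniform(0.5, 2.0, size=90)    # per-sample weights
C = 10.0                              # regularization constant (assumed form)

# Batch solution on all data at once.
A_full, b_full = chunk_stats(H, y, w)
beta_batch = np.linalg.solve(np.eye(5) / C + A_full, b_full)

# Merged solution: accumulate per-slice knowledge by matrix addition.
A_sum, b_sum = np.zeros((5, 5)), np.zeros(5)
for idx in np.split(np.arange(90), 3):
    A_i, b_i = chunk_stats(H[idx], y[idx], w[idx])
    A_sum += A_i
    b_sum += b_i
beta_merged = np.linalg.solve(np.eye(5) / C + A_sum, b_sum)

# The merged model equals the batch model, mirroring the claimed
# equivalence to batch retraining.
assert np.allclose(beta_batch, beta_merged)
```

Because the merge is a sum, the slices can also be processed by simultaneous learners and combined in any order, which is what lets one merging operation serve both the parallel and the incremental setting.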

Keywords: Extreme support vector machine (ESVM); Incremental learning; Knowledge merging; Parallel incremental learning; Parallel learning; Weighted ESVM (wESVM).

MeSH terms

  • Algorithms
  • Knowledge
  • Learning
  • Supervised Machine Learning* / trends
  • Support Vector Machine* / trends