Weighted Incremental-Decremental Support Vector Machines for concept drift with shifting window

Neural Netw. 2022 Aug:152:528-541. doi: 10.1016/j.neunet.2022.05.018. Epub 2022 May 27.

Abstract

We study the problem of learning a data distribution that changes over time. This change, known as concept drift, complicates model training, as predictions become less and less accurate. It is known that Support Vector Machines (SVMs) can learn weighted input instances and that they can be trained online (incremental-decremental learning). Combining these two SVM properties leads to an open problem: defining an online SVM concept drift model with a shifting weighted window. A classic SVM must be retrained from scratch after each window shift. We introduce the Weighted Incremental-Decremental SVM (WIDSVM), a generalization of the incremental-decremental SVM to shifting windows. WIDSVM can learn from data streams with concept drift using the weighted shifting window technique. The soft-margin constrained optimization problem imposed on the shifting window is reduced to an incremental-decremental SVM. At each window shift, we determine the exact conditions for vector migration during the incremental-decremental process. Experiments on artificial and real-world concept drift datasets show that the classification accuracy of WIDSVM improves significantly compared to an SVM with no shifting window. The WIDSVM training phase is fast, since it does not retrain from scratch after each window shift.
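To make the weighted shifting window concrete, the sketch below shows the baseline the abstract contrasts against: an SVM retrained from scratch on each shifted window, with per-sample weights favoring recent instances. In a weighted soft-margin SVM, each sample weight s_i scales its slack penalty, so the objective penalizes C * sum_i s_i * xi_i instead of C * sum_i xi_i. This is a minimal illustration, not the paper's method: scikit-learn's SVC with its sample_weight argument stands in for the weighted SVM, and the window size, weighting scheme, and drifting toy stream are assumptions rather than the paper's experimental setup. WIDSVM's contribution is to obtain the same weighted-window model through incremental (add new vectors) and decremental (remove old vectors) updates rather than full retraining.

    # Sketch of a weighted shifting-window baseline that retrains from scratch at
    # every window shift. Window size, weights, and the drifting stream are
    # illustrative assumptions, not the paper's exact configuration.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)

    def stream(n_chunks=20, chunk=50):
        """Toy binary stream whose class centroids drift over time (simulated concept drift)."""
        for t in range(n_chunks):
            shift = 0.15 * t                     # gradual drift of both class centroids
            X0 = rng.normal([0.0 + shift, 0.0], 0.5, size=(chunk // 2, 2))
            X1 = rng.normal([2.0 + shift, 2.0], 0.5, size=(chunk // 2, 2))
            X = np.vstack([X0, X1])
            y = np.array([0] * (chunk // 2) + [1] * (chunk // 2))
            yield X, y

    window_X, window_y = [], []
    window_size = 200                            # assumed window length (in samples)

    for X_new, y_new in stream():
        # Shift the window: append the newest chunk, drop the oldest samples.
        window_X.extend(X_new); window_y.extend(y_new)
        window_X = window_X[-window_size:]; window_y = window_y[-window_size:]

        Xw, yw = np.array(window_X), np.array(window_y)
        # Weight samples so that recent instances dominate the soft-margin objective.
        weights = np.linspace(0.1, 1.0, len(yw))

        # Baseline behavior: full retraining at every shift. WIDSVM instead updates
        # one model incrementally and decrementally across shifts.
        clf = SVC(kernel="rbf", C=1.0)
        clf.fit(Xw, yw, sample_weight=weights)
        print(f"window of {len(yw)} samples, train acc = {clf.score(Xw, yw):.3f}")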

Keywords: Concept drift; Incremental learning; Shifting window; Support Vector Machines.

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Models, Theoretical
  • Support Vector Machine*