Scaling up minimum enclosing ball with total soft margin for training on large datasets

Neural Netw. 2012 Dec:36:120-8. doi: 10.1016/j.neunet.2012.09.013. Epub 2012 Oct 3.

Abstract

Recent research indicates that the standard Minimum Enclosing Ball (MEB) or the center-constrained MEB can be used for effective training on large datasets by employing the core vector machine (CVM) or the generalized CVM (GCVM). However, for another widely used variant, the MEB with total soft margin (T-MEB for short), the CVM or GCVM cannot be applied directly to achieve fast training on large datasets, because the inequality constraint involved is violated. In this paper, a fast learning algorithm called FL-TMEB for scaling up T-MEB is presented. First, FL-TMEB slightly relaxes the constraints in T-MEB so that it becomes equivalent to a corresponding center-constrained MEB, which can be solved with its Core Set (CS) by the CVM. Then, with the help of the sub-optimal solution theorem for T-MEB, FL-TMEB obtains an Extended Core Set (ECS) by adding the neighbors of selected CS samples to the ECS. Finally, FL-TMEB takes the optimal weights of the ECS as the approximate solution of T-MEB. Experimental results on the UCI and USPS datasets demonstrate that the proposed method is effective.
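To illustrate the core-set-then-extend idea described above, the following minimal Python sketch approximates an MEB with the standard Badoiu-Clarkson core-set iteration (used here as a stand-in for the CVM solver; the abstract does not give implementation details) and then builds an extended core set by adding nearest neighbors of the core-set samples. The function names, the epsilon parameter, and the neighbor count k are hypothetical choices for illustration, not the paper's actual algorithm.

import numpy as np

def core_set_meb(X, epsilon=0.1):
    # (1 + epsilon)-approximate MEB of the rows of X via the
    # Badoiu-Clarkson core-set iteration (stand-in for the CVM step).
    c = X[0].copy()
    core = [0]
    n_iter = int(np.ceil(1.0 / epsilon ** 2))
    for i in range(1, n_iter + 1):
        # the point furthest from the current center joins the core set
        d = np.linalg.norm(X - c, axis=1)
        far = int(np.argmax(d))
        core.append(far)
        # move the center a 1/(i+1) step toward that furthest point
        c = c + (X[far] - c) / (i + 1)
    r = np.max(np.linalg.norm(X[core] - c, axis=1))
    return c, r, sorted(set(core))

def extend_core_set(X, core, k=5):
    # Extended core set (ECS): add the k nearest neighbors of each
    # core-set sample, mirroring the neighbor-inclusion step in FL-TMEB.
    ecs = set(core)
    for i in core:
        d = np.linalg.norm(X - X[i], axis=1)
        ecs.update(int(j) for j in np.argsort(d)[:k + 1])
    return sorted(ecs)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(5000, 10))
    c, r, core = core_set_meb(X, epsilon=0.1)
    ecs = extend_core_set(X, core, k=5)
    print(len(core), len(ecs), r)

In this sketch the final weight computation over the ECS is omitted; in the paper that step would be the T-MEB optimization restricted to the ECS, whose optimal weights serve as the approximate solution.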

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Classification
  • Image Processing, Computer-Assisted*
  • Support Vector Machine*