Nonlinear Kernel Support Vector Machine with 0-1 Soft Margin Loss

03/01/2022
by Ju Liu, et al.

Recent advances on the linear support vector machine with 0-1 soft margin loss (L_0/1-SVM) show that the 0-1 loss problem can be solved directly. However, its theoretical and algorithmic requirements prevent the linear solving framework from being extended directly to the nonlinear kernel setting; one major obstacle is the absence of an explicit expression for the Lagrangian dual function of L_0/1-SVM. In this paper, by applying the nonparametric representer theorem, we propose a nonlinear model for the support vector machine with 0-1 soft margin loss, called L_0/1-KSVM, which elegantly incorporates the kernel technique and, more importantly, inherits the success of systematically solving the linear task. Its optimality conditions are explored theoretically, and a working-set-selection alternating direction method of multipliers (ADMM) algorithm is introduced to obtain its numerical solution. Moreover, we present the first closed-form definition of the support vectors (SVs) of L_0/1-KSVM. Theoretically, we prove that all SVs of L_0/1-KSVM lie only on the two parallel decision surfaces. The experiments also show that L_0/1-KSVM has far fewer SVs while retaining decent prediction accuracy, compared with its linear peer L_0/1-SVM and six other nonlinear benchmark SVM classifiers.
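To make the setup concrete, the sketch below illustrates the general idea described in the abstract: the decision function is kernelized through the representer theorem, the 0-1 soft margin loss is kept as-is, and an ADMM-style splitting handles the nonconvex loss through its closed-form proximal (hard-thresholding) step. This is a minimal, illustrative toy version under several assumptions (bias absorbed into the kernel, a plain ADMM loop with a fixed penalty, no working set selection); it is not the authors' L_0/1-KSVM algorithm.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=0.5):
    # Gram matrix of the Gaussian (RBF) kernel.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def l01_kernel_svm_admm(X, y, C=1.0, rho=1.0, gamma=0.5, n_iter=200):
    """Toy ADMM sketch for a kernelized 0-1 soft-margin SVM (illustrative only).

    Assumed model (bias absorbed into the kernel): f(x) = sum_i alpha_i * (k(x_i, x) + 1).
    Objective: 0.5 * alpha^T K alpha + C * #{i : 1 - y_i f(x_i) > 0}.
    Splitting variable u = 1 - y * (K alpha); the u-update is the componentwise
    proximal operator of the 0-1 loss, which reduces to hard thresholding.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma) + 1.0          # +1 absorbs the bias term
    D = y.astype(float)                        # labels assumed to be in {-1, +1}
    alpha = np.zeros(n)
    u = np.zeros(n)
    lam = np.zeros(n)                          # dual variable of u + y*(K alpha) = 1
    A_inv = np.linalg.inv(np.eye(n) + rho * K) # reused in every alpha-step
    thr = np.sqrt(2.0 * C / rho)               # threshold of the 0-1 loss prox
    for _ in range(n_iter):
        # alpha-step: quadratic subproblem (I + rho*K) alpha = rho * y * z
        z = 1.0 - u - lam / rho
        alpha = rho * A_inv @ (D * z)
        # u-step: componentwise prox of C * 1{u > 0}: keep v, or snap to 0
        v = 1.0 - D * (K @ alpha) - lam / rho
        u = np.where(v <= 0, v, np.where(v > thr, v, 0.0))
        # dual ascent on the splitting constraint
        lam = lam + rho * (u + D * (K @ alpha) - 1.0)
    return alpha

# Tiny usage example on synthetic two-class data.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.4, (20, 2)), rng.normal(1, 0.4, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])
alpha = l01_kernel_svm_admm(X, y)
pred = np.sign((rbf_kernel(X, X) + 1.0) @ alpha)
print("training accuracy:", (pred == y).mean())
```

In this sketch, the appeal of the 0-1 loss is visible in the u-step: examples whose violation exceeds the threshold keep their slack (and pay a flat cost C), so outliers do not pull on the decision surface the way they do under the hinge loss, which is consistent with the paper's observation that the resulting classifier needs far fewer support vectors.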
