Multi-task nonparallel support vector machine for classification

04/05/2022
by Zongmin Liu, et al.

Direct multi-task twin support vector machine (DMTSVM) explores the shared information between multiple correlated tasks and therefore achieves better generalization performance. However, it requires a matrix inversion operation when solving the dual problems, which is computationally expensive, and the kernel trick cannot be applied directly in the nonlinear case. To avoid these problems, this paper proposes a novel multi-task nonparallel support vector machine (MTNPSVM) covering both linear and nonlinear cases. By replacing the square loss in DMTSVM with the epsilon-insensitive loss, MTNPSVM avoids the matrix inversion operation and takes full advantage of the kernel trick. The theoretical implications of the model are further discussed. To improve computational efficiency, the alternating direction method of multipliers (ADMM) is employed to solve the dual problem; the computational complexity and convergence of the algorithm are analyzed. In addition, the properties and sensitivity of the model parameter are explored. Experimental results on fifteen benchmark datasets and twelve image datasets demonstrate the validity of MTNPSVM in comparison with state-of-the-art algorithms. Finally, MTNPSVM is applied to a real Chinese Wine dataset, which further verifies its effectiveness.
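For context on the two ingredients named in the abstract: the epsilon-insensitive loss penalizes a residual r by max(0, |r| - epsilon), ignoring errors smaller than epsilon, whereas the square loss it replaces penalizes r^2. The following is a minimal sketch, not taken from the paper, of a scaled-form ADMM loop applied to a generic box-constrained quadratic program of the kind that arises in SVM-type duals; the function name admm_box_qp, the splitting, and all parameter choices are illustrative assumptions, and the actual MTNPSVM subproblems and update formulas are those given in the paper.

import numpy as np

def admm_box_qp(Q, p, C, rho=1.0, n_iter=200, tol=1e-6):
    """Generic ADMM for  min_a 0.5*a'Qa + p'a  s.t. 0 <= a <= C.

    Splitting: f(a) = 0.5*a'Qa + p'a, g(z) = indicator of the box [0, C]^n,
    with the consensus constraint a = z and scaled dual variable u.
    """
    n = Q.shape[0]
    a = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)
    # Factorize (Q + rho*I) once; each a-update is then a pair of triangular solves.
    L = np.linalg.cholesky(Q + rho * np.eye(n))
    for _ in range(n_iter):
        rhs = -p + rho * (z - u)
        a = np.linalg.solve(L.T, np.linalg.solve(L, rhs))  # a-update
        z_old = z
        z = np.clip(a + u, 0.0, C)                         # z-update: projection onto the box
        u = u + a - z                                      # scaled dual update
        # Simple stopping rule based on primal and dual residuals.
        if np.linalg.norm(a - z) < tol and rho * np.linalg.norm(z - z_old) < tol:
            break
    return z

# Toy usage on a random positive-definite QP (p = -1 mimics an SVM-style dual objective).
rng = np.random.default_rng(0)
A = rng.standard_normal((10, 10))
alpha = admm_box_qp(A @ A.T + 1e-3 * np.eye(10), -np.ones(10), C=1.0)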

Related research

Nonlinear Kernel Support Vector Machine with 0-1 Soft Margin Loss (03/01/2022)
Recent advance on linear support vector machine with the 0-1 soft margin...

GBSVM: Granular-ball Support Vector Machine (10/06/2022)
GBSVM (Granular-ball Support Vector Machine) is an important attempt to ...

Multi-task twin support vector machine with Universum data (06/22/2022)
Multi-task learning (MTL) has emerged as a promising topic of machine le...

Solution Path Algorithm for Twin Multi-class Support Vector Machine (05/30/2020)
The twin support vector machine and its extensions have made great achie...

Splitting Method for Support Vector Machine with Lower Semi-continuous Loss (08/26/2022)
In this paper, we study the splitting method based on alternating direct...

Random Machines: A bagged-weighted support vector model with free kernel choice (11/21/2019)
Improvement of statistical learning models in order to increase efficien...

A Hardware-Efficient ADMM-Based SVM Training Algorithm for Edge Computing (07/23/2019)
This work demonstrates a hardware-efficient support vector machine (SVM)...