NESVM: a Fast Gradient Method for Support Vector Machines

08/24/2010
by   Tianyi Zhou, et al.

Support vector machines (SVMs) are invaluable tools for many practical applications in artificial intelligence, e.g., classification and event recognition. However, popular SVM solvers are not sufficiently efficient for applications with a large number of samples as well as a large number of features. Thus, in this paper we present NESVM, a fast gradient SVM solver that can optimize various SVM models, e.g., classical SVM, linear programming SVM and least square SVM. Compared against SVM-Perf (whose convergence rate in solving the dual SVM is upper bounded by O(1/√k), where k is the number of iterations) and Pegasos (an online SVM solver that converges at rate O(1/k) for the primal SVM), NESVM achieves the optimal convergence rate O(1/k^2) with linear time complexity per iteration. In particular, NESVM smoothes the non-differentiable hinge loss and ℓ_1-norm in the primal SVM. Then the optimal gradient method, without any line search, is adopted to solve the optimization. In each iteration round, the current gradient and historical gradients are combined to determine the descent direction, while the Lipschitz constant determines the step size. Only two matrix-vector multiplications are required in each iteration round. Therefore, NESVM is more efficient than existing SVM solvers. In addition, NESVM supports both linear and nonlinear kernels. We also propose "homotopy NESVM", which accelerates NESVM by dynamically decreasing the smooth parameter via a continuation method. Our experiments on census income categorization, indoor/outdoor scene classification, event recognition and scene recognition demonstrate the efficiency and the effectiveness of NESVM. The MATLAB code of NESVM will be available on our website for further assessment.
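The scheme described above can be illustrated with a minimal sketch: Nesterov smoothing turns the hinge loss into a differentiable surrogate, and an accelerated gradient step with fixed step size 1/L (no line search) combines the current iterate with the previous one. This is a hypothetical simplification written in Python/NumPy rather than the authors' MATLAB code; the function names, the omission of the bias term, and the fixed smooth parameter `mu` (no homotopy/continuation) are all assumptions for illustration.

```python
import numpy as np

def smoothed_hinge_grad(w, X, y, mu):
    """Gradient of a Nesterov-smoothed (Huber-type) hinge loss, averaged over samples."""
    margins = 1.0 - y * (X @ w)            # u_i = 1 - y_i <w, x_i>   (matrix-vector product 1)
    alpha = np.clip(margins / mu, 0.0, 1.0)  # piecewise-linear derivative, clipped to [0, 1]
    return -(X.T @ (alpha * y)) / len(y)     # (matrix-vector product 2)

def nesvm_sketch(X, y, C=1.0, mu=0.1, iters=200):
    """Accelerated gradient descent on a smoothed primal SVM (simplified sketch).

    Minimizes 0.5*||w||^2 + C * mean(smoothed hinge). The step size is the
    constant 1/L, where L bounds the gradient's Lipschitz constant, so no
    line search is needed; only two matrix-vector products occur per iteration.
    """
    n, d = X.shape
    L = 1.0 + C * np.linalg.norm(X, 2) ** 2 / (mu * n)  # Lipschitz bound for the smoothed objective
    w = np.zeros(d)
    v = w.copy()                      # extrapolation point mixing current and past iterates
    for k in range(iters):
        g = v + C * smoothed_hinge_grad(v, X, y, mu)
        w_next = v - g / L            # gradient step with fixed step size 1/L
        v = w_next + (k / (k + 3.0)) * (w_next - w)  # Nesterov momentum (historical gradients)
        w = w_next
    return w
```

On a tiny linearly separable toy set, e.g. `X = [[2,1],[1,2],[-2,-1],[-1,-2]]` with labels `[1,1,-1,-1]`, a few hundred iterations suffice for `sign(X @ w)` to match the labels. Decreasing `mu` over the run, as in homotopy NESVM, would tighten the smooth surrogate toward the true hinge loss.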

Related research

12/05/2018 — GADGET SVM: A Gossip-bAseD sub-GradiEnT Solver for Linear SVMs
In the era of big data, an important weapon in a machine learning resear...

07/21/2020 — A Semismooth-Newton's-Method-Based Linearization and Approximation Approach for Kernel Support Vector Machines
Support Vector Machines (SVMs) are among the most popular and the best p...

11/02/2011 — Approximate Stochastic Subgradient Estimation Training for Support Vector Machines
Subgradient algorithms for training support vector machines have been qu...

07/19/2012 — Block-Coordinate Frank-Wolfe Optimization for Structural SVMs
We propose a randomized block-coordinate variant of the classic Frank-Wo...

06/26/2018 — Dual SVM Training on a Budget
We present a dual subspace ascent algorithm for support vector machine t...

07/18/2023 — Enhancing Pattern Classification in Support Vector Machines through Matrix Formulation
Support Vector Machines (SVM) have gathered significant acclaim as class...

04/03/2013 — A Novel Frank-Wolfe Algorithm. Analysis and Applications to Large-Scale SVM Training
Recently, there has been a renewed interest in the machine learning comm...
