GADGET SVM: A Gossip-bAseD sub-GradiEnT Solver for Linear SVMs

12/05/2018
by Haimonti Dutta, et al.

In the era of big data, an important weapon in a machine learning researcher's arsenal is a scalable Support Vector Machine (SVM) algorithm. SVMs are extensively used for solving classification problems, but traditional training algorithms often scale super-linearly with training set size, which quickly becomes infeasible for large data sets. In recent years, scalable algorithms have been designed by studying the primal or dual formulations of the problem; such formulations often suggest a way to decompose the problem and thereby facilitate the development of distributed algorithms. In this paper, we present Gossip-bAseD sub-GradiEnT (GADGET) SVM, a distributed algorithm for learning linear Support Vector Machines in the primal form for binary classification. The algorithm is designed to be executed locally on the nodes of a distributed system: each node processes its local, homogeneously partitioned data and learns a primal SVM model, then gossips with randomly chosen neighbors about the classifier learnt and uses this information to update its model. Extensive theoretical and empirical results suggest that this anytime algorithm has performance comparable to its centralized and online counterparts.
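To make the node-level procedure concrete, the following is a minimal sketch of the kind of computation the abstract describes: each simulated node takes a sub-gradient step on its local regularized hinge loss and then averages its weight vector with a random neighbor. The local update shown is a Pegasos-style projected sub-gradient step and the gossip step is plain pairwise averaging; both are assumptions for illustration and need not match the exact protocol or analysis in the paper.

```python
import numpy as np

# Toy simulation of a gossip-based sub-gradient solver for a linear SVM.
# Illustrative only: Pegasos-style local step + pairwise-averaging gossip.
rng = np.random.default_rng(0)

# Synthetic, homogeneously partitioned binary classification data.
n_nodes, n_per_node, dim, lam = 4, 200, 5, 0.01
true_w = rng.normal(size=dim)
X = [rng.normal(size=(n_per_node, dim)) for _ in range(n_nodes)]
y = [np.sign(x @ true_w) for x in X]

w = [np.zeros(dim) for _ in range(n_nodes)]   # one local model per node

T = 500
for t in range(1, T + 1):
    eta = 1.0 / (lam * t)                     # decaying step size
    for k in range(n_nodes):
        # Sub-gradient of the regularized hinge loss on node k's local data.
        margins = y[k] * (X[k] @ w[k])
        viol = margins < 1
        grad = lam * w[k]
        if viol.any():
            grad = grad - (y[k][viol, None] * X[k][viol]).mean(axis=0)
        w[k] = w[k] - eta * grad
        # Projection onto the ball of radius 1/sqrt(lam), as in Pegasos.
        norm = np.linalg.norm(w[k])
        if norm > 0:
            w[k] *= min(1.0, 1.0 / (np.sqrt(lam) * norm))
    # Gossip: each node averages its model with one randomly chosen neighbor.
    for k in range(n_nodes):
        j = rng.choice([i for i in range(n_nodes) if i != k])
        avg = 0.5 * (w[k] + w[j])
        w[k], w[j] = avg.copy(), avg.copy()

# After enough rounds the local models agree and classify the local data well.
acc = [np.mean(np.sign(X[k] @ w[k]) == y[k]) for k in range(n_nodes)]
print("per-node training accuracy:", np.round(acc, 3))
```

In this sketch the gossip step keeps the node-wise average of the weight vectors unchanged while driving the local models toward consensus, which is the intuition behind combining local sub-gradient steps with gossip averaging.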


Related research

04/13/2014 · Generalized version of the support vector machine for binary classification problems: supporting hyperplane machine
In this paper there is proposed a generalized version of the SVM for bin...

08/24/2010 · NESVM: a Fast Gradient Method for Support Vector Machines
Support vector machines (SVMs) are invaluable tools for many practical a...

02/15/2020 · On Coresets for Support Vector Machines
We present an efficient coreset construction algorithm for large-scale S...

04/01/2021 · Distributed support-vector-machine over dynamic balanced directed networks
In this paper, we consider the binary classification problem via distrib...

05/11/2020 · A Relational Gradient Descent Algorithm For Support Vector Machine Training
We consider gradient descent like algorithms for Support Vector Machine ...

02/15/2021 · A generalized quadratic loss for SVM and Deep Neural Networks
We consider some supervised binary classification tasks and a regression...

04/23/2013 · The Stochastic Gradient Descent for the Primal L1-SVM Optimization Revisited
We reconsider the stochastic (sub)gradient approach to the unconstrained...
