A Generic Online Parallel Learning Framework for Large Margin Models

03/02/2017
by   Shuming Ma, et al.

To speed up the training process, many existing systems apply parallel technology to online learning algorithms. However, most research focuses mainly on stochastic gradient descent (SGD) rather than on other algorithms. We propose a generic online parallel learning framework for large margin models, and we analyze our framework on popular large margin algorithms, including MIRA and Structured Perceptron. Our framework is lock-free and easy to implement on existing systems. Experiments show that systems using our framework achieve near-linear speedup as the number of running threads increases, with no loss in accuracy.

