Communication-Efficient Distributed Online Learning with Kernels

11/28/2019
by Michael Kamp, et al.

We propose an efficient distributed online learning protocol for low-latency real-time services. It extends a previously presented protocol to kernelized online learners that represent their models by a support vector expansion. While such learners often achieve higher predictive performance than their linear counterparts, communicating the support vector expansions becomes inefficient for large numbers of support vectors. The proposed extension allows for a larger class of online learning algorithms, including those that alleviate this inefficiency through model compression. In addition, we characterize the quality of the proposed protocol by introducing a novel criterion that requires the communication to be bounded by the loss suffered.
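To make the setting concrete, below is a minimal sketch of a kernelized online learner whose model is a support vector expansion f(x) = sum_i alpha_i k(x_i, x), paired with a loss-triggered synchronization check so that a node only transmits its expansion after it has accumulated enough loss. The RBF kernel, the hinge-loss update, the fixed support-vector budget standing in for model compression, and the threshold rule (including names such as `maybe_synchronize`) are all illustrative assumptions, not the paper's actual protocol.

```python
import numpy as np

# Illustrative sketch only: the kernel, update rule, budget-based
# compression, and synchronization trigger are assumptions, not the
# protocol from the paper.

def rbf_kernel(a, b, gamma=0.5):
    """Gaussian (RBF) kernel between two feature vectors."""
    return np.exp(-gamma * np.sum((a - b) ** 2))

class KernelOnlineLearner:
    """Online learner whose model is a support vector expansion."""

    def __init__(self, eta=0.1, budget=100):
        self.eta = eta              # learning rate
        self.budget = budget        # max number of stored support vectors
        self.alphas = []            # expansion coefficients alpha_i
        self.support_vectors = []   # stored examples x_i

    def predict(self, x):
        # f(x) = sum_i alpha_i * k(x_i, x)
        return sum(a * rbf_kernel(sv, x)
                   for a, sv in zip(self.alphas, self.support_vectors))

    def update(self, x, y):
        """Hinge-loss driven update; adds x as a support vector on a loss."""
        loss = max(0.0, 1.0 - y * self.predict(x))
        if loss > 0.0:
            self.alphas.append(self.eta * y)
            self.support_vectors.append(x)
            # crude model compression: drop the oldest support vector
            # once the budget is exceeded, keeping the expansion small
            if len(self.support_vectors) > self.budget:
                self.alphas.pop(0)
                self.support_vectors.pop(0)
        return loss

def maybe_synchronize(node, accumulated_loss, threshold=5.0, send_fn=None):
    """Communicate the expansion only after sufficient loss was suffered,
    so the amount of communication stays tied to the loss."""
    if accumulated_loss > threshold and send_fn is not None:
        send_fn(node.alphas, node.support_vectors)  # transmit the expansion
        return 0.0                                  # reset the loss counter
    return accumulated_loss
```

Capping the number of stored support vectors keeps the size of each transmitted expansion constant, which is one simple way to address the communication bottleneck the abstract points to for learners with many support vectors.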


Related research:

- Adaptive Communication Bounds for Distributed Online Learning (11/28/2019): We consider distributed online learning protocols that control the excha...
- Slow Learners are Fast (11/03/2009): Online learning algorithms have impressive convergence properties when i...
- Approximate Stochastic Subgradient Estimation Training for Support Vector Machines (11/02/2011): Subgradient algorithms for training support vector machines have been qu...
- Approximation Vector Machines for Large-scale Online Learning (04/22/2016): One of the most challenging problems in kernel online learning is to bou...
- History-Restricted Online Learning (05/28/2022): We introduce the concept of history-restricted no-regret online learning...
- Distributed Online Learning with Multiple Kernels (11/17/2020): In the Internet-of-Things (IoT) systems, there are plenty of informative...
- Efficient Decentralized Deep Learning by Dynamic Model Averaging (07/09/2018): We propose an efficient protocol for decentralized training of deep neur...
