Gossiped and Quantized Online Multi-Kernel Learning

01/24/2023
by   Tomas Ortega, et al.

In settings where online kernel learning must proceed with little prior information and centralized learning is infeasible, past research has shown that distributed online multi-kernel learning achieves sub-linear regret as long as every pair of nodes in the network can communicate (i.e., the communications network is a complete graph). In addition, to manage the communication load, which is often a performance bottleneck, communications between nodes can be quantized. This letter extends these results to non-fully-connected graphs, which are common in wireless sensor networks. To address this challenge, we propose a gossip algorithm and prove that it achieves sub-linear regret. Experiments with real datasets confirm our findings.
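To make the mechanism in the abstract concrete, below is a minimal Python sketch of one building block it describes: synchronous gossip averaging of quantized parameters over a non-complete graph (here a ring, as in a sensor network). The uniform quantizer, the Metropolis mixing weights, and all function names are illustrative assumptions for this sketch, not the letter's actual algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, num_bits=4, lo=-1.0, hi=1.0):
    """Uniform quantizer: snap each entry to one of 2**num_bits levels.

    Illustrative assumption; the paper may use a different quantizer.
    """
    levels = 2 ** num_bits - 1
    x_clipped = np.clip(x, lo, hi)
    steps = np.round((x_clipped - lo) / (hi - lo) * levels)
    return lo + steps * (hi - lo) / levels

def metropolis_weights(A):
    """Doubly stochastic mixing matrix supported on a connected graph."""
    n = A.shape[0]
    deg = A.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()
    return W

def gossip_round(theta, W):
    """One gossip step: each node averages quantized copies from neighbors.

    Only the quantized values cross the network, which is what keeps the
    communication load bounded.
    """
    return W @ quantize(theta)

# Ring topology on 6 nodes: connected but not a complete graph.
n = 6
A = np.zeros((n, n), dtype=int)
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1
W = metropolis_weights(A)

# Per-node local parameters (e.g., kernel-combination weights).
theta = rng.uniform(-1, 1, size=(n, 8))
for _ in range(50):
    theta = gossip_round(theta, W)
# Disagreement across nodes shrinks toward the quantization resolution.
print(theta.std(axis=0).max())
```

Because the quantizer is deterministic here, the nodes agree only up to the quantization step; in a full online learner, each gossip step would be interleaved with a local loss-driven update of the multi-kernel weights.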


Related research

08/04/2022 - QC-ODKLA: Quantized and Communication-Censored Online Decentralized Kernel Learning via Linearized ADMM
This paper focuses on online kernel learning over a decentralized networ...

07/16/2022 - Online Prediction in Sub-linear Space
We provide the first sub-linear space and sub-linear regret algorithm fo...

10/20/2020 - POND: Pessimistic-Optimistic oNline Dispatch
This paper considers constrained online dispatch with unknown arrival, r...

02/27/2018 - Online learning with kernel losses
We present a generalization of the adversarial linear bandits framework,...

02/28/2018 - RRR: Rank-Regret Representative
We propose the rank-regret representative as a way of choosing a small s...

02/09/2021 - Graph-Aided Online Multi-Kernel Learning
Multi-kernel learning (MKL) has been widely used in function approximati...
