OFedQIT: Communication-Efficient Online Federated Learning via Quantization and Intermittent Transmission

05/13/2022
by   Jonghwan Park, et al.

Online federated learning (OFL) is a promising framework for collaboratively learning a sequence of non-linear functions (or models) from distributed streaming data arriving at multiple clients while preserving the privacy of their local data. In this framework, we first construct a vanilla method (named OFedAvg) by incorporating online gradient descent (OGD) into the de facto aggregation method (named FedAvg). Despite its optimal asymptotic performance, OFedAvg suffers from heavy communication overhead and long learning delay. To tackle these shortcomings, we propose a communication-efficient OFL algorithm (named OFedQIT) based on stochastic quantization and intermittent transmission. Our major contribution is to prove theoretically that OFedQIT over T time slots can achieve an optimal sublinear regret bound 𝒪(√(T)) for any real data (including non-IID data) while significantly reducing the communication overhead. Furthermore, this optimality is still guaranteed even when only a small fraction of clients (those with faster processing times and high-quality communication channels) in the network participate at a time. Our analysis reveals that OFedQIT successfully addresses the drawbacks of OFedAvg while maintaining superior learning accuracy. Experiments with real datasets demonstrate the effectiveness of our algorithm on various online classification and regression tasks.
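To illustrate the first of the two communication-saving ingredients, here is a minimal sketch of an unbiased stochastic quantizer of the kind commonly used in communication-efficient federated learning. This is an assumption-laden illustration, not the paper's exact quantizer: the function name, the per-vector normalization, and the number of quantization levels are choices made here for clarity. The key property it demonstrates is unbiasedness (E[Q(x)] = x), which is what lets gradient-based analyses tolerate the quantization noise.

```python
import numpy as np

def stochastic_quantize(x, levels=16):
    """Unbiased uniform stochastic quantization of a vector.

    Each coordinate's magnitude (relative to the vector norm) is mapped
    onto a grid of `levels` points and rounded up or down at random, with
    the round-up probability chosen so that E[Q(x)] = x. Only the norm,
    the signs, and the integer grid indices would need to be transmitted.

    Hypothetical illustration: the normalization and level count are
    assumptions, not the quantizer specified in the OFedQIT paper.
    """
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return x.copy()
    scaled = np.abs(x) / norm * (levels - 1)        # map into [0, levels-1]
    lower = np.floor(scaled)
    prob_up = scaled - lower                        # P(round up) keeps E[Q(x)] = x
    rounded = lower + (np.random.rand(*x.shape) < prob_up)
    return np.sign(x) * rounded / (levels - 1) * norm
```

The second ingredient, intermittent transmission, would then amount to each client uploading its (quantized) local update only every few time slots instead of every slot, which multiplies the communication savings on top of the quantization.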

