Communication-Efficient Federated Learning with Compensated Overlap-FedAvg

12/12/2020
by Yuhao Zhou, et al.

Petabytes of data are generated each day by the emerging Internet of Things (IoT), but only a small fraction can ultimately be collected and used for Machine Learning (ML) due to concerns about data privacy leakage, which seriously retards ML's growth. To alleviate this problem, federated learning was proposed to train a model on multiple clients' combined data without sharing datasets within the cluster. Nevertheless, federated learning introduces massive communication overhead, since the data synchronized in each epoch is the same size as the model, leading to low communication efficiency. Consequently, various methods, mainly focused on reducing communication rounds and compressing transmitted data, have been proposed to cut the communication overhead of federated learning. In this paper, we propose Overlap-FedAvg, a framework that parallelizes the model training phase with the model uploading and downloading phase, so that the latter can be fully hidden by the former. Beyond vanilla FedAvg, Overlap-FedAvg further incorporates a hierarchical computing strategy, a data compensation mechanism, and the Nesterov accelerated gradients (NAG) algorithm. Moreover, Overlap-FedAvg is orthogonal to many compression methods, so they can be applied together to maximize cluster utilization. Furthermore, a theoretical analysis is provided to prove the convergence of the proposed Overlap-FedAvg framework. Extensive experiments on both conventional and recurrent tasks with multiple models and datasets also demonstrate that Overlap-FedAvg substantially accelerates the federated learning process.
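The core idea described above, hiding communication behind computation, can be illustrated with a minimal sketch. This is not the authors' implementation: the scalar "model", the simulated `communicate` transfer, the fixed-gradient `local_train`, and the `overlap_fedavg_round` driver are all hypothetical stand-ins, and the compensation step is only a schematic version of the paper's data compensation mechanism. The sketch shows one client starting the upload of round t's model in a background thread while local training for round t+1 proceeds, then folding the local progress made during the overlap back onto the (stale) global model returned by the server.

```python
import threading
import time


def communicate(model, delay=0.05):
    """Simulated upload/download; a real system would talk to the server here."""
    time.sleep(delay)          # stands in for network latency
    return model               # assume the server echoes the aggregated model


def local_train(model, steps=3, lr=0.1, grad=1.0):
    """Simulated local SGD on a scalar 'model': each step applies -lr * grad."""
    for _ in range(steps):
        model -= lr * grad
    return model


def overlap_fedavg_round(model, rounds=3):
    """One client's loop: communication of round t overlaps training of round t+1."""
    comm_thread = None
    result = {}
    snapshot = model
    for _ in range(rounds):
        if comm_thread is not None:
            comm_thread.join()              # previous round's transfer finishes here
            stale_global = result["model"]  # global model, stale by one round
            # schematic compensation: re-apply the local progress made during overlap
            model = stale_global + (model - snapshot)
        snapshot = model
        comm_thread = threading.Thread(
            target=lambda m=model: result.update(model=communicate(m)))
        comm_thread.start()                 # upload proceeds in the background...
        model = local_train(model)          # ...while local training continues
    comm_thread.join()
    return model
```

Because the transfer runs on its own thread, each round costs roughly max(train time, communication time) instead of their sum, which is the efficiency gain the framework targets; the compensation step is what keeps the overlapped updates from being silently dropped.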


research
05/05/2022

Communication-Efficient Adaptive Federated Learning

Federated learning is a machine learning training paradigm that enables ...
research
11/08/2020

Adaptive Federated Dropout: Improving Communication Efficiency and Generalization for Federated Learning

With more regulations tackling users' privacy-sensitive data protection ...
research
03/19/2021

An Experiment Study on Federated Learning Testbed

While the Internet of Things (IoT) can benefit from machine learning by ...
research
02/22/2021

Multiple Kernel-Based Online Federated Learning

Online federated learning (OFL) becomes an emerging learning framework, ...
research
09/03/2022

Suppressing Noise from Built Environment Datasets to Reduce Communication Rounds for Convergence of Federated Learning

Smart sensing provides an easier and convenient data-driven mechanism fo...
research
07/29/2022

Towards Communication-efficient Vertical Federated Learning Training via Cache-enabled Local Updates

Vertical federated learning (VFL) is an emerging paradigm that allows di...
research
02/01/2022

Recycling Model Updates in Federated Learning: Are Gradient Subspaces Low-Rank?

In this paper, we question the rationale behind propagating large number...
