FedADC: Accelerated Federated Learning with Drift Control

12/16/2020
by Emre Ozfatura, et al.

Federated learning (FL) has become the de facto framework for collaborative learning among edge devices with privacy concerns. At the core of the FL strategy is the use of stochastic gradient descent (SGD) in a distributed manner. Large-scale implementation of FL brings new challenges, such as incorporating acceleration techniques designed for SGD into the distributed setting, and mitigating the drift problem caused by the non-homogeneous distribution of local datasets. These two problems have been studied separately in the literature; in this paper, we show that both can be addressed with a single strategy, without any major alteration to the FL framework or additional computation and communication load. To achieve this goal, we propose FedADC, an accelerated FL algorithm with drift control. We empirically illustrate the advantages of FedADC.
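
The abstract describes FedADC only at a high level. As a point of reference, the following is a minimal sketch of the standard building blocks it alludes to: local SGD at the clients plus federated averaging with a server-side momentum buffer, showing how an SGD acceleration technique can be folded into the aggregation step without changing the client-server protocol. This is not the FedADC algorithm; the function names, the toy least-squares task, and the hyper-parameters are assumptions made purely for illustration.

```python
# A minimal sketch, NOT the paper's FedADC algorithm: plain federated averaging
# with a server-side momentum buffer. All function names, the toy loss, and the
# hyper-parameters below are illustrative assumptions.
import numpy as np


def local_sgd(w, data, lr=0.01, steps=5):
    """Run a few SGD steps on one client's local data (toy least-squares loss)."""
    x, y = data
    for _ in range(steps):
        grad = 2 * x.T @ (x @ w - y) / len(y)  # gradient of the mean squared error
        w = w - lr * grad
    return w


def fedavg_server_momentum(clients, w, rounds=100, beta=0.9, server_lr=1.0):
    """Average client updates, then apply them through a momentum buffer at the server."""
    m = np.zeros_like(w)  # momentum buffer lives only at the server
    for _ in range(rounds):
        # each client refines the current global model on its own (possibly non-IID) data
        deltas = [local_sgd(w.copy(), data) - w for data in clients]
        avg_delta = np.mean(deltas, axis=0)  # standard federated averaging of model updates
        m = beta * m + avg_delta             # heavy-ball style acceleration at the server
        w = w + server_lr * m
    return w


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w_true = rng.normal(size=3)
    # simulate three clients whose local inputs are drawn from shifted distributions
    clients = []
    for shift in (-1.0, 0.0, 1.0):
        x = rng.normal(loc=shift, size=(100, 3))
        y = x @ w_true + 0.1 * rng.normal(size=100)
        clients.append((x, y))
    w = fedavg_server_momentum(clients, np.zeros(3))
    print("true weights:     ", np.round(w_true, 3))
    print("recovered weights:", np.round(w, 3))
```

The open question that the paper targets, beyond this baseline, is how to obtain such acceleration while simultaneously controlling the client drift induced by non-IID local datasets.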
