FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data

05/22/2020
by Xinwei Zhang, et al.

Federated Learning (FL) has become a popular paradigm for learning from distributed data. To effectively utilize data at different devices without moving them to the cloud, algorithms such as Federated Averaging (FedAvg) have adopted a "computation then aggregation" (CTA) model, in which multiple local updates are performed using local data before the local models are sent to the cloud for aggregation. However, these schemes typically require strong assumptions, such as that the local data are independent and identically distributed (i.i.d.) or that the norms of the local gradients are bounded. In this paper, we first explicitly characterize the behavior of the FedAvg algorithm and show that, without strong and unrealistic assumptions on the problem structure, it can behave erratically on non-convex problems (e.g., diverge to infinity). Aiming to design FL algorithms that are provably fast and require as few assumptions as possible, we propose a new algorithm design strategy from the primal-dual optimization perspective. Our strategy yields a family of algorithms that follow the same CTA model as existing algorithms, but can handle non-convex objectives, achieve the best possible optimization and communication complexity, and accommodate both full-batch and mini-batch local computation models. Most importantly, the proposed algorithms are communication efficient, in the sense that the communication pattern can adapt to the level of heterogeneity among the local data. To the best of our knowledge, this is the first algorithmic framework for FL that achieves all of the above properties.
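The abstract does not spell out the primal-dual updates themselves, so the sketch below shows one standard way such a CTA scheme can be realized: rewrite the global problem as the consensus-constrained problem min over {x_i} and x_0 of (1/N) sum_i f_i(x_i) subject to x_i = x_0, and run ADMM-style updates on its augmented Lagrangian. Every name and hyperparameter here (local_update, aggregate, eta, lr, steps, the toy quadratic losses) is an illustrative assumption, not the paper's exact FedPD algorithm.

```python
import numpy as np

def local_update(grad_f, x0, x, lam, eta=0.1, lr=0.05, steps=20):
    """One client's "computation" phase in an ADMM-style primal-dual scheme.

    Approximately minimizes the augmented local objective
        f_i(x) + <lam, x - x0> + (1 / (2 * eta)) * ||x - x0||^2
    with a few gradient steps, then performs dual ascent on the
    consensus constraint x = x0.
    """
    x = x.copy()
    for _ in range(steps):
        g = grad_f(x) + lam + (x - x0) / eta  # gradient of the augmented objective
        x = x - lr * g
    lam = lam + (x - x0) / eta  # dual update: accumulate the consensus residual
    return x, lam

def aggregate(messages, eta=0.1):
    """The "aggregation" phase: average each client's primal-plus-scaled-dual message."""
    return np.mean([x + eta * lam for x, lam in messages], axis=0)

# Toy run: two clients with heterogeneous quadratics f_i(x) = ||x - c_i||^2 / 2,
# whose average is minimized at the mean of the centers, [0.0, 1.0].
centers = [np.array([1.0, 0.0]), np.array([-1.0, 2.0])]
x0 = np.zeros(2)
states = [(np.zeros(2), np.zeros(2)) for _ in centers]
for _ in range(200):  # communication rounds
    states = [local_update(lambda x, c=c: x - c, x0, x, lam)
              for (x, lam), c in zip(states, centers)]
    x0 = aggregate(states)
print(np.round(x0, 3))  # ~[0. 1.]
```

At a fixed point of these updates each dual variable satisfies lam_i = -grad f_i(x_0), so the duals absorb the inter-client gradient differences rather than requiring them to be bounded by assumption; when the local data are nearly homogeneous, the consensus residuals stay small and communication can be spent more sparingly, which is one plausible reading of the abstract's claim that the communication pattern adapts to heterogeneity.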

Related research

05/16/2019 · Edge-Assisted Hierarchical Federated Learning with Non-IID Data
Federated Learning (FL) is capable of leveraging massively distributed p...

06/29/2021 · Achieving Statistical Optimality of Federated Learning: Beyond Stationary Points
Federated Learning (FL) is a promising framework that has great potentia...

06/22/2022 · FedBC: Calibrating Global and Local Models via Federated Learning Beyond Consensus
In federated learning (FL), the objective of collaboratively learning a ...

01/14/2020 · Distributed Learning in the Non-Convex World: From Batch to Streaming Data, and Beyond
Distributed learning has become a critical enabler of the massively conn...

07/17/2022 · Fast Composite Optimization and Statistical Recovery in Federated Learning
As a prevalent distributed learning paradigm, Federated Learning (FL) tr...

06/19/2021 · STEM: A Stochastic Two-Sided Momentum Algorithm Achieving Near-Optimal Sample and Communication Complexities for Federated Learning
Federated Learning (FL) refers to the paradigm where multiple worker nod...

10/25/2021 · Optimization-Based GenQSGD for Federated Edge Learning
Optimal algorithm design for federated learning (FL) remains an open pro...
