Momentum Benefits Non-IID Federated Learning Simply and Provably

06/28/2023
by Ziheng Cheng, et al.

Federated learning is a powerful paradigm for large-scale machine learning, but it faces significant challenges due to unreliable network connections, slow communication, and substantial data heterogeneity across clients. FedAvg and SCAFFOLD are two fundamental algorithms for addressing these challenges. In particular, FedAvg employs multiple local updates before communicating with a central server, while SCAFFOLD maintains a control variate on each client to compensate for "client drift" in its local updates. Various methods have been proposed in the literature to enhance the convergence of these two algorithms, but they either make impractical adjustments to the algorithmic structure or rely on the assumption of bounded data heterogeneity. This paper explores the use of momentum to enhance the performance of FedAvg and SCAFFOLD. When all clients participate in the training process, we demonstrate that incorporating momentum allows FedAvg to converge without relying on the assumption of bounded data heterogeneity, even with a constant local learning rate. This is a novel result, since existing analyses of FedAvg require bounded data heterogeneity even with diminishing local learning rates. In the case of partial client participation, we show that momentum enables SCAFFOLD to converge provably faster without imposing any additional assumptions. Furthermore, we use momentum to develop new variance-reduced extensions of FedAvg and SCAFFOLD, which exhibit state-of-the-art convergence rates. Our experimental results support all theoretical findings.
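The core mechanism the abstract describes is easy to sketch: each client runs several momentum-corrected SGD steps starting from the current global model, and the server averages the resulting updates. Below is a minimal, self-contained Python sketch on a toy quadratic problem. The toy client objectives, the hyperparameters (local_steps, lr, beta), and the way the server refreshes the shared momentum buffer are illustrative assumptions for exposition, not the paper's exact update rules.

```python
# Minimal sketch of FedAvg with a shared momentum buffer on a toy
# quadratic problem. All names and hyperparameters here are illustrative
# assumptions, not the authors' code.
import numpy as np

rng = np.random.default_rng(0)

# Eight heterogeneous clients: client i minimizes f_i(x) = 0.5 * ||x - a_i||^2,
# so each client's local optimum a_i differs (non-IID in spirit).
client_targets = [rng.normal(scale=5.0, size=10) for _ in range(8)]

def local_grad(x, a, noise=0.1):
    """Stochastic gradient of 0.5 * ||x - a||^2 with Gaussian noise."""
    return (x - a) + noise * rng.normal(size=x.shape)

def fedavg_momentum(rounds=200, local_steps=5, lr=0.05, beta=0.9):
    x = np.zeros(10)  # global model
    m = np.zeros(10)  # shared momentum buffer, broadcast to clients each round
    for _ in range(rounds):
        deltas = []
        for a in client_targets:
            y, v = x.copy(), m.copy()  # start from global model and momentum
            for _ in range(local_steps):
                v = beta * v + (1 - beta) * local_grad(y, a)
                y -= lr * v            # momentum-corrected local step
            deltas.append(y - x)
        update = np.mean(deltas, axis=0)  # server averages the client updates
        # Treat the (normalized) averaged update as a gradient surrogate
        # when refreshing the shared momentum buffer (assumed heuristic).
        m = beta * m + (1 - beta) * (-update / (lr * local_steps))
        x = x + update
    return x

# The averaged objective is minimized at the mean of the client targets.
x_star = np.mean(client_targets, axis=0)
print("distance to optimum:", np.linalg.norm(fedavg_momentum() - x_star))
```

A SCAFFOLD-style variant would additionally carry a per-client control variate c_i and correct each local gradient with (c - c_i) before the momentum update; roughly speaking, that correction is what the paper shows momentum accelerates in the partial-participation setting.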


Related research

12/07/2020 · Improved Convergence Rates for Non-Convex Federated Learning with Compression
Federated learning is a new distributed learning paradigm that enables e...

06/21/2021 · FedCM: Federated Learning with Client-level Momentum
Federated Learning is a distributed machine learning approach which enab...

07/15/2020 · FetchSGD: Communication-Efficient Federated Learning with Sketching
Existing approaches to federated learning suffer from a communication bo...

08/08/2020 · Mime: Mimicking Centralized Stochastic Algorithms in Federated Learning
Federated learning is a challenging optimization problem due to the hete...

06/02/2023 · Federated Multi-Sequence Stochastic Approximation with Local Hypergradient Estimation
Stochastic approximation with multiple coupled sequences (MSA) has found...

02/08/2021 · Double Momentum SGD for Federated Learning
Communication efficiency is crucial in federated learning. Conducting ma...

12/05/2022 · Partial Variance Reduction improves Non-Convex Federated learning on heterogeneous data
Data heterogeneity across clients is a key challenge in federated learni...
