
FedCM: Federated Learning with Client-level Momentum

by Jing Xu, et al.

Federated Learning is a distributed machine learning approach that enables model training without sharing raw data. In this paper, we propose a new federated learning algorithm, Federated Averaging with Client-level Momentum (FedCM), to tackle partial participation and client heterogeneity in real-world federated learning applications. FedCM aggregates global gradient information from previous communication rounds and modifies each client's gradient descent with a momentum-like term, which effectively corrects bias and improves the stability of local SGD. We provide theoretical analysis highlighting the benefits of FedCM, and extensive empirical studies demonstrating that FedCM achieves superior performance across a variety of tasks and is robust to different numbers of clients, participation rates, and levels of client heterogeneity.
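The abstract describes local steps that blend each client's gradient with a server-broadcast momentum term built from previous rounds. Below is a minimal NumPy sketch of this idea; the function names, the mixing coefficient `alpha`, and the toy quadratic clients are illustrative assumptions, not the paper's actual code or hyperparameters.

```python
import numpy as np

# Hypothetical sketch of a FedCM-style round. Each client runs local SGD
# where the descent direction mixes its own gradient with a global
# momentum-like term delta_global broadcast by the server.

def client_local_steps(x_global, delta_global, grad_fn,
                       alpha=0.1, lr=0.1, num_steps=5):
    """Local SGD with a client-level momentum correction."""
    x = x_global.copy()
    for _ in range(num_steps):
        g_local = grad_fn(x)
        # Momentum-like term: blend local gradient with the global direction.
        d = alpha * g_local + (1.0 - alpha) * delta_global
        x = x - lr * d
    return x

def server_round(x_global, delta_global, client_grad_fns,
                 alpha=0.1, lr=0.1, num_steps=5):
    """One communication round: participating clients run corrected local
    SGD; the server averages their updates and refreshes the momentum term."""
    updates = []
    for grad_fn in client_grad_fns:
        x_i = client_local_steps(x_global, delta_global, grad_fn,
                                 alpha, lr, num_steps)
        updates.append(x_global - x_i)  # client's total displacement
    mean_update = np.mean(updates, axis=0)
    # Normalize the averaged displacement into a gradient-scale direction
    # for the next round's momentum term.
    new_delta = mean_update / (lr * num_steps)
    return x_global - mean_update, new_delta
```

As a toy usage, two clients with quadratic losses minimized at 1 and 3 drive the global model toward their shared optimum at 2, even though each client alone would drift toward its own minimum.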



