Accelerating Federated Learning via Momentum Gradient Descent

by Wei Liu, et al.

Federated learning (FL) provides a communication-efficient approach to solving machine learning problems over distributed data, without sending raw data to a central server. However, existing works on FL only utilize first-order gradient descent (GD) and do not incorporate information from preceding iterations into the gradient update, which can potentially accelerate convergence. In this paper, we introduce a momentum term that depends on the previous iteration. The proposed momentum federated learning (MFL) uses momentum gradient descent (MGD) in the local update step of the FL system. We establish global convergence properties of MFL and derive an upper bound on the MFL convergence rate. By comparing the upper bounds on the MFL and FL convergence rates, we provide conditions under which MFL accelerates convergence. For different machine learning models, the convergence performance of MFL is evaluated through experiments on the MNIST dataset. Simulation results confirm that MFL is globally convergent and further reveal a significant convergence improvement over FL.
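To make the local update step concrete, the following is a minimal sketch of momentum gradient descent as a client-side update rule. The function name, hyperparameter values, and toy objective are illustrative assumptions, not the paper's exact formulation; the paper's actual algorithm and convergence analysis should be consulted for the precise update.

```python
import numpy as np

def mgd_local_update(w, v, grad_fn, lr=0.1, momentum=0.5, steps=50):
    """Illustrative client-side momentum gradient descent (MGD) update.

    Standard heavy-ball-style recursion (an assumption about the form used):
        v_{t+1} = momentum * v_t + grad_fn(w_t)
        w_{t+1} = w_t - lr * v_{t+1}

    In an MFL round, each client would run several such local steps
    before sending (w, v) back to the server for aggregation.
    """
    for _ in range(steps):
        v = momentum * v + grad_fn(w)   # accumulate gradient with momentum
        w = w - lr * v                  # descend along the momentum direction
    return w, v

# Toy example: minimize f(w) = 0.5 * ||w||^2, whose gradient is w itself.
w0 = np.array([4.0, -2.0])
v0 = np.zeros_like(w0)
w_final, v_final = mgd_local_update(w0, v0, lambda w: w)
```

With these hyperparameters the iterates spiral toward the minimizer at the origin, illustrating how the momentum term carries information from the previous iteration into each update.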


