Communication-Efficient ADMM-based Federated Learning

10/28/2021
by   Shenglong Zhou, et al.

Federated learning has advanced rapidly over the last few years but still faces many challenges: how algorithms can save communication resources, how they can reduce computational costs, and whether they converge at all. To address these issues, this paper proposes exact and inexact ADMM-based federated learning algorithms. They are not only communication-efficient but also converge linearly under very mild conditions, requiring neither convexity of the objectives nor assumptions on the data distributions. Moreover, the inexact version has low computational complexity, significantly alleviating the computational burden.
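To make the ADMM structure concrete, here is a minimal sketch of consensus-ADMM federated learning on a toy problem. This is an illustrative setup, not the paper's exact algorithm: each client holds a local least-squares loss, performs an exact local update (closed form for quadratics), and the server averages the local models and scaled dual variables once per communication round.

```python
import numpy as np

# Hypothetical federated setting: m clients, each holding a local
# least-squares loss f_i(x) = 0.5 * ||A_i x - b_i||^2.
rng = np.random.default_rng(0)
m, n, d = 5, 20, 3                     # clients, samples per client, features
A = [rng.standard_normal((n, d)) for _ in range(m)]
b = [rng.standard_normal(n) for _ in range(m)]

sigma = 1.0                            # ADMM penalty parameter (assumed value)
z = np.zeros(d)                        # global model kept by the server
x = [np.zeros(d) for _ in range(m)]    # local models
lam = [np.zeros(d) for _ in range(m)]  # dual variables

for _ in range(200):                   # communication rounds
    # Exact local update: argmin_x f_i(x) + lam_i^T x + (sigma/2)||x - z||^2,
    # which has a closed form for quadratic losses.
    for i in range(m):
        H = A[i].T @ A[i] + sigma * np.eye(d)
        x[i] = np.linalg.solve(H, A[i].T @ b[i] + sigma * z - lam[i])
    # Server aggregation: average of local models plus scaled duals.
    z = sum(x[i] + lam[i] / sigma for i in range(m)) / m
    # Dual ascent step on each client.
    for i in range(m):
        lam[i] = lam[i] + sigma * (x[i] - z)

# The consensus variable z should match the centralized solution
# obtained by solving the stacked least-squares problem directly.
A_all, b_all = np.vstack(A), np.concatenate(b)
x_star = np.linalg.lstsq(A_all, b_all, rcond=None)[0]
print(np.linalg.norm(z - x_star))
```

The inexact variant described in the abstract would replace the closed-form local solve with a cheap approximate step (e.g., one gradient step on the augmented local objective), trading a little per-round progress for much lower client-side computation.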


Related research:

* Efficient and Convergent Federated Learning (05/03/2022)
  Federated learning has shown its advances over the last few years but is...

* Robust Federated Learning Using ADMM in the Presence of Data Falsifying Byzantines (10/14/2017)
  In this paper, we consider the problem of federated (or decentralized) l...

* Federated Learning via Inexact ADMM (04/22/2022)
  One of the crucial issues in federated learning is how to develop effici...

* L-FGADMM: Layer-Wise Federated Group ADMM for Communication Efficient Decentralized Deep Learning (11/09/2019)
  This article proposes a communication-efficient decentralized deep learn...

* Exact Penalty Method for Federated Learning (08/23/2022)
  Federated learning has burgeoned recently in machine learning, giving ri...

* Accurate and Fast Federated Learning via IID and Communication-Aware Grouping (12/09/2020)
  Federated learning has emerged as a new paradigm of collaborative machin...

* Federated Stain Normalization for Computational Pathology (09/29/2022)
  Although deep federated learning has received much attention in recent y...
