FedADMM: A Robust Federated Deep Learning Framework with Adaptivity to System Heterogeneity

04/07/2022
by   Yonghai Gong, et al.

Federated Learning (FL) is an emerging framework for distributed processing of large data volumes by edge devices subject to limited communication bandwidth, heterogeneity in data distributions and computational resources, and privacy considerations. In this paper, we introduce a new FL protocol termed FedADMM, based on primal-dual optimization. The proposed method leverages dual variables to tackle statistical heterogeneity, and accommodates system heterogeneity by tolerating variable amounts of work performed by clients. FedADMM maintains identical communication costs per round as FedAvg/Prox, and generalizes them via the augmented Lagrangian. A convergence proof is established for nonconvex objectives, with no restrictions on data dissimilarity or on the number of participants per round. We demonstrate the merits of the method through extensive experiments on real datasets, under both IID and non-IID data distributions across clients. FedADMM consistently outperforms all baseline methods in terms of communication efficiency, reducing the number of rounds needed to reach a prescribed accuracy by up to 87%. The algorithm effectively adapts to heterogeneous data distributions through the use of dual variables, without the need for hyperparameter tuning, and its advantages are more pronounced in large-scale systems.
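To make the primal-dual mechanics concrete, below is a minimal sketch of one communication round of a consensus-ADMM scheme in the spirit described above: each client approximately minimizes its local augmented Lagrangian, the server averages the (dual-corrected) models, and the dual variables absorb each client's persistent disagreement with the global model. This is an illustrative toy, not the paper's implementation: the function names (`local_update`, `fedadmm_round`), the quadratic local losses, and the gradient-descent inner solver are all assumptions made here for a self-contained demo.

```python
import numpy as np

def local_update(A, b, z, y, rho, steps=100, lr=0.01):
    """Approximately solve the client's augmented-Lagrangian subproblem
        min_w 0.5 * ||A w - b||^2 + y^T (w - z) + (rho / 2) * ||w - z||^2
    by plain gradient descent. The number of `steps` may vary per client,
    mimicking tolerance to variable amounts of local work.
    (Quadratic loss and GD solver are assumptions for this toy example.)"""
    w = z.copy()
    for _ in range(steps):
        grad = A.T @ (A @ w - b) + y + rho * (w - z)
        w -= lr * grad
    return w

def fedadmm_round(clients, z, rho):
    """One communication round of a consensus-ADMM scheme:
    client primal updates, server averaging, then dual ascent."""
    for c in clients:
        c["w"] = local_update(c["A"], c["b"], z, c["y"], rho)
    # Server aggregates w_i + y_i / rho (scaled-dual consensus step).
    # Per-round communication matches FedAvg: one model-sized vector each way.
    z = np.mean([c["w"] + c["y"] / rho for c in clients], axis=0)
    # Dual update: y_i tracks each client's residual disagreement with z,
    # which is how statistical heterogeneity is carried across rounds.
    for c in clients:
        c["y"] += rho * (c["w"] - z)
    return z
```

On a synthetic least-squares problem split across a few clients, iterating `fedadmm_round` drives the global model `z` toward the minimizer of the sum of the local objectives, even though each client only ever solves its own subproblem inexactly.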

