FedLGA: Towards System-Heterogeneity of Federated Learning via Local Gradient Approximation

12/22/2021
by   Xingyu Li, et al.

Federated Learning (FL) is a decentralized machine learning architecture that leverages a large number of remote devices to learn a joint model from distributed training data. However, system heterogeneity is a major challenge for achieving robust distributed learning performance in an FL network, and it has two aspects: i) device heterogeneity, due to the diverse computational capacities of devices; and ii) data heterogeneity, due to non-identically distributed data across the network. Although there are benchmarks against heterogeneous FL, e.g., FedProx, prior studies lack a formalization of the problem, and it remains open. In this work, we formalize the system-heterogeneous FL problem and propose a new algorithm, called FedLGA, which addresses it by bridging the divergence of local model updates via gradient approximation. To achieve this, FedLGA provides an alternated Hessian estimation method, which requires only extra linear complexity on the aggregator. Theoretically, we show that with a device-heterogeneity ratio ρ, FedLGA achieves convergence rates of 𝒪((1+ρ)/√(ENT) + 1/T) and 𝒪((1+ρ)√(E)/√(TK) + 1/T) on non-i.i.d. distributed FL training data for non-convex optimization problems under full and partial device participation, respectively, where E is the number of local learning epochs, T is the total number of communication rounds, N is the total number of devices, and K is the number of devices selected in one communication round under the partial participation scheme. The results of comprehensive experiments on multiple datasets show that FedLGA outperforms current FL benchmarks under system heterogeneity.
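
To make the aggregator-side idea concrete, below is a minimal sketch of one plausible form of local gradient approximation, assuming a diagonal (secant-style) Hessian estimate built from successive global models and an extrapolation of stragglers' partial updates to the full number of local epochs. The function names, the correction formula, and the damping term are illustrative assumptions for this sketch, not the exact FedLGA update rule; the precise estimator and theory are in the full paper.

```python
import numpy as np

def estimate_diag_hessian(prev_global, curr_global, prev_grad, curr_grad, eps=1e-8):
    """Hypothetical diagonal Hessian estimate from two consecutive global
    models and their aggregated gradients (element-wise secant), used here
    as a stand-in for the paper's alternated Hessian estimation."""
    dw = curr_global - prev_global
    dg = curr_grad - prev_grad
    return dg / (dw + eps)

def aggregate_with_lga(global_model, client_updates, client_epochs,
                       target_epochs, hessian_diag, lr):
    """Aggregate client model updates on the server, extrapolating each
    straggler's partial update (fewer than target_epochs local epochs)
    via a gradient approximation damped by the curvature estimate."""
    corrected = []
    for delta, e in zip(client_updates, client_epochs):
        # Linear extrapolation of the straggler's local progress to the
        # target number of epochs (illustrative correction form).
        scale = target_epochs / max(e, 1)
        extra = (scale - 1.0) * delta / (1.0 + lr * np.abs(hessian_diag))
        corrected.append(delta + extra)
    # Standard FedAvg-style averaging of the corrected updates.
    return global_model + np.mean(corrected, axis=0)
```

Because the correction only rescales each client's update with an element-wise curvature term, the extra cost on the aggregator stays linear in the model dimension, which is the property the abstract highlights.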


