Heterogeneous Federated Learning on a Graph

09/19/2022
by Huiyuan Wang et al.

Federated learning, in which algorithms are trained across multiple decentralized devices without sharing local data, is increasingly popular in distributed machine learning practice. Typically, a graph structure G underlies the local devices and governs their communication. In this work, we consider parameter estimation in federated learning with heterogeneity in both data distribution and communication, as well as limited computational capacity on local devices. We encode the distribution heterogeneity by parametrizing the distributions on local devices with a set of distinct p-dimensional vectors. We then propose to jointly estimate the parameters of all devices under the M-estimation framework with fused Lasso regularization, which encourages equal parameter estimates on devices connected in G. We provide a general result for our estimator that depends on G and can be further calibrated to obtain convergence rates for various specific problem setups. Surprisingly, our estimator attains the optimal rate under a certain graph fidelity condition on G, as if we could aggregate all samples sharing the same distribution. If the graph fidelity condition is not met, we propose an edge selection procedure via multiple testing to ensure optimality. To ease the burden of local computation, we provide a decentralized stochastic version of ADMM with convergence rate O(T^{-1} log T), where T denotes the number of iterations. We highlight that our algorithm transmits only parameters along the edges of G at each iteration, without requiring a central machine, which preserves privacy. We further extend it to the case where devices are randomly inaccessible during training, with a similar algorithmic convergence guarantee. The computational and statistical efficiency of our method is demonstrated by simulation experiments and the 2020 US presidential election data set.
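In schematic terms, the estimator described above solves a graph-fused-Lasso-penalized M-estimation problem of the following form (a sketch reconstructed from the abstract; L_v and lambda are our notation, and the paper's exact loss and penalty may differ):

    min over (theta_1, ..., theta_m)   sum_{v=1}^m L_v(theta_v) + lambda * sum_{(u,v) in E(G)} ||theta_u - theta_v||

where L_v is the local empirical loss on device v, E(G) is the edge set of G, and the fused Lasso penalty with weight lambda > 0 pulls the estimates on connected devices toward a common value.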

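The decentralized stochastic ADMM can be illustrated with a short sketch. The Python/NumPy code below is a minimal, hypothetical rendering of the general recipe implied by the abstract (ADMM splitting of the fused penalty into per-edge variables, with each device replacing an exact theta-update by a stochastic gradient step); it is not the authors' exact algorithm, and all names, losses, and constants are illustrative. Note that the only quantity a device shares is its current parameter vector, sent along the edges of G.

import numpy as np

rng = np.random.default_rng(0)

# Toy setup: m devices on a ring graph G, two distribution clusters.
m, p, n_local = 6, 3, 50
edges = [(v, (v + 1) % m) for v in range(m)]           # edge set E(G)
theta_true = [np.ones(p) if v < m // 2 else -np.ones(p) for v in range(m)]
A = [rng.normal(size=(n_local, p)) for v in range(m)]  # local design matrices
b = [A[v] @ theta_true[v] + 0.1 * rng.normal(size=n_local) for v in range(m)]

# ADMM variables.
theta = [np.zeros(p) for _ in range(m)]  # local parameter estimates
z = {e: np.zeros(p) for e in edges}      # per-edge variable, z_e ~ theta_u - theta_v
u = {e: np.zeros(p) for e in edges}      # scaled dual variable per edge
lam, rho, T = 0.5, 1.0, 2000

def block_shrink(x, t):
    # Group soft-thresholding: proximal operator of t * ||.||_2.
    nx = np.linalg.norm(x)
    return np.zeros_like(x) if nx <= t else (1.0 - t / nx) * x

for it in range(T):
    step = 1.0 / (rho * (it + 1))        # decreasing step size
    new_theta = []
    for v in range(m):
        # Stochastic gradient of the local least-squares loss (one sampled row).
        i = rng.integers(n_local)
        g = A[v][i] * (A[v][i] @ theta[v] - b[v][i])
        # Gradient of the augmented-Lagrangian terms on edges incident to v,
        # using neighbors' parameters from the previous round (Jacobi-style).
        for (a, c) in edges:
            if v == a:
                g += rho * (theta[a] - theta[c] - z[(a, c)] + u[(a, c)])
            elif v == c:
                g -= rho * (theta[a] - theta[c] - z[(a, c)] + u[(a, c)])
        new_theta.append(theta[v] - step * g)
    theta = new_theta                    # devices exchange theta with neighbors here
    for e in edges:                      # per-edge z- and dual updates
        a, c = e
        d = theta[a] - theta[c] + u[e]
        z[e] = block_shrink(d, lam / rho)
        u[e] += theta[a] - theta[c] - z[e]

for v in range(m):
    print(v, np.round(theta[v], 2))

Each iteration touches only edge-local information, mirroring the abstract's point that no central machine is needed.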
Related research

03/30/2020 · CPFed: Communication-Efficient and Privacy-Preserving Federated Learning
Federated learning is a machine learning setting where a set of edge dev...

06/17/2021 · Towards Heterogeneous Clients with Elastic Federated Learning
Federated learning involves training machine learning models over device...

12/05/2022 · Adaptive Configuration for Heterogeneous Participants in Decentralized Federated Learning
Data generated at the network edge can be processed locally by leveragin...

08/20/2021 · FedSkel: Efficient Federated Learning on Heterogeneous Systems with Skeleton Gradients Update
Federated learning aims to protect users' privacy while performing data ...

05/26/2022 · Cali3F: Calibrated Fast Fair Federated Recommendation System
The increasingly stringent regulations on privacy protection have sparke...

08/16/2023 · DFedADMM: Dual Constraints Controlled Model Inconsistency for Decentralized Federated Learning
To address the communication burden issues associated with federated lea...

02/18/2022 · ProxSkip: Yes! Local Gradient Steps Provably Lead to Communication Acceleration! Finally!
We introduce ProxSkip – a surprisingly simple and provably efficient met...
