ISFL: Trustworthy Federated Learning for Non-i.i.d. Data with Local Importance Sampling

10/05/2022
by Zheqi Zhu, et al.

As a promising paradigm that integrates computation and communication, federated learning (FL) relies on periodic model sharing from distributed clients. Due to the non-i.i.d. data distributions on clients, FL models suffer from gradient diversity, degraded performance, and poor convergence. In this work, we tackle this key issue by adopting data-driven importance sampling (IS) for local training. We propose a trustworthy framework, named importance sampling federated learning (ISFL), which is especially compatible with neural network (NN) models. We evaluate the framework both theoretically and experimentally. First, we derive the parameter deviation bound between ISFL and centralized full-data training to identify the main factors behind the non-i.i.d. dilemma. We then formulate the selection of optimal IS weights as an optimization problem and obtain theoretical solutions. We further employ water-filling methods to calculate the IS weights and develop the complete ISFL algorithms. Experimental results on CIFAR-10 fit the proposed theories well and show that ISFL achieves higher performance, as well as better convergence, on non-i.i.d. data. To the best of our knowledge, ISFL is the first non-i.i.d. FL solution from the local sampling perspective that exhibits theoretical NN compatibility. Furthermore, as a local sampling approach, ISFL can be easily migrated into other emerging FL frameworks.
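To make the local-sampling idea concrete, the sketch below shows one generic way a client could turn per-sample scores into IS weights via a water-filling allocation and then draw a weighted local mini-batch. This is an illustrative assumption, not the paper's exact formulation: the scoring rule, the water-filling objective, and the `budget` parameter are all hypothetical placeholders.

```python
import numpy as np

def water_filling_weights(scores, budget=1.0):
    """Water-filling style allocation (illustrative only): entry i receives
    weight max(0, level - scores[i]), with the water level chosen so that
    the weights sum to `budget`. Entries whose score exceeds the level get
    zero weight."""
    scores = np.asarray(scores, dtype=float)
    sorted_s = np.sort(scores)
    n = len(scores)
    # Search for the largest active set: with the k smallest scores active,
    # the level is (budget + sum of those k scores) / k; it is valid when it
    # exceeds the k-th smallest score.
    for k in range(n, 0, -1):
        level = (budget + sorted_s[:k].sum()) / k
        if level > sorted_s[k - 1]:
            break
    w = np.maximum(0.0, level - scores)
    return w / w.sum()

# Hypothetical per-sample scores on one client's local data:
scores = [0.2, 0.5, 0.1, 0.9]
weights = water_filling_weights(scores)

# Importance-sample a local mini-batch of size 2 with these weights:
rng = np.random.default_rng(0)
batch_idx = rng.choice(len(scores), size=2, replace=True, p=weights)
```

In an FL round, each client would run this allocation locally before its SGD steps, so no raw data or per-sample scores need to leave the device; only the resulting model update is shared, which is what makes a sampling-side fix easy to graft onto existing FL frameworks.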


Related research

06/06/2022 · Generalized Federated Learning via Sharpness Aware Minimization
Federated Learning (FL) is a promising framework for performing privacy-...

05/12/2021 · Clustered Sampling: Low-Variance and Improved Representativity for Clients Selection in Federated Learning
This work addresses the problem of optimizing communications between ser...

10/26/2020 · Optimal Importance Sampling for Federated Learning
Federated learning involves a mixture of centralized and decentralized p...

12/14/2020 · Federated Learning under Importance Sampling
Federated learning encapsulates distributed learning strategies that are...

05/23/2022 · FL Games: A federated learning framework for distribution shifts
Federated learning aims to train predictive models for data that is dist...

01/28/2023 · CyclicFL: A Cyclic Model Pre-Training Approach to Efficient Federated Learning
Since random initial models in Federated Learning (FL) can easily result...

02/19/2023 · Magnitude Matters: Fixing SIGNSGD Through Magnitude-Aware Sparsification in the Presence of Data Heterogeneity
Communication overhead has become one of the major bottlenecks in the di...
