Fed-FSNet: Mitigating Non-I.I.D. Federated Learning via Fuzzy Synthesizing Network

08/21/2022
by   Jingcai Guo, et al.

Federated learning (FL) has recently emerged as a promising privacy-preserving distributed machine learning framework. It collaboratively learns a shared global model by training locally on edge devices and aggregating the local models into a global one in a cloud server, without centralized sharing of raw data. However, because local data are highly heterogeneous (Non-I.I.D.) across edge devices, FL tends to obtain a global model whose gradients are shifted on local datasets, which degrades performance or even prevents convergence during training. In this paper, we propose a novel FL training framework, dubbed Fed-FSNet, which uses a properly designed Fuzzy Synthesizing Network (FSNet) to mitigate Non-I.I.D. FL at the source. Concretely, we maintain an edge-agnostic hidden model in the cloud server that estimates a less accurate yet direction-aware inversion of the global model. The hidden model then fuzzily synthesizes mimic I.I.D. data samples (sample features) conditioned only on the global model; these samples are shared with edge devices to steer FL training towards faster and better convergence. Moreover, since the synthesizing process neither accesses the parameters/updates of local models nor analyzes individual local model outputs, our framework still preserves the privacy of FL. Experimental results on several FL benchmarks demonstrate that our method significantly mitigates the Non-I.I.D. issue and outperforms other representative methods.
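The abstract's core idea — synthesizing mimic I.I.D. samples conditioned only on the global model, without touching local parameters or outputs — can be illustrated with a minimal sketch. The paper's actual FSNet architecture and loss are not specified in this abstract, so the sketch below substitutes a simple, well-known stand-in: per-class gradient-ascent model inversion of a linear softmax global model, with added noise as the "fuzzy" step. All names (`fuzzy_synthesize`, the learning rate, step counts) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def fuzzy_synthesize(W, b, n_per_class, dim, steps=200, lr=0.5,
                     noise=0.1, seed=None):
    """Synthesize a class-balanced set of mimic samples conditioned ONLY on
    the global model (W, b): random inputs are pushed by gradient ascent so
    the global model assigns them to each target class (a crude,
    direction-aware inversion), then perturbed with noise ('fuzzified')."""
    rng = np.random.default_rng(seed)
    n_classes = W.shape[1]
    xs, ys = [], []
    for c in range(n_classes):
        x = rng.normal(0.0, 1.0, size=(n_per_class, dim))
        for _ in range(steps):
            p = softmax(x @ W + b)                 # global-model predictions
            onehot = np.zeros_like(p)
            onehot[:, c] = 1.0
            x += lr * (onehot - p) @ W.T           # ascend log p(class c | x)
        xs.append(x + noise * rng.normal(size=x.shape))  # fuzzy perturbation
        ys.append(np.full(n_per_class, c))
    return np.concatenate(xs), np.concatenate(ys)
```

Because the procedure reads only the aggregated global weights, no client gradients or per-device outputs are inspected, matching the privacy property the abstract claims; clients could mix these balanced synthetic samples into their skewed local datasets before each local training round.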


