Communication-Efficient Diffusion Strategy for Performance Improvement of Federated Learning with Non-IID Data

07/15/2022
by   Seyoung Ahn, et al.

Federated learning (FL) is a novel learning paradigm that addresses the privacy-leakage challenge of centralized learning. However, in FL, users with non-independent and identically distributed (non-IID) data can deteriorate the performance of the global model. Specifically, the global model suffers from weight divergence owing to non-IID data. To address this challenge, we propose FedDif, a novel diffusion strategy for machine learning (ML) models that maximizes FL performance with non-IID data. In FedDif, users spread local models to neighboring users over device-to-device (D2D) communications, enabling each local model to experience different data distributions before parameter aggregation. Furthermore, we theoretically demonstrate that FedDif can circumvent the weight-divergence challenge. On this theoretical basis, we propose a communication-efficient diffusion strategy that balances the trade-off between learning performance and communication cost based on auction theory. The performance evaluation results show that FedDif improves the test accuracy of the global model by 11% and improves communication efficiency, in terms of the number of transmitted sub-frames and models, by 2.77-fold compared with the latest methods.
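To make the diffusion mechanism described above concrete, the snippet below is a minimal sketch, not the authors' implementation: it assumes models are flat lists of floats, aggregation is uniform FedAvg-style averaging, and diffusion pairs are chosen by a random collision-free matching in place of FedDif's auction-based pairing. The names `local_update`, `diffuse_and_aggregate`, `aggregate`, and the `diffusion_rounds` parameter are illustrative assumptions.

```python
# Minimal sketch of model diffusion over D2D neighbors before aggregation.
# Assumptions (not from the paper): models are flat lists of floats,
# aggregation is uniform averaging, and diffusion pairs come from a random
# collision-free matching instead of FedDif's auction mechanism.
import copy
import random


def local_update(model, dataset):
    """Placeholder for one round of local training on a user's data."""
    # In practice: a few SGD epochs on `dataset`; omitted here.
    return model


def aggregate(models):
    """Uniform FedAvg-style averaging of flat parameter lists."""
    n = len(models)
    return [sum(params) / n for params in zip(*models)]


def diffuse_and_aggregate(global_model, users, neighbors, diffusion_rounds=3):
    """One FL round in which local models are diffused before aggregation.

    users:     dict user_id -> local dataset
    neighbors: dict user_id -> list of D2D-reachable user ids
    """
    # Every user starts from a copy of the current global model.
    carriers = {u: copy.deepcopy(global_model) for u in users}

    for _ in range(diffusion_rounds):
        # Train each carried model on the data of the user holding it.
        for u in carriers:
            carriers[u] = local_update(carriers[u], users[u])

        # Pass each model to a D2D neighbor so it sees a different local
        # distribution in the next iteration (random matching here; FedDif
        # uses an auction to weigh learning gain against communication cost).
        new_carriers = {}
        for u in random.sample(list(carriers), len(carriers)):
            free = [v for v in neighbors.get(u, []) if v not in new_carriers]
            target = random.choice(free) if free else u
            if target in new_carriers:  # keep the matching collision-free
                target = next(v for v in carriers if v not in new_carriers)
            new_carriers[target] = carriers[u]
        carriers = new_carriers

    # The server aggregates the diffused models into the next global model.
    return aggregate(list(carriers.values()))


if __name__ == "__main__":
    users = {0: [], 1: [], 2: []}            # toy datasets
    neighbors = {0: [1], 1: [0, 2], 2: [1]}  # D2D connectivity
    print(diffuse_and_aggregate([0.0, 1.0, 2.0], users, neighbors))
```

In this sketch, each model visits several users' local distributions before the server averages them, which is the mechanism the abstract credits for mitigating weight divergence; the paper's auction-based pairing additionally trades off the expected learning gain of each D2D transfer against its communication cost.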


