Why Does Batch Normalization Damage Federated Learning on Non-IID Data?

01/08/2023
by   Yanmeng Wang, et al.

As a promising distributed learning paradigm, federated learning (FL) involves training deep neural network (DNN) models at the network edge while protecting the privacy of the edge clients. To train a large-scale DNN model, batch normalization (BN) has been regarded as a simple and effective means to accelerate the training and improve the generalization capability. However, recent findings indicate that BN can significantly impair the performance of FL in the presence of non-i.i.d. data. While several FL algorithms have been proposed to address this issue, their performance still falls significantly short of that of the centralized scheme. Furthermore, none of them have provided a theoretical explanation of how BN damages the FL convergence. In this paper, we present the first convergence analysis to show that, under non-i.i.d. data, the mismatch between the local and global statistical parameters in BN causes a gradient deviation between the local and global models, which in turn slows down and biases the FL convergence. In view of this, we develop a new FL algorithm tailored to BN, called FedTAN, which achieves robust FL performance under a variety of data distributions via iterative layer-wise parameter aggregation. Comprehensive experimental results demonstrate the superiority of the proposed FedTAN over existing baselines for training BN-based DNN models.
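The statistics mismatch described in the abstract can be illustrated with a minimal NumPy sketch (not the paper's code): two clients with label-skewed data compute very different local BN batch statistics, so the same sample is normalized differently under local and global (pooled) statistics. All names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two clients with skewed (non-i.i.d.) feature distributions,
# e.g. each client mostly holds samples of one class.
client_a = rng.normal(loc=-2.0, scale=1.0, size=1000)
client_b = rng.normal(loc=+2.0, scale=1.0, size=1000)

def bn_normalize(x, mean, var, eps=1e-5):
    """Batch-normalize x with the given statistics (gamma=1, beta=0)."""
    return (x - mean) / np.sqrt(var + eps)

# Local BN: each client normalizes with its own batch statistics.
local_a = bn_normalize(client_a, client_a.mean(), client_a.var())

# Global BN: statistics of the pooled (centralized) data.
pooled = np.concatenate([client_a, client_b])
global_a = bn_normalize(client_a, pooled.mean(), pooled.var())

# The same samples are mapped to noticeably different activations
# locally vs globally; this is the local/global statistical-parameter
# mismatch that the paper's analysis links to gradient deviation.
mismatch = np.abs(local_a - global_a).mean()
print(f"mean |local - global| activation gap on client A: {mismatch:.3f}")
```

Because the pooled mean sits between the two clients' means and the pooled variance is inflated by the between-client spread, each client's locally normalized activations systematically deviate from the globally normalized ones, so local gradients are computed on shifted activations.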


