
Making Batch Normalization Great in Federated Deep Learning

by Jike Zhong et al.

Batch Normalization (BN) is commonly used in modern deep neural networks (DNNs) to improve stability and speed up convergence during centralized training. In federated learning (FL) with non-IID decentralized data, previous works observed that training with BN could hinder performance, due to the mismatch between the BN statistics accumulated during training and those encountered at test time. Group Normalization (GN) is thus more often used in FL as an alternative to BN. However, our empirical study across various FL settings shows no consistent winner between BN and GN. This leads us to revisit the use of normalization layers in FL. We find that with proper treatments, BN can be highly competitive across a wide range of FL settings, and this requires no additional training or communication costs. We hope that our study can serve as a valuable reference for future practical usage and theoretical analysis in FL.
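The train/test statistics mismatch the abstract describes can be sketched numerically. In the toy example below (all names and the two-client setup are illustrative assumptions, not the paper's method), two clients hold data with different feature means; averaging their BN running means, as a FedAvg-style server would, yields global statistics that match neither client, while Group Normalization avoids cross-client statistics entirely by normalizing each sample over channel groups:

```python
import numpy as np

# Hypothetical non-IID setup: two clients whose feature distributions differ.
rng = np.random.default_rng(0)
client_a = rng.normal(loc=-2.0, scale=1.0, size=(256, 8))  # client A features
client_b = rng.normal(loc=+2.0, scale=1.0, size=(256, 8))  # client B features

# Each client's BN layer tracks its own running mean during local training.
mean_a, mean_b = client_a.mean(axis=0), client_b.mean(axis=0)

# The server averages the running means (as FedAvg averages parameters);
# the result sits near 0 and matches NEITHER client's local distribution.
global_mean = (mean_a + mean_b) / 2

# At test time on client A, the aggregated statistics leave a large
# residual shift -- the mismatch between training and testing statistics.
shift_with_global = np.abs(mean_a - global_mean).mean()  # large (~2.0)
shift_with_local = np.abs(mean_a - mean_a).mean()        # zero by construction

# Group Normalization sidesteps the issue: it normalizes each sample over
# groups of channels, so no running batch statistics are shared or averaged.
def group_norm(x, num_groups=2, eps=1e-5):
    n, c = x.shape
    g = x.reshape(n, num_groups, c // num_groups)
    g = (g - g.mean(axis=2, keepdims=True)) / np.sqrt(g.var(axis=2, keepdims=True) + eps)
    return g.reshape(n, c)

normed = group_norm(client_a)  # per-sample normalization, no batch statistics
```

This only illustrates why GN is a common BN substitute in non-IID FL; the paper's point is that BN can nonetheless be made competitive with proper treatment of these statistics.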


Experimenting with Normalization Layers in Federated Learning on non-IID scenarios

Training Deep Learning (DL) models requires large, high-quality datasets,...

Why Batch Normalization Damage Federated Learning on Non-IID Data?

As a promising distributed learning paradigm, federated learning (FL) in...

Is Normalization Indispensable for Multi-domain Federated Learning?

Federated learning (FL) enhances data privacy with collaborative in-situ...

Normalization Is All You Need: Understanding Layer-Normalized Federated Learning under Extreme Label Shift

Layer normalization (LN) is a widely adopted deep learning technique esp...

The Non-IID Data Quagmire of Decentralized Machine Learning

Many large-scale machine learning (ML) applications need to train ML mod...

Federated Robustness Propagation: Sharing Adversarial Robustness in Federated Learning

Federated learning (FL) emerges as a popular distributed learning schema...
