No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data

06/09/2021
by Mi Luo, et al.

A central challenge in training classification models in real-world federated systems is learning with non-IID data. To cope with this, most existing works either enforce regularization during local optimization or improve the model aggregation scheme at the server. Other works share public datasets or synthesized samples to supplement the training of under-represented classes or to introduce a certain level of personalization. Though effective, these approaches lack a deep understanding of how data heterogeneity affects each layer of a deep classification model. In this paper, we bridge this gap by performing an experimental analysis of the representations learned by different layers. Our observations are surprising: (1) the classifier exhibits a greater bias than any other layer, and (2) classification performance can be significantly improved by post-calibrating the classifier after federated training. Motivated by these findings, we propose a novel and simple algorithm called Classifier Calibration with Virtual Representations (CCVR), which adjusts the classifier using virtual representations sampled from an approximated Gaussian mixture model. Experimental results demonstrate that CCVR achieves state-of-the-art performance on popular federated learning benchmarks, including CIFAR-10, CIFAR-100, and CINIC-10. We hope that our simple yet effective method can shed light on future research into federated learning with non-IID data.
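
As a rough illustration of the calibration step described above, the sketch below (PyTorch; function name, helper arguments, and hyperparameters are hypothetical) samples class-conditional virtual features from per-class Gaussian statistics and fine-tunes only the classifier head on them. It is a minimal sketch assuming the server already holds per-class feature means and covariances aggregated from clients; the paper's exact aggregation protocol is not reproduced here.

```python
import torch
import torch.nn.functional as F

def ccvr_calibrate(classifier, class_means, class_covs,
                   n_per_class=100, lr=0.01, steps=100):
    """Sketch of classifier calibration with virtual representations.

    classifier  : the global model's final linear layer (nn.Linear).
    class_means : list of per-class feature mean vectors, shape (d,).
    class_covs  : list of per-class feature covariance matrices, shape
                  (d, d), assumed to be aggregated at the server from
                  client-side statistics rather than raw data.
    """
    feats, labels = [], []
    for c, (mu, cov) in enumerate(zip(class_means, class_covs)):
        # A small ridge keeps the covariance positive definite.
        dist = torch.distributions.MultivariateNormal(
            mu, covariance_matrix=cov + 1e-4 * torch.eye(mu.numel()))
        feats.append(dist.sample((n_per_class,)))
        labels.append(torch.full((n_per_class,), c, dtype=torch.long))
    feats, labels = torch.cat(feats), torch.cat(labels)

    # Fine-tune only the classifier head on the virtual feature set.
    opt = torch.optim.SGD(classifier.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        F.cross_entropy(classifier(feats), labels).backward()
        opt.step()
    return classifier
```

Because only low-dimensional feature statistics (not raw samples) would need to be shared, such a post-hoc calibration is cheap and keeps the feature extractor untouched.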

Related research

08/17/2021 - Aggregation Delayed Federated Learning
03/30/2021 - Model-Contrastive Federated Learning
04/11/2023 - Federated Learning with Classifier Shift for Class Imbalance
03/17/2023 - No Fear of Classifier Biases: Neural Collapse Inspired Federated Learning with Synthetic and Fixed Classifier
12/01/2021 - Compare Where It Matters: Using Layer-Wise Regularization To Improve Federated Learning on Heterogeneous Data
06/28/2021 - Weight Divergence Driven Divide-and-Conquer Approach for Optimal Federated Learning from non-IID Data
09/13/2023 - Learning From Drift: Federated Learning on Non-IID Data via Drift Regularization
