Compare Where It Matters: Using Layer-Wise Regularization To Improve Federated Learning on Heterogeneous Data

12/01/2021
by Ha Min Son, et al.

Federated Learning is a widely adopted method for training neural networks over distributed data. One main limitation is the performance degradation that occurs when data is heterogeneously distributed. While many works have attempted to address this problem, these methods underperform because they rest on a limited understanding of neural networks. In this work, we verify that only certain important layers in a neural network require regularization for effective training. We further verify that Centered Kernel Alignment (CKA) most accurately measures similarity between layers of neural networks trained on different data. By applying CKA-based regularization to important layers during training, we significantly improve performance in heterogeneous settings. We present FedCKA: a simple framework that outperforms previous state-of-the-art methods on various deep learning tasks while also improving efficiency and scalability.
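The abstract does not reproduce the CKA formula, but the linear variant of Centered Kernel Alignment (as defined by Kornblith et al.) is short enough to sketch. The snippet below is a minimal numpy illustration of comparing two layers' activations on the same batch of inputs, not the authors' FedCKA implementation:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between two activation matrices.

    X: (n_samples, d1) activations of one layer/model on a batch
    Y: (n_samples, d2) activations of another layer/model on the same batch
    Returns a similarity score in [0, 1]; 1 means identical up to
    rotation and isotropic scaling of the feature space.
    """
    # Center each feature so CKA is invariant to mean shifts.
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)

    # HSIC-style cross term and Frobenius-norm normalizers.
    cross = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    norm_x = np.linalg.norm(X.T @ X, ord="fro")
    norm_y = np.linalg.norm(Y.T @ Y, ord="fro")
    return cross / (norm_x * norm_y)
```

In a layer-wise regularization scheme like the one described, a term such as `1 - linear_cka(local_acts, global_acts)` could be added to the local loss for the important layers only; the exact layer selection and weighting used by FedCKA are specified in the full paper.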

Related research

- FedHeN: Federated Learning in Heterogeneous Networks (07/07/2022). We propose a novel training recipe for federated learning with heterogen...
- Learning From Drift: Federated Learning on Non-IID Data via Drift Regularization (09/13/2023). Federated learning algorithms perform reasonably well on independent and...
- FedClassAvg: Local Representation Learning for Personalized Federated Learning on Heterogeneous Neural Networks (10/25/2022). Personalized federated learning is aimed at allowing numerous clients to...
- No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data (06/09/2021). A central challenge in training classification models in the real-world ...
- Siloed Federated Learning for Multi-Centric Histopathology Datasets (08/17/2020). While federated learning is a promising approach for training deep learn...
- TCT: Convexifying Federated Learning using Bootstrapped Neural Tangent Kernels (07/13/2022). State-of-the-art federated learning methods can perform far worse than t...
- Exploring Heterogeneous Characteristics of Layers in ASR Models for More Efficient Training (10/08/2021). Transformer-based architectures have been the subject of research aimed ...
