
FedCD: Improving Performance in non-IID Federated Learning

by Kavya Kopparapu, et al.
Harvard University

Federated learning has been widely applied to enable decentralized devices, each holding its own local data, to learn a shared model. However, learning from real-world data can be challenging, as it is rarely independent and identically distributed (IID) across edge devices, a key assumption behind current high-performing, low-bandwidth algorithms. We present a novel approach, FedCD, which clones and deletes models to dynamically group devices with similar data. Experiments on the CIFAR-10 dataset show that FedCD achieves higher accuracy and faster convergence than a FedAvg baseline on non-IID data, while incurring minimal computation, communication, and storage overhead.
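The abstract only sketches the clone-and-delete mechanism, so the following is a hypothetical toy simulation of the idea, not the authors' implementation. Models are reduced to scalar mean vectors, devices hold samples drawn near one of two cluster centers (standing in for non-IID local data), and the `score`, `fedcd_round`, and `clone_and_delete` names are all illustrative assumptions:

```python
# Hypothetical sketch of a FedCD-style clone-and-delete scheme (not the paper's code).
# Each "model" is a single float; each device's data is noisy samples around a center.
import random
random.seed(0)

def make_device(center, n=20, noise=0.1):
    # Simulated non-IID local dataset: samples concentrated near one center.
    return [center + random.uniform(-noise, noise) for _ in range(n)]

def score(model, data):
    # Higher is better: negative mean squared error of the model on local data.
    return -sum((x - model) ** 2 for x in data) / len(data)

def fedcd_round(models, devices):
    # Each device subscribes to its best-scoring model, then updates it locally
    # (a step toward the local data mean, standing in for local SGD).
    assignments = []
    for data in devices:
        best = max(range(len(models)), key=lambda i: score(models[i], data))
        assignments.append(best)
        local_mean = sum(data) / len(data)
        models[best] += 0.5 * (local_mean - models[best])
    return assignments

def clone_and_delete(models, assignments):
    # Delete models no device selected; clone the most popular model with a
    # small perturbation, so the model pool tracks the device groupings.
    counts = [assignments.count(i) for i in range(len(models))]
    survivors = [m for m, c in zip(models, counts) if c > 0]
    popular = models[max(range(len(models)), key=lambda i: counts[i])]
    survivors.append(popular + random.uniform(-0.05, 0.05))
    return survivors

# Two latent data groups (centers 0.0 and 5.0), three devices each.
devices = [make_device(0.0) for _ in range(3)] + [make_device(5.0) for _ in range(3)]
models = [2.5]  # start from a single global model, as in FedAvg
for _ in range(5):
    assignments = fedcd_round(models, devices)
    models = clone_and_delete(models, assignments)
```

After a few rounds, the pool should contain specialized models near each group's center, and devices with similar data should select the same model, which is the grouping effect the abstract attributes to cloning and deletion.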

