
FedCD: Improving Performance in non-IID Federated Learning

06/17/2020
by Kavya Kopparapu, et al.
Harvard University

Federated learning has been widely applied to enable decentralized devices, each holding its own local data, to learn a shared model. However, learning from real-world data is challenging because such data is rarely independently and identically distributed (IID) across edge devices, a key assumption behind current high-performing, low-bandwidth algorithms. We present a novel approach, FedCD, which clones and deletes models to dynamically group devices with similar data. Experiments on the CIFAR-10 dataset show that FedCD achieves higher accuracy and faster convergence than a FedAvg baseline on non-IID data while incurring minimal computation, communication, and storage overhead.
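The abstract describes the mechanism only at a high level, so the following is a minimal, runnable sketch of the clone-and-delete idea rather than the paper's algorithm: the scalar "models", the loss-based grouping rule, and the CLONE_LOSS / STALE_ROUNDS thresholds are all assumptions introduced here for illustration; the actual criteria are given in the full text.

```python
"""Toy sketch of a clone-and-delete federated loop in the spirit of the
FedCD abstract. The scalar models, thresholds, and trigger rules below
are hypothetical illustrations, not the paper's actual criteria."""
import random
import statistics

LR = 0.5            # local learning rate (assumed)
CLONE_LOSS = 1.0    # group loss that triggers a clone (assumed)
STALE_ROUNDS = 3    # rounds with no devices before deletion (assumed)

def local_loss(w, data):
    # Mean squared error of a scalar model on one device's data.
    return statistics.mean((x - w) ** 2 for x in data)

def local_train(w, data):
    # One local gradient step on the MSE objective.
    grad = statistics.mean(2 * (w - x) for x in data)
    return w - LR * grad

def fedavg(updates):
    # Standard FedAvg: average updates weighted by local dataset size.
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

def fedcd_round(models, stale, clients):
    # Each device selects the global model that best fits its local data,
    # so groups of devices with similar data emerge dynamically.
    groups = [[] for _ in models]
    for data in clients:
        best = min(range(len(models)), key=lambda j: local_loss(models[j], data))
        groups[best].append(data)
    new_models, new_stale = [], []
    for w, used, group in zip(models, stale, groups):
        if not group:
            # Delete a model once no device has picked it for a while,
            # bounding the storage and communication overhead.
            if used + 1 < STALE_ROUNDS:
                new_models.append(w)
                new_stale.append(used + 1)
            continue
        agg = fedavg([(local_train(w, d), len(d)) for d in group])
        new_models.append(agg)
        new_stale.append(0)
        # Clone when the aggregated model still fits its group poorly,
        # letting devices with dissimilar data split onto the copy.
        if statistics.mean(local_loss(agg, d) for d in group) > CLONE_LOSS:
            new_models.append(agg + random.gauss(0, 0.1))
            new_stale.append(0)
    return new_models, new_stale

if __name__ == "__main__":
    random.seed(0)
    # Non-IID devices: half cluster around 0, half around 5.
    clients = [[random.gauss(c, 0.3) for _ in range(20)]
               for c in [0.0] * 5 + [5.0] * 5]
    models, stale = [2.5], [0]
    for _ in range(10):
        models, stale = fedcd_round(models, stale, clients)
    print("models:", [round(m, 2) for m in models])
```

In this toy, grouping is emergent: each device trains whichever global model fits its data best, cloning gives dissimilar devices a model to migrate to, and deleting unused models keeps the pool (and hence the storage and communication cost) small. After a few rounds the pool settles on two models, one per data cluster.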

