Client Adaptation improves Federated Learning with Simulated Non-IID Clients

07/09/2020
by   Laura Rieger, et al.

We present a federated learning approach for learning a client-adaptable, robust model when data is non-identically and non-independently distributed (non-IID) across clients. By simulating heterogeneous clients, we show that adding learned client-specific conditioning improves model performance, and that the approach works on balanced and imbalanced data sets from both the audio and image domains. The client adaptation is implemented by a conditional gated activation unit and is particularly beneficial when there are large differences between the data distributions of the clients, a common scenario in federated learning.
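The abstract describes the client adaptation as a conditional gated activation unit. The paper's exact architecture is not reproduced here, but a minimal sketch of such a unit in the style commonly used for conditioning (a tanh branch gated by a sigmoid branch, each shifted by a projection of a learned per-client embedding) might look as follows; all names (W_f, V_f, W_g, V_g, the dimensions, and the use of a client embedding vector c) are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def conditional_gated_activation(x, c, W_f, V_f, W_g, V_g):
    """Gated activation conditioned on a client embedding c:
    h = tanh(W_f x + V_f c) * sigmoid(W_g x + V_g c).
    The client-specific term V c shifts both the filter and the gate,
    letting a shared model adapt its activations per client."""
    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))
    return np.tanh(W_f @ x + V_f @ c) * sigmoid(W_g @ x + V_g @ c)

# Toy example: 4-dim layer activations, 3-dim learned client embedding.
rng = np.random.default_rng(0)
x = rng.normal(size=4)              # layer input (shared across clients)
c = rng.normal(size=3)              # client-specific conditioning vector
W_f, W_g = rng.normal(size=(2, 4, 4))
V_f, V_g = rng.normal(size=(2, 4, 3))
h = conditional_gated_activation(x, c, W_f, V_f, W_g, V_g)
print(h.shape)  # (4,)
```

In a federated setting, the shared weights (W, V) would be aggregated across clients as usual, while each client's embedding c remains local, which is what makes the conditioning client-specific.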

Related research

- PerFED-GAN: Personalized Federated Learning via Generative Adversarial Networks (02/18/2022)
  Federated learning is gaining popularity as a distributed machine learni...

- FedCCEA: A Practical Approach of Client Contribution Evaluation for Federated Learning (06/04/2021)
  Client contribution evaluation, also known as data valuation, is a cruci...

- Bayesian Federated Learning via Predictive Distribution Distillation (06/15/2022)
  For most existing federated learning algorithms, each round consists of ...

- When to Trust Aggregated Gradients: Addressing Negative Client Sampling in Federated Learning (01/25/2023)
  Federated Learning has become a widely-used framework which allows learn...

- What Do We Mean by Generalization in Federated Learning? (10/27/2021)
  Federated learning data is drawn from a distribution of distributions: c...

- Federated Graph-based Sampling with Arbitrary Client Availability (11/25/2022)
  While federated learning has shown strong results in optimizing a machin...

- Is Non-IID Data a Threat in Federated Online Learning to Rank? (04/20/2022)
  In this perspective paper we study the effect of non independent and ide...
