Overcoming Forgetting in Federated Learning on Non-IID Data

10/17/2019
by Neta Shoham, et al.

We tackle the problem of Federated Learning in the non-i.i.d. case, in which local models drift apart, inhibiting learning. Building on an analogy with Lifelong Learning, we adapt a solution for catastrophic forgetting to Federated Learning. We add a penalty term to the loss function, compelling all local models to converge to a shared optimum. We show that this can be done in a communication-efficient way (adding no further privacy risks) that scales with the number of nodes in the distributed setting. Our experiments show that this method outperforms competing ones for image recognition on the MNIST dataset.
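The core mechanism can be sketched in a few lines. Below is a minimal, hypothetical PyTorch sketch of a local loss with a quadratic penalty that pulls each node's parameters toward the last aggregated (shared) model; the plain L2 penalty, the coefficient `lam`, and all function names here are illustrative assumptions, not the paper's exact (EWC-inspired, importance-weighted) formulation.

```python
import torch
import torch.nn.functional as F

def penalized_local_loss(model, batch, shared_params, lam=1.0):
    """Local task loss plus a drift penalty toward the shared model.

    Hypothetical sketch: the plain quadratic penalty and `lam` stand in
    for the paper's EWC-style, importance-weighted term.
    """
    x, y = batch
    task_loss = F.cross_entropy(model(x), y)
    # Penalty term: discourage the local parameters from drifting away
    # from the shared optimum given by the last aggregated parameters.
    drift = sum(((p - s.detach()) ** 2).sum()
                for p, s in zip(model.parameters(), shared_params))
    return task_loss + lam * drift
```

In a federated round, each node would minimize this loss on its own data and send back only its updated parameters, consistent with the abstract's claim that the penalty adds no further privacy risk beyond standard federated averaging.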


Related research

- Exact Penalty Method for Federated Learning (08/23/2022): "Federated learning has burgeoned recently in machine learning, giving ri..."
- Learning From Drift: Federated Learning on Non-IID Data via Drift Regularization (09/13/2023): "Federated learning algorithms perform reasonably well on independent and..."
- Federated Unlearning via Active Forgetting (07/07/2023): "The increasing concerns regarding the privacy of machine learning models..."
- Federated Learning and catastrophic forgetting in pervasive computing: demonstration in HAR domain (07/17/2022): "Federated Learning has been introduced as a new machine learning paradig..."
- FLIX: A Simple and Communication-Efficient Alternative to Local Methods in Federated Learning (11/22/2021): "Federated Learning (FL) is an increasingly popular machine learning para..."
- Robust Federated Learning with Noisy Communication (11/01/2019): "Federated learning is a communication-efficient training process that al..."
- Think Locally, Act Globally: Federated Learning with Local and Global Representations (01/06/2020): "Federated learning is an emerging research paradigm to train models on p..."
