Jointly Exploring Client Drift and Catastrophic Forgetting in Dynamic Learning

09/01/2023
by   Niklas Babendererde, et al.

Federated and Continual Learning have emerged as potential paradigms for the robust and privacy-aware use of Deep Learning in dynamic environments. However, Client Drift and Catastrophic Forgetting remain fundamental obstacles to guaranteeing consistent performance. Existing work addresses these problems only separately, neglecting the fact that both forms of performance deterioration share a common root cause. We propose a unified analysis framework that builds a controlled test environment for Client Drift – by perturbing a defined ratio of clients – and for Catastrophic Forgetting – by shifting all clients with a particular strength. The framework further leverages this combined analysis by generating a 3D landscape of the joint performance impact of both shifts. We demonstrate that the performance drop caused by Client Drift at a certain share of shifted clients is correlated with the drop caused by Catastrophic Forgetting at a corresponding shift strength. Correlation tests on Computer Vision (CelebA) and Medical Imaging (PESO) benchmarks support this new perspective, with an average Pearson correlation coefficient of over 0.94. The framework's novel ability to perform combined spatio-temporal shift analysis allows us to investigate how both forms of distribution shift behave in mixed scenarios, opening a new pathway toward better generalization. We show that combining moderate Client Drift with moderate Catastrophic Forgetting can even improve the performance of the resulting model (causing a "Generalization Bump") compared to when only one of the shifts occurs individually. Applying a simple and commonly used Continual Learning method in the federated setting, we observe this phenomenon to recur, demonstrating our framework's ability to analyze existing and novel methods for Federated and Continual Learning.
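The correlation test at the core of this analysis can be sketched as follows. The two degradation curves below are illustrative stand-ins for the real evaluation sweeps (accuracy drop versus the share of perturbed clients, and versus the shift strength applied to all clients); they are assumptions for the sketch, not values from the paper:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical accuracy drops measured at matched shift levels.
# In the paper's setup these would come from evaluating the federated
# model while sweeping (a) the ratio of perturbed clients (spatial
# shift / Client Drift) and (b) the shift strength applied to all
# clients (temporal shift / Catastrophic Forgetting).
shift_levels  = [0.00, 0.25, 0.50, 0.75, 1.00]
drop_spatial  = [0.00, 0.08, 0.18, 0.27, 0.35]  # Client Drift sweep
drop_temporal = [0.00, 0.07, 0.17, 0.29, 0.36]  # Forgetting sweep

print(f"Pearson r = {pearson(drop_spatial, drop_temporal):.3f}")
```

A high coefficient between the two curves, as reported in the paper, indicates that both forms of distribution shift degrade performance in a closely related way.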

Related research

- 03/24/2022 · Addressing Client Drift in Federated Continual Learning with Adaptive Optimization
- 07/17/2022 · Federated Continual Learning through distillation in pervasive computing
- 09/01/2021 · Federated Reconnaissance: Efficient, Distributed, Class-Incremental Learning
- 09/28/2021 · Formalizing the Generalization-Forgetting Trade-off in Continual Learning
- 06/06/2023 · Masked Autoencoders are Efficient Continual Federated Learners
- 07/02/2023 · Don't Memorize; Mimic The Past: Federated Class Incremental Learning Without Episodic Memory
