Federated Orthogonal Training: Mitigating Global Catastrophic Forgetting in Continual Federated Learning

09/03/2023
by Yavuz Faruk Bakman, et al.

Federated Learning (FL) has gained significant traction due to its ability to enable privacy-preserving training over decentralized data. The current FL literature focuses mostly on single-task learning. Over time, however, new tasks may appear at the clients, and the global model should learn these tasks without forgetting the previous ones. This real-world scenario is known as Continual Federated Learning (CFL). The main challenge of CFL is Global Catastrophic Forgetting: when the global model is trained on new tasks, its performance on old tasks decreases. A few recent works on CFL propose methods that aim to address this problem, but they either rely on unrealistic assumptions about the availability of past data samples or violate the privacy principles of FL. We propose a novel method, Federated Orthogonal Training (FOT), that overcomes these drawbacks and addresses global catastrophic forgetting in CFL. Our algorithm extracts the global input subspace of each layer for old tasks and modifies the aggregated updates of new tasks so that they are orthogonal to the global principal subspace of old tasks at each layer. This reduces interference between tasks, which is the main cause of forgetting. We empirically show that FOT outperforms state-of-the-art continual learning methods in the CFL setting, achieving an average accuracy gain of up to 15% while incurring minimal computation and communication cost.
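At its core, the described operation is a subspace projection applied at the server. The NumPy sketch below is a minimal illustration of that idea, not the paper's implementation: it assumes a single linear layer, old-task activations already gathered into one matrix, and hypothetical helper names (`principal_subspace`, `project_out`); the federated, privacy-preserving extraction of the subspace that FOT actually performs is omitted here.

```python
import numpy as np

def principal_subspace(activations: np.ndarray, energy: float = 0.95) -> np.ndarray:
    """Estimate an orthonormal basis for the principal input subspace of a
    layer from old-task activations (n_samples x in_features), keeping
    enough directions to capture `energy` of the spectral mass."""
    U, S, _ = np.linalg.svd(activations.T, full_matrices=False)
    cumulative = np.cumsum(S ** 2) / np.sum(S ** 2)
    k = int(np.searchsorted(cumulative, energy)) + 1
    return U[:, :k]                      # shape: in_features x k

def project_out(delta_w: np.ndarray, basis: np.ndarray) -> np.ndarray:
    """Remove the component of an aggregated weight update that acts on
    the old-task input subspace: dW <- dW (I - U U^T)."""
    return delta_w - (delta_w @ basis) @ basis.T

# Toy usage: 200 old-task activation vectors for a layer with 64 inputs.
rng = np.random.default_rng(0)
acts = rng.standard_normal((200, 64))
U = principal_subspace(acts, energy=0.95)

delta_w = rng.standard_normal((32, 64))  # aggregated update, out x in
delta_w_safe = project_out(delta_w, U)

# The projected update has (numerically) no component along the retained
# old-task directions, so old-task responses are approximately preserved.
assert np.allclose(delta_w_safe @ U, 0.0, atol=1e-8)
```

The assertion checks the defining property: since the projected update is orthogonal to the retained principal directions of old-task inputs, applying it leaves the layer's outputs on those inputs (approximately) unchanged, which is what limits interference between tasks.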

Related research

Federated Class-Incremental Learning (03/22/2022)
Federated learning (FL) has attracted growing attention via data-private...

Don't Memorize; Mimic The Past: Federated Class Incremental Learning Without Episodic Memory (07/02/2023)
Deep learning models are prone to forgetting information learned in the ...

No One Left Behind: Real-World Federated Class-Incremental Learning (02/02/2023)
Federated learning (FL) is a hot collaborative training framework via ag...

FedET: A Communication-Efficient Federated Class-Incremental Learning Framework Based on Enhanced Transformer (06/27/2023)
Federated Learning (FL) has been widely concerned for it enables decentr...

A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning (07/16/2023)
Forgetting refers to the loss or deterioration of previously acquired in...

Federated Unlearning via Active Forgetting (07/07/2023)
The increasing concerns regarding the privacy of machine learning models...

BERT WEAVER: Using WEight AVERaging to Enable Lifelong Learning for Transformer-based Models (02/21/2022)
Recent developments in transfer learning have boosted the advancements i...
