Masked Autoencoders are Efficient Continual Federated Learners

06/06/2023
by   Subarnaduti Paul, et al.

Machine learning is typically framed from a perspective of i.i.d. and, more importantly, isolated data. In part, federated learning lifts this assumption, as it sets out to solve the real-world challenge of collaboratively learning a shared model from data distributed across clients. However, motivated primarily by privacy and computational constraints, the fact that data may change, distributions may drift, or tasks may even advance individually on clients is seldom taken into account. The field of continual learning addresses this separate challenge, and first steps have recently been taken to leverage synergies in distributed supervised settings, in which several clients learn to solve changing classification tasks over time without forgetting previously seen ones. Motivated by these prior works, we posit that such federated continual learning should be grounded in unsupervised learning of representations that are shared across clients; in the loose spirit of how humans can indirectly leverage others' experience without exposure to a specific task. For this purpose, we demonstrate that masked autoencoders for distribution estimation are particularly amenable to this setup. Specifically, their masking strategy can be seamlessly integrated with task attention mechanisms to enable selective knowledge transfer between clients. We empirically corroborate the latter statement through several continual federated scenarios on both image and binary datasets.
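To make the "masked autoencoder for distribution estimation" ingredient concrete, the following is a minimal NumPy sketch of MADE-style connectivity masking (Germain et al., 2015), the mechanism the abstract builds on. It is not the paper's implementation; the layer sizes, degree assignments, and the single-hidden-layer setup are illustrative assumptions. The key idea is that binary masks on the weight matrices enforce an autoregressive factorization: output dimension d may depend only on inputs with a lower index.

```python
import numpy as np

rng = np.random.default_rng(0)

D, H = 4, 8  # input dimension and hidden width (illustrative sizes)

# MADE-style connectivity: every unit gets a "degree" capping which
# inputs may influence it, which enforces autoregressive structure.
m_in = np.arange(1, D + 1)           # input degrees 1..D
m_hid = rng.integers(1, D, size=H)   # hidden degrees drawn from 1..D-1

# Mask entries are 1 where a connection is allowed:
# hidden unit k sees input d iff m_hid[k] >= m_in[d];
# output d sees hidden unit k iff m_in[d] > m_hid[k].
M1 = (m_hid[:, None] >= m_in[None, :]).astype(float)  # shape (H, D)
M2 = (m_in[:, None] > m_hid[None, :]).astype(float)   # shape (D, H)

W1 = rng.normal(0.0, 0.1, size=(H, D))
W2 = rng.normal(0.0, 0.1, size=(D, H))
b1 = np.zeros(H)
b2 = np.zeros(D)

def made_forward(x):
    """One masked forward pass: output d depends only on x[:d]."""
    h = np.tanh((W1 * M1) @ x + b1)
    logits = (W2 * M2) @ h + b2
    return 1.0 / (1.0 + np.exp(-logits))  # Bernoulli parameters per dim

x = rng.integers(0, 2, size=D).astype(float)
p = made_forward(x)
# Output 0 has no allowed incoming connections (no hidden degree < 1),
# so p[0] reduces to sigmoid(b2[0]) and is constant in x.
```

In the federated continual setting the abstract describes, such per-weight masks are the natural hook for selectivity: because knowledge flow is already gated by binary masks, task- or client-specific masking (e.g. via attention over tasks) can reuse the same mechanism to decide which parameters are shared.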


