Learning an evolved mixture model for task-free continual learning

07/11/2022
by Fei Ye et al.

Recently, continual learning (CL) has gained significant interest because it enables deep learning models to acquire new knowledge without forgetting previously learnt information. However, most existing works require knowledge of task identities and boundaries, which is rarely realistic in practice. In this paper, we address a more challenging and realistic setting in CL, namely Task-Free Continual Learning (TFCL), in which a model is trained on non-stationary data streams with no explicit task information. To address TFCL, we introduce an evolved mixture model whose network architecture is dynamically expanded to adapt to shifts in the data distribution. We implement this expansion mechanism by evaluating the probability distance between the knowledge stored in each mixture component and the current memory buffer using the Hilbert-Schmidt Independence Criterion (HSIC). We further introduce two simple dropout mechanisms that selectively remove stored examples, avoiding memory overload while preserving memory diversity. Empirical results demonstrate that the proposed approach achieves excellent performance.

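To make the HSIC-based expansion criterion concrete, below is a minimal, hypothetical sketch (not the authors' implementation): it computes a biased empirical HSIC estimate between equally sized sample batches, one drawn from the knowledge a mixture component has captured and one from the current memory buffer, and triggers expansion when every existing component shows only weak dependence with the buffer. The RBF kernel, the pairing of samples by index, the function names, and the threshold are all assumptions made for illustration.

```python
import numpy as np


def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix for a batch X of shape (n, d)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))


def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC estimate between two batches of equal size n."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2


def should_expand(component_batches, memory_batch, threshold=1e-3):
    """Hypothetical expansion check: add a new mixture component when every
    existing component's stored samples show low HSIC dependence with the
    current memory buffer (i.e. the buffer looks statistically novel)."""
    scores = [hsic(batch, memory_batch) for batch in component_batches]
    return max(scores) < threshold


# Toy usage: two existing components versus a memory-buffer batch of equal size.
rng = np.random.default_rng(0)
components = [rng.normal(0.0, 1.0, size=(64, 16)),
              rng.normal(3.0, 1.0, size=(64, 16))]
memory = rng.normal(10.0, 1.0, size=(64, 16))
print("expand?", should_expand(components, memory))
```

How the paper pairs samples and sets the expansion threshold is not specified in the abstract; the sketch simply pairs the i-th stored sample with the i-th buffer sample and uses a fixed kernel bandwidth.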

Related research

Task-Free Continual Learning via Online Discrepancy Distance Learning (10/12/2022)
Learning from non-stationary data streams, also called Task-Free Continu...

Improving Task-free Continual Learning by Distributionally Robust Memory Evolution (07/15/2022)
Task-free continual learning (CL) aims to learn a non-stationary data st...

A Neural Dirichlet Process Mixture Model for Task-Free Continual Learning (01/03/2020)
Despite the growing interest in continual learning, most of its contempo...

Continual Learning with Optimal Transport based Mixture Model (11/30/2022)
Online Class Incremental learning (CIL) is a challenging setting in Cont...

Lifelong Infinite Mixture Model Based on Knowledge-Driven Dirichlet Process (08/25/2021)
Recent research efforts in lifelong learning propose to grow a mixture o...

Class-Incremental Mixture of Gaussians for Deep Continual Learning (07/09/2023)
Continual learning models for stationary data focus on learning and reta...

Continual Learning for Image-Based Camera Localization (08/20/2021)
For several emerging technologies such as augmented reality, autonomous ...
