Mixture of basis for interpretable continual learning with distribution shifts

01/05/2022
by Mengda Xu, et al.

Continual learning in environments with shifting data distributions is a challenging problem with several real-world applications. In this paper we consider settings in which the data distribution (task) shifts abruptly and the timing of these shifts is not known. Furthermore, we consider a semi-supervised, task-agnostic setting in which the learning algorithm has access to both task-segmented and unsegmented data for offline training. We propose a novel approach, Mixture of Basis models (MoB), for addressing this problem setting. The core idea is to learn a small set of basis models and to construct a dynamic, task-dependent mixture of these models to make predictions for the current task. We also propose a new methodology to detect observations that are out-of-distribution with respect to the existing basis models and to instantiate new models as needed. We evaluate our approach in multiple domains and show that it achieves lower prediction error than existing methods in most cases, while using fewer models than other multiple-model approaches. Moreover, we analyze the latent task representations learned by MoB and show that similar tasks cluster in the latent space, and that the latent representation shifts at task boundaries when the tasks are dissimilar.
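
The abstract describes the mechanism only at a high level: task-dependent mixture weights over a small set of basis models, plus a rule for instantiating a new basis model when observations are out-of-distribution for all existing ones. The PyTorch sketch below illustrates that idea; the linear basis models, the context-based gating network, and the error-threshold OOD test are all illustrative assumptions, not the paper's actual architecture or criterion.

import torch
import torch.nn as nn

class MixtureOfBasis(nn.Module):
    """Illustrative sketch of the idea in the abstract: a small set of
    basis models combined by task-dependent mixture weights. The layer
    sizes, the gating network, and the OOD rule are assumptions."""

    def __init__(self, in_dim: int, out_dim: int, n_basis: int = 3, latent_dim: int = 8):
        super().__init__()
        self.in_dim, self.out_dim = in_dim, out_dim
        # Each basis model is a simple linear regressor in this sketch.
        self.basis = nn.ModuleList(nn.Linear(in_dim, out_dim) for _ in range(n_basis))
        # Encode recent (x, y) pairs into a latent task representation.
        self.encoder = nn.Sequential(nn.Linear(in_dim + out_dim, latent_dim), nn.ReLU())
        self.gate = nn.Linear(latent_dim, n_basis)

    def forward(self, x: torch.Tensor, context: torch.Tensor):
        # context: (k, in_dim + out_dim) recent observations from the stream.
        z = self.encoder(context).mean(dim=0)             # latent task vector
        w = torch.softmax(self.gate(z), dim=-1)           # task-dependent mixture weights
        preds = torch.stack([m(x) for m in self.basis])   # (n_basis, batch, out_dim)
        y_hat = (w.view(-1, 1, 1) * preds).sum(dim=0)     # mixture prediction
        return y_hat, w, z

    def maybe_add_basis(self, mixture_error: float, threshold: float = 1.0) -> bool:
        # Hypothetical OOD rule: if even the best mixture fits the new
        # data poorly, instantiate a fresh basis model and widen the gate.
        if mixture_error <= threshold:
            return False
        self.basis.append(nn.Linear(self.in_dim, self.out_dim))
        self.gate = nn.Linear(self.gate.in_features, len(self.basis))
        return True


# Minimal usage: predict on a toy stream, then grow the model on high error.
mob = MixtureOfBasis(in_dim=4, out_dim=1)
x = torch.randn(16, 4)                      # current batch of inputs
context = torch.randn(32, 5)                # recent (x, y) pairs, concatenated
y_hat, weights, task_z = mob(x, context)
mob.maybe_add_basis(mixture_error=2.3)      # error above threshold -> new basis

Note that re-creating the gate when a basis model is added discards its learned weights; this is a simplification for the sketch, and a practical implementation would need to preserve or warm-start the expanded gate.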



Related research

08/20/2021 · Online Continual Learning with Natural Distribution Shifts: An Empirical Study with Visual Data
02/22/2018 · Unicorn: Continual Learning with a Universal, Off-policy Agent
01/03/2020 · A Neural Dirichlet Process Mixture Model for Task-Free Continual Learning
12/03/2021 · Contrastive Continual Learning with Feature Propagation
12/15/2020 · Variational Beam Search for Online Learning with Distribution Shifts
02/04/2022 · Discovering Distribution Shifts using Latent Space Representations
10/17/2021 · Growing Representation Learning