Multiple Modes for Continual Learning

09/29/2022
by Siddhartha Datta et al.

Adapting model parameters to incoming streams of data is a crucial factor in the scalability of deep learning. Interestingly, prior continual learning strategies in online settings inadvertently anchor their updated parameters to a local parameter subspace in order to remember old tasks, or else drift away from that subspace and forget. From this observation, we formulate a trade-off between constructing multiple parameter modes and allocating tasks per mode. Mode-Optimized Task Allocation (MOTA), our contributed adaptation strategy, trains multiple modes in parallel and then optimizes task allocation per mode. We empirically demonstrate improvements over baseline continual learning strategies and across varying distribution shifts, namely sub-population, domain, and task shift.
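To make the recipe concrete, below is a minimal PyTorch sketch of the multi-mode idea, under stated assumptions: it maintains several independent parameter modes, allocates each incoming task to the mode that currently fits it best, and fine-tunes only that mode. The greedy allocation rule, the toy architecture, and all names (make_model, allocate, train_on_task) are illustrative assumptions; the paper's actual MOTA allocation objective is not reproduced here.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_model():
    # Small classifier standing in for whatever architecture is used in practice.
    return nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))

NUM_MODES = 3
modes = [make_model() for _ in range(NUM_MODES)]  # parameter modes trained in parallel
loss_fn = nn.CrossEntropyLoss()

def allocate(task_x, task_y):
    # Hypothetical allocation rule: send the task to the mode that already
    # fits it best, so each mode stays anchored near its own parameter subspace.
    with torch.no_grad():
        losses = [loss_fn(m(task_x), task_y).item() for m in modes]
    return min(range(NUM_MODES), key=losses.__getitem__)

def train_on_task(mode_idx, task_x, task_y, steps=100, lr=1e-2):
    # Only the allocated mode is updated; the other modes keep their
    # parameters, which is what preserves old tasks in this sketch.
    model = modes[mode_idx]
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(task_x), task_y)
        loss.backward()
        opt.step()

# Synthetic stream of tasks, each a batch of (inputs, labels).
task_stream = [(torch.randn(64, 10), torch.randint(0, 2, (64,))) for _ in range(5)]

for task_x, task_y in task_stream:
    k = allocate(task_x, task_y)
    train_on_task(k, task_x, task_y)
    print(f"task allocated to mode {k}")
```

Updating a single mode per task is simply the easiest way to keep the remaining modes anchored; per the abstract, MOTA optimizes the task allocation per mode rather than assigning tasks greedily as this sketch does.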

Related research

03/02/2022 · Continual Learning of Multi-modal Dynamics with External Memory
We study the problem of fitting a model to a dynamical environment when ...

05/19/2022 · Interpolating Compressed Parameter Subspaces
Inspired by recent work on neural subspaces and mode connectivity, we re...

07/13/2022 · CoSCL: Cooperation of Small Continual Learners is Stronger than a Big One
Continual learning requires incremental compatibility with a sequence of...

08/05/2022 · Task-agnostic Continual Hippocampus Segmentation for Smooth Population Shifts
Most continual learning methods are validated in settings where task bou...

12/13/2020 · Monitoring multimode processes: a modified PCA algorithm with continual learning ability
For multimode processes, one has to establish local monitoring models co...

02/23/2022 · Continual learning-based probabilistic slow feature analysis for multimode dynamic process monitoring
In this paper, a novel multimode dynamic process monitoring approach is ...

10/10/2022 · Tracking changes using Kullback-Leibler divergence for the continual learning
Recently, continual learning has received a lot of attention. One of the...
