SIESTA: Efficient Online Continual Learning with Sleep

03/19/2023
by Md Yousuf Harun, et al.

In supervised continual learning, a deep neural network (DNN) is updated with an ever-growing data stream. Unlike the offline setting where data is shuffled, we cannot make any distributional assumptions about the data stream. Ideally, only one pass through the dataset is needed for computational efficiency. However, existing methods are inadequate and make many assumptions that cannot be made for real-world applications, while simultaneously failing to improve computational efficiency. In this paper, we do not propose a novel method. Instead, we present SIESTA, an incremental improvement to the continual learning algorithm REMIND. Unlike REMIND, SIESTA uses a wake/sleep framework for training, which is well aligned to the needs of on-device learning. SIESTA is far more computationally efficient than existing methods, enabling continual learning on ImageNet-1K in under 3 hours on a single GPU; moreover, in the augmentation-free setting it matches the performance of the offline learner, a milestone critical to driving adoption of continual learning in real-world applications.
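To make the wake/sleep idea concrete, below is a minimal, hypothetical sketch of such a training loop in PyTorch. It is not the authors' implementation: SIESTA itself freezes most of the backbone and rehearses quantized latent features (as in REMIND), whereas this toy version assumes raw feature vectors; the class name, reservoir buffer policy, and hyperparameters are all illustrative assumptions.

```python
import random
import torch
import torch.nn as nn
import torch.nn.functional as F


class WakeSleepLearner:
    """Toy wake/sleep continual learner (illustrative sketch only)."""

    def __init__(self, feat_dim: int = 64, num_classes: int = 10,
                 buffer_size: int = 1000) -> None:
        # In a SIESTA-like system a frozen backbone would produce (quantized)
        # features; here we assume inputs are already feature vectors.
        self.head = nn.Linear(feat_dim, num_classes)      # plastic output layer
        self.opt = torch.optim.SGD(self.head.parameters(), lr=1e-2)
        self.buffer: list[tuple[torch.Tensor, int]] = []  # rehearsal store
        self.buffer_size = buffer_size
        self.seen = 0

    def wake_step(self, x: torch.Tensor, y: int) -> None:
        """Online (wake) phase: one cheap update per example, then store it."""
        loss = F.cross_entropy(self.head(x.unsqueeze(0)), torch.tensor([y]))
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Reservoir sampling keeps the buffer bounded and unbiased.
        self.seen += 1
        if len(self.buffer) < self.buffer_size:
            self.buffer.append((x.detach(), y))
        else:
            j = random.randrange(self.seen)
            if j < self.buffer_size:
                self.buffer[j] = (x.detach(), y)

    def sleep(self, epochs: int = 1, batch_size: int = 32) -> None:
        """Offline (sleep) phase: rehearse stored examples to consolidate."""
        for _ in range(epochs):
            random.shuffle(self.buffer)
            for i in range(0, len(self.buffer), batch_size):
                batch = self.buffer[i:i + batch_size]
                xs = torch.stack([x for x, _ in batch])
                ys = torch.tensor([y for _, y in batch])
                loss = F.cross_entropy(self.head(xs), ys)
                self.opt.zero_grad()
                loss.backward()
                self.opt.step()


# Usage: learn online from a stream, sleeping periodically.
learner = WakeSleepLearner()
stream = [(torch.randn(64), random.randrange(10)) for _ in range(500)]
for t, (x, y) in enumerate(stream, start=1):
    learner.wake_step(x, y)
    if t % 250 == 0:   # periodic sleep, e.g. "overnight" on device
        learner.sleep(epochs=2)
```

The split matters for on-device learning: the wake phase costs only a single small gradient step per incoming example, while the expensive rehearsal is deferred to sleep periods when the device is idle.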

Related research

06/02/2023 · Overcoming the Stability Gap in Continual Learning
In many real-world applications, deep neural networks are retrained from...

11/03/2021 · One Pass ImageNet
We present the One Pass ImageNet (OPIN) problem, which aims to study the...

02/02/2023 · Avalanche: A PyTorch Library for Deep Continual Learning
Continual learning is the problem of learning from a nonstationary stream...

01/05/2022 · Mixture of basis for interpretable continual learning with distribution shifts
Continual learning in environments with shifting data distributions is a...

03/24/2022 · Tackling Online One-Class Incremental Learning by Removing Negative Contrasts
Recent work studies the supervised online continual learning setting where...

06/06/2020 · Coresets via Bilevel Optimization for Continual Learning and Streaming
Coresets are small data summaries that are sufficient for model training...

02/14/2022 · Design of Explainability Module with Experts in the Loop for Visualization and Dynamic Adjustment of Continual Learning
Continual learning can enable neural networks to evolve by learning new...
