Carousel Memory: Rethinking the Design of Episodic Memory for Continual Learning

10/14/2021
by Soobee Lee, et al.

Continual Learning (CL) is an emerging machine learning paradigm that aims to learn from a continuous stream of tasks without forgetting knowledge learned from previous tasks. To avoid the performance degradation caused by forgetting, prior studies exploit episodic memory (EM), which stores a subset of past observed samples while learning from new non-i.i.d. data. Despite the promising results, since CL is often assumed to execute on mobile or IoT devices, the EM size is bounded by the small hardware memory capacity, making it infeasible to meet the accuracy requirements of real-world applications. Specifically, all prior CL methods discard samples that overflow the EM and can never retrieve them for subsequent training steps, incurring a loss of information that exacerbates catastrophic forgetting. We explore a novel hierarchical EM management strategy to address this forgetting issue. In particular, in mobile and IoT devices, real-time data can be stored not just in high-speed RAM but also in internal storage devices, which offer significantly larger capacity than RAM. Based on this insight, we propose to exploit the abundant storage to preserve past experiences and alleviate forgetting, by allowing CL to efficiently migrate samples between memory and storage without being hindered by the slow access speed of the storage. We call this approach Carousel Memory (CarM). As CarM is complementary to existing CL methods, we conduct extensive evaluations of our method with seven popular CL methods and show that CarM significantly improves their accuracy across different settings by large margins in final average accuracy (up to 28.4%) while retaining the same training efficiency.
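To make the hierarchical-memory idea concrete, below is a minimal Python sketch of a two-tier episodic memory: a small, fast in-RAM buffer backed by a larger on-disk archive, where evicted samples are persisted to storage instead of being discarded and can later be swapped back in for replay. This is not the authors' implementation; the class name TwoTierEpisodicMemory, the reservoir-style eviction policy, and the swap_in refresh step are illustrative assumptions.

```python
import os
import pickle
import random
import tempfile


class TwoTierEpisodicMemory:
    """Sketch of a hierarchical episodic memory with a RAM tier and a storage tier.

    Samples evicted from the bounded RAM buffer are archived on disk rather than
    dropped, so they remain retrievable for later replay. Policies here are
    illustrative, not the exact CarM algorithm.
    """

    def __init__(self, ram_capacity, storage_dir=None):
        self.ram_capacity = ram_capacity
        self.ram = []  # (sample, label) pairs currently held in fast memory
        self.storage_dir = storage_dir or tempfile.mkdtemp(prefix="em_sketch_")
        self._disk_count = 0  # number of samples archived on disk so far
        self.seen = 0         # total samples observed in the stream

    def _archive(self, item):
        """Persist an evicted sample to storage instead of discarding it."""
        path = os.path.join(self.storage_dir, f"sample_{self._disk_count}.pkl")
        with open(path, "wb") as f:
            pickle.dump(item, f)
        self._disk_count += 1

    def add(self, sample, label):
        """Reservoir-style insertion into RAM; displaced samples go to disk."""
        self.seen += 1
        if len(self.ram) < self.ram_capacity:
            self.ram.append((sample, label))
            return
        j = random.randrange(self.seen)
        if j < self.ram_capacity:
            self._archive(self.ram[j])      # keep the evicted sample on disk
            self.ram[j] = (sample, label)
        else:
            self._archive((sample, label))  # new sample goes straight to disk

    def swap_in(self, n):
        """Refresh RAM with up to n randomly chosen archived samples."""
        if self._disk_count == 0 or not self.ram:
            return
        for _ in range(min(n, self._disk_count)):
            idx = random.randrange(self._disk_count)
            path = os.path.join(self.storage_dir, f"sample_{idx}.pkl")
            with open(path, "rb") as f:
                archived = pickle.load(f)
            victim = random.randrange(len(self.ram))
            self._archive(self.ram[victim])  # archive whatever gets displaced
            self.ram[victim] = archived

    def replay_batch(self, batch_size):
        """Draw a replay mini-batch from the fast in-RAM tier only."""
        k = min(batch_size, len(self.ram))
        return random.sample(self.ram, k)
```

In a real system, the swap_in step would run asynchronously (e.g., on a background thread between training iterations) so that the slow storage access never stalls the training loop; the sketch keeps it synchronous for brevity.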


Related research

11/16/2020  Gradient Episodic Memory with a Soft Constraint for Continual Learning
Catastrophic forgetting in continual learning is a common destructive ph...

08/03/2023  Improving Replay Sample Selection and Storage for Less Forgetting in Continual Learning
Continual learning seeks to enable deep learners to train on a series of...

08/14/2021  Weakly Supervised Continual Learning
Continual Learning (CL) investigates how to train Deep Networks on a str...

07/14/2022  In-memory Realization of In-situ Few-shot Continual Learning with a Dynamically Evolving Explicit Memory
Continually learning new classes from a few training examples without fo...

01/13/2021  EEC: Learning to Encode and Regenerate Images for Continual Learning
The two main impediments to continual learning are catastrophic forgetti...

03/25/2021  Efficient Feature Transformations for Discriminative and Generative Continual Learning
As neural networks are increasingly being applied to real-world applicat...

06/22/2020  A sparse code for neuro-dynamic programming and optimal control
Sparse codes have been suggested to offer certain computational advantag...
