Online Continual Learning Without the Storage Constraint

05/16/2023
by   Ameya Prabhu, et al.

Online continual learning (OCL) research has primarily focused on mitigating catastrophic forgetting under a fixed, limited storage allocation throughout the agent's lifetime. However, the growing affordability of data storage highlights a broad range of applications that do not adhere to these assumptions. In such cases, the primary concern is managing computational expenditure rather than storage. In this paper, we target these settings, investigating the online continual learning problem with relaxed storage constraints and an emphasis on a fixed, limited economical budget. We provide a simple algorithm that compactly stores and utilizes the entirety of the incoming data stream under tiny computational budgets, using a kNN classifier with universal pre-trained feature extractors. Our algorithm provides a consistency property attractive to continual learning: it never forgets past seen data. We set a new state of the art on two large-scale OCL datasets: Continual LOCalization (CLOC), which has 39M images over 712 classes, and Continual Google Landmarks V2 (CGLM), which has 580K images over 10,788 classes – beating methods under far higher computational budgets than ours both at reducing catastrophic forgetting of past data and at quickly adapting to rapidly changing data streams. We provide code to reproduce our results at <https://github.com/drimpossible/ACM>.
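The core idea described above — store every incoming sample's feature from a frozen pre-trained backbone and classify with a kNN over the growing memory — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class name `ContinualKNN` and its methods are hypothetical, the feature extractor is stood in for by raw vectors, and no compact storage or fast approximate search (which the paper would need at scale) is shown.

```python
import numpy as np

class ContinualKNN:
    """Hypothetical sketch: features from a frozen pre-trained extractor
    are appended to an ever-growing memory; prediction is a k-nearest-
    neighbour vote over everything stored so far. Because no sample is
    ever discarded, previously seen data is never forgotten."""

    def __init__(self, k=1):
        self.k = k
        self.feats = []   # stored feature vectors (never discarded)
        self.labels = []  # corresponding labels

    def update(self, feature, label):
        # Online step: store the new sample's feature and label.
        self.feats.append(np.asarray(feature, dtype=np.float32))
        self.labels.append(label)

    def predict(self, feature):
        if not self.feats:
            raise ValueError("no stored samples yet")
        X = np.stack(self.feats)
        q = np.asarray(feature, dtype=np.float32)
        # Euclidean distance to every stored feature (brute force here;
        # a real system would use an approximate-NN index).
        dists = np.linalg.norm(X - q, axis=1)
        nearest = np.argsort(dists)[: self.k]
        votes = [self.labels[i] for i in nearest]
        return max(set(votes), key=votes.count)

# Usage: stream samples in, predicting before (or after) each update.
model = ContinualKNN(k=1)
model.update([0.0, 0.0], "a")
model.update([1.0, 1.0], "b")
print(model.predict([0.1, 0.0]))  # → a
```

With k=1, any exact repeat of a stored sample is classified with its stored label, which is one way to see the consistency property the abstract mentions.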


