Summarizing Stream Data for Memory-Restricted Online Continual Learning

05/26/2023
by Jianyang Gu, et al.

Replay-based methods have proven effective for online continual learning by rehearsing past samples from an auxiliary memory. However, while much effort has gone into improving training schemes built on that memory, the information carried by each stored sample remains under-investigated. When storage space is restricted, the informativeness of the memory becomes critical for effective replay. Although some works design specific strategies to select representative samples, storing only original images still leaves the space poorly utilized. To this end, we propose to Summarize the knowledge from the Stream Data (SSD) into more informative samples by distilling the training characteristics of real images. By maintaining consistency of the training gradients and of the relationship to past tasks, the summarized samples are more representative of the stream data than the original images. Extensive experiments on multiple online continual learning benchmarks confirm that the proposed SSD method significantly enhances the replay effects. We demonstrate that, with limited extra computational overhead, SSD provides more than a 3% accuracy boost for sequential CIFAR-100 under an extremely restricted memory buffer. The code is available at https://github.com/vimar-gu/SSD.
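As a rough illustration of the gradient-consistency idea described in the abstract, the following PyTorch-style sketch optimizes a small set of synthetic memory samples so that the parameter gradients they induce match those of an incoming stream batch. The helper names (grad_match_loss, summarize_batch) and all hyperparameters are hypothetical illustrations, not the authors' actual implementation, which lives in the linked repository.

import torch
import torch.nn.functional as F

def grad_match_loss(model, real_x, real_y, syn_x, syn_y):
    # Cosine distance between the parameter gradients induced by a real
    # stream batch and by the synthetic (summarized) samples.
    params = [p for p in model.parameters() if p.requires_grad]
    g_real = torch.autograd.grad(
        F.cross_entropy(model(real_x), real_y), params)
    g_syn = torch.autograd.grad(
        F.cross_entropy(model(syn_x), syn_y), params, create_graph=True)
    loss = 0.0
    for gr, gs in zip(g_real, g_syn):
        loss = loss + 1 - F.cosine_similarity(
            gr.flatten(), gs.flatten(), dim=0)
    return loss

def summarize_batch(model, real_x, real_y, syn_x, syn_y,
                    steps=10, lr=0.1):
    # Optimize the synthetic samples (not the network) so that training
    # on them yields gradients consistent with the incoming stream data.
    syn_x = syn_x.clone().requires_grad_(True)
    opt = torch.optim.SGD([syn_x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        grad_match_loss(model, real_x, real_y, syn_x, syn_y).backward()
        opt.step()
        model.zero_grad(set_to_none=True)  # the network itself is untouched
    return syn_x.detach()

In the full method, such summarized samples would replace the raw images in the replay buffer, so the same storage budget carries more task-relevant information.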


