
Online Continual Learning on Class Incremental Blurry Task Configuration with Anytime Inference

by Hyunseo Koh, et al.
Gwangju Institute of Science and Technology

Despite rapid advances in continual learning, a large body of research remains devoted to improving performance in existing setups. While a handful of works do propose new continual learning setups, these still lack practicality in certain aspects. For better practicality, we first propose a novel continual learning setup that is online, task-free, and class-incremental, has blurry task boundaries, and is subject to inference queries at any moment. We also propose a new metric that better measures the performance of continual learning methods under such anytime inference queries. To address this challenging setup and evaluation protocol, we propose an effective method that employs a new memory management scheme and novel learning techniques. Our empirical validation demonstrates that the proposed method outperforms prior art by large margins.
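The metric described above rewards models that stay accurate throughout training, not only at task boundaries. A natural way to summarize this is the area under the accuracy curve measured at inference queries during training. The sketch below is illustrative only; the function name, arguments, and normalization are assumptions, not the paper's exact protocol.

```python
# Illustrative sketch: summarizing anytime-inference performance as the
# (normalized) area under the accuracy-vs-training-progress curve.
# All names here are hypothetical, chosen for this example.

def anytime_auc(accuracies, steps=None):
    """Normalized area under the accuracy curve, in [0, 1].

    accuracies: accuracy measured at each inference query during training.
    steps: optional x-coordinates (e.g., number of samples seen so far);
           defaults to evenly spaced query indices.
    """
    n = len(accuracies)
    if n == 0:
        return 0.0
    if n == 1:
        return float(accuracies[0])
    if steps is None:
        steps = list(range(n))
    # Trapezoidal integration, normalized by the total span so that a
    # constant accuracy of a yields exactly a.
    area = 0.0
    for i in range(1, n):
        area += 0.5 * (accuracies[i] + accuracies[i - 1]) * (steps[i] - steps[i - 1])
    return area / (steps[-1] - steps[0])
```

Unlike final-accuracy evaluation, this summary penalizes a model that dips badly mid-stream even if it recovers by the end, which matches the setup's requirement that queries may arrive at any moment.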


