Incremental Learning with Unlabeled Data in the Wild

03/29/2019
by Kibok Lee, et al.

Deep neural networks are known to suffer from catastrophic forgetting in class-incremental learning, where performance on previous tasks degrades drastically when learning a new task. To alleviate this effect, we propose to leverage a continuous and large stream of unlabeled data in the wild. In particular, to leverage such transient external data effectively, we design a novel class-incremental learning scheme with (a) a new distillation loss, termed global distillation, (b) a learning strategy to avoid overfitting to the most recent task, and (c) a sampling strategy for the desired external data. Our experimental results on various datasets, including CIFAR and ImageNet, demonstrate the superiority of the proposed methods over prior methods, particularly when a stream of unlabeled data is accessible: we achieve up to 9.3% improvement over the state-of-the-art method.
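To make the idea concrete, below is a minimal sketch of what a global distillation term combined with a classification loss might look like in PyTorch. The function names, temperature, and loss weighting are illustrative assumptions, not the paper's exact formulation; the point is that knowledge is distilled over all previously seen classes at once (globally), and that unlabeled external samples can contribute to the distillation term even without labels.

import torch
import torch.nn.functional as F

def global_distillation_loss(student_logits, teacher_logits, num_prev_classes, T=2.0):
    """Temperature-scaled KL distillation over the logits of all previously
    seen classes at once, using a frozen teacher trained on earlier tasks."""
    s = student_logits[:, :num_prev_classes] / T
    t = teacher_logits[:, :num_prev_classes] / T
    return F.kl_div(F.log_softmax(s, dim=1), F.softmax(t, dim=1),
                    reduction="batchmean") * (T * T)

def training_loss(student_logits, labels, teacher_logits, num_prev_classes,
                  distill_weight=1.0):
    """Cross-entropy on labeled current-task data plus global distillation.
    Unlabeled external samples pass labels=None and contribute only the
    distillation term."""
    loss = distill_weight * global_distillation_loss(
        student_logits, teacher_logits, num_prev_classes)
    if labels is not None:
        loss = loss + F.cross_entropy(student_logits, labels)
    return loss

In this sketch, the sampling strategy from the abstract would decide which unlabeled wild samples are fed through this loss (for example, by the teacher's confidence), and the anti-overfitting strategy would control how the classification and distillation terms are balanced across old and new classes.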

Related research

06/15/2022  Queried Unlabeled Data Improves and Robustifies Class-Incremental Learning
06/30/2023  Class-Incremental Learning using Diffusion Model for Distillation and Replay
04/19/2022  Learning to Imagine: Diversify Memory for Incremental Learning using Unlabeled Data
05/02/2017  A Strategy for an Uncompromising Incremental Learner
04/10/2022  FOSTER: Feature Boosting and Compression for Class-Incremental Learning
11/23/2022  Semi-Supervised Lifelong Language Learning
03/02/2021  Distilling Causal Effect of Data in Class-Incremental Learning
