Incremental Learning In Online Scenario

03/30/2020
by Jiangpeng He, et al.

Modern deep learning approaches have achieved great success in many vision applications by training a model using all available task-specific data. However, two major obstacles make it challenging to deploy such models in real-life applications: (1) learning new classes causes the trained model to quickly forget the knowledge of old classes, which is referred to as catastrophic forgetting; (2) as new observations of old classes arrive sequentially over time, the data distribution may change in unforeseen ways, causing performance on future data to degrade dramatically, which is referred to as concept drift. Current state-of-the-art incremental learning methods require a long training time whenever new classes are added, and none of them takes new observations of old classes into consideration. In this paper, we propose an incremental learning framework that can work in the challenging online learning scenario and handle both new-class data and new observations of old classes. We address problem (1) in the online setting by introducing a modified cross-distillation loss together with a two-step learning technique. Our method outperforms current state-of-the-art offline incremental learning methods on the CIFAR-100 and ImageNet-1000 (ILSVRC 2012) datasets under the same experimental protocol, but in the online scenario. We also provide a simple yet effective method to mitigate problem (2) by updating the exemplar set using the features of each new observation of old classes, and we demonstrate a real-life application of online food image classification based on our complete framework using the Food-101 dataset.
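
The modified cross-distillation loss is detailed in the full paper; as a rough illustration only, the sketch below shows a generic cross-distillation-style loss in PyTorch, where a distillation term on the old-class logits (against the frozen previous model) is combined with a cross-entropy term over all classes seen so far. The temperature T, balancing weight alpha, and old/new logit split are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a cross-distillation-style loss (PyTorch assumed).
# Not the paper's exact loss: T, alpha, and the old/new split are assumptions.
import torch
import torch.nn.functional as F

def cross_distillation_loss(new_logits, old_logits, targets, n_old, T=2.0, alpha=0.5):
    """Combine distillation on old-class outputs with cross-entropy on all classes.

    new_logits: outputs of the model being updated, shape (B, n_old + n_new)
    old_logits: outputs of the frozen previous model, shape (B, n_old)
    targets:    ground-truth labels over all classes seen so far
    n_old:      number of classes the previous model was trained on
    """
    # Distillation term: keep old-class responses close to the previous model's.
    soft_targets = F.softmax(old_logits / T, dim=1)
    log_probs_old = F.log_softmax(new_logits[:, :n_old] / T, dim=1)
    distill = F.kl_div(log_probs_old, soft_targets, reduction="batchmean") * (T * T)

    # Classification term: standard cross-entropy over old + new classes.
    ce = F.cross_entropy(new_logits, targets)

    return alpha * distill + (1.0 - alpha) * ce
```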

