TKIL: Tangent Kernel Approach for Class Balanced Incremental Learning

06/17/2022
by   Jinlin Xiang, et al.

When learning new tasks sequentially, deep neural networks tend to forget previously learned tasks, a phenomenon called catastrophic forgetting. Class incremental learning methods aim to address this problem by keeping a memory of a few exemplars from previously learned tasks and distilling knowledge from them. However, existing methods struggle to balance performance across classes, since they typically overfit the model to the latest task. In our work, we propose to address these challenges with Tangent Kernel for Incremental Learning (TKIL), a novel methodology that achieves class-balanced performance. The approach preserves representations across classes and balances the accuracy for each class, and as such achieves better overall accuracy and lower variance. TKIL is based on the Neural Tangent Kernel (NTK), which describes the convergence behavior of neural networks as a kernel function in the limit of infinite width. In TKIL, the gradients between feature layers are treated as the distance between the representations of these layers, defining a Gradients Tangent Kernel (GTK) loss that is minimized while the task-specific weights are averaged. This allows TKIL to automatically identify the task and to quickly adapt to it during inference. Experiments on the CIFAR-100 and ImageNet datasets with various incremental learning settings show that these strategies allow TKIL to outperform existing state-of-the-art methods.
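The core idea of a gradient-based distance between feature layers can be illustrated with a minimal sketch. The toy model below is an assumption for illustration only: it uses a single linear "feature layer" so the gradients have a closed form, whereas the paper's GTK loss operates on deep network feature layers; the names `feature_grads`, `gtk_loss`, `W_old`, and `W_new` are hypothetical and not from the paper.

```python
import numpy as np

def feature_grads(W, x):
    """Gradient of L = 0.5 * ||W @ x||^2 with respect to W, i.e. (W @ x) x^T."""
    return np.outer(W @ x, x)

def gtk_loss(W_old, W_new, x):
    """Hypothetical GTK-style loss: squared Frobenius distance between the
    feature-layer gradients of the old-task and current models on one input,
    used here as a proxy for the distance between their representations."""
    diff = feature_grads(W_old, x) - feature_grads(W_new, x)
    return float(np.sum(diff ** 2))

rng = np.random.default_rng(0)
W_old = rng.standard_normal((4, 3))   # frozen old-task feature layer
W_new = rng.standard_normal((4, 3))   # current-task feature layer
x = rng.standard_normal(3)            # a single input example

# Identical weights give exactly zero gradient distance ...
assert gtk_loss(W_old, W_old, x) == 0.0

# ... and averaging the weights shrinks the distance: in this linear toy model
# the gradient is linear in W, so the averaged model's gradient lies midway
# between the old and new gradients.
W_avg = 0.5 * (W_old + W_new)
assert gtk_loss(W_old, W_avg, x) < gtk_loss(W_old, W_new, x)
```

This only demonstrates the interplay between the gradient distance and weight averaging in the simplest possible setting; in the actual method the loss is minimized during training rather than checked after the fact.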


