
Do Not Forget to Attend to Uncertainty while Mitigating Catastrophic Forgetting

02/03/2021
by   Vinod K. Kurmi, et al.

One of the major limitations of deep learning models is that they suffer catastrophic forgetting in incremental learning scenarios. Several approaches have been proposed to tackle this problem. Most of them are based on knowledge distillation and do not adequately exploit the information available from older task models, such as the uncertainty of their predictions. Predictive uncertainty provides distributional information that can be used to mitigate catastrophic forgetting in a deep learning framework. In the proposed work, we adopt a Bayesian formulation to obtain the data (aleatoric) and model (epistemic) uncertainties, and we incorporate a self-attention framework to address the incremental learning problem. We define distillation losses in terms of aleatoric uncertainty and self-attention, and we present ablation analyses of these losses. Our method achieves improved accuracy on standard benchmarks.
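The abstract's central idea, weighting the distillation signal by the older model's aleatoric uncertainty, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the per-sample log-variance head, the temperature `T`, and the precision weighting are assumptions based on common heteroscedastic-loss formulations.

```python
import numpy as np

def softmax(z, axis=-1):
    # numerically stable softmax
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def uncertainty_weighted_distillation(student_logits, teacher_logits,
                                      teacher_log_var, T=2.0):
    """Distillation loss where each sample's KL term is attenuated by the
    teacher's predicted aleatoric uncertainty (a log-variance per sample).

    Confident teacher predictions (low log-variance) contribute more to the
    distillation loss; uncertain ones contribute less. The additive
    `teacher_log_var` term mirrors the usual heteroscedastic regularizer
    that keeps the predicted variance bounded.
    """
    p_t = softmax(teacher_logits / T)
    log_p_s = np.log(softmax(student_logits / T) + 1e-12)
    # per-sample KL divergence between teacher and student distributions
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - log_p_s), axis=-1)
    precision = np.exp(-teacher_log_var)
    return np.mean(precision * kl + teacher_log_var)
```

With `teacher_log_var` fixed at zero this reduces to plain temperature-scaled distillation; raising the log-variance on a sample down-weights its KL term, which is the attenuation behavior the abstract describes.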
