Attaining Class-level Forgetting in Pretrained Model using Few Samples

10/19/2022
by   Pravendra Singh, et al.

Deep learning models are typically trained jointly on many classes to address real-world problems. However, some of those classes may later become restricted due to privacy or ethical concerns, and knowledge of the restricted classes must then be removed from models that were trained on them. The available data may be limited for the same reasons, so fully re-training the model is not possible. We propose a novel approach that addresses this problem without affecting the model's prediction power for the remaining classes. Our approach identifies the model parameters that are highly relevant to the restricted classes and removes the restricted-class knowledge from them using the limited available training data. It is significantly faster than, and performs similarly to, a model re-trained on the complete data of the remaining classes.
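The two-step recipe in the abstract (identify the parameters most relevant to the restricted class, then erase their knowledge using only a few samples) can be sketched as follows. This is a minimal illustrative toy, not the authors' method: the linear classifier, the gradient-magnitude relevance score, and the zeroing-out step are all assumptions made for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "pretrained" model: a linear classifier W (3 classes x 4 features).
W = rng.normal(size=(3, 4))

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def grad_restricted(W, X, restricted):
    """Gradient of cross-entropy loss on restricted-class samples w.r.t. W."""
    P = softmax(X @ W.T)             # predicted probabilities, shape (n, classes)
    Y = np.zeros_like(P)
    Y[:, restricted] = 1.0           # all samples belong to the restricted class
    return (P - Y).T @ X / len(X)    # shape (classes, features)

# Only a few samples of the restricted class (class 0) are available.
X_restricted = rng.normal(size=(5, 4))
G = np.abs(grad_restricted(W, X_restricted, restricted=0))

# Mark the parameters most relevant to the restricted class
# (here: the top 25% by gradient magnitude -- an arbitrary threshold).
mask = G > np.quantile(G, 0.75)

# "Forget": zero out only those highly relevant parameters,
# leaving the rest of the model untouched.
W_forgotten = np.where(mask, 0.0, W)
```

In practice one would dampen or re-estimate the selected parameters using the limited data rather than simply zeroing them, but the key design choice the abstract describes survives in the sketch: the update touches only the small subset of parameters tied to the restricted class, which is why the remaining classes are largely unaffected and no full re-training is needed.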

Related research:

- DILF-EN framework for Class-Incremental Learning (12/23/2021): Deep learning models suffer from catastrophic forgetting of the classes ...
- Knowledge Consolidation based Class Incremental Online Learning with Limited Data (06/12/2021): We propose a novel approach for class incremental online learning in a l...
- Few-Shot Lifelong Learning (03/01/2021): Many real-world classification problems often have classes with very few...
- One-Shot Machine Unlearning with Mnemonic Code (06/09/2023): Deep learning has achieved significant improvements in accuracy and has ...
- Data Impressions: Mining Deep Models to Extract Samples for Data-free Applications (01/15/2021): Pretrained deep models hold their learnt knowledge in the form of the mo...
- Geometry-Aware Adaptation for Pretrained Models (07/23/2023): Machine learning models, including prominent zero-shot models, are oft...
- On the Necessity of Auditable Algorithmic Definitions for Machine Unlearning (10/22/2021): Machine unlearning, i.e. having a model forget about some of its trainin...
