Eternal Sunshine of the Spotless Net: Selective Forgetting in Deep Neural Networks

11/12/2019
by Aditya Golatkar, et al.

We explore the problem of selectively forgetting a particular set of data used for training a deep neural network. While the effects of the data to be forgotten can be hidden from the output of the network, insights may still be gleaned by probing deep into its weights. We propose a method for “scrubbing” the weights clean of information about a particular set of training data. The method does not require retraining from scratch, nor access to the data originally used for training. Instead, the weights are modified so that any probing function of the weights, computed with no knowledge of the random seed used for training, is indistinguishable from the same function applied to the weights of a network trained without the data to be forgotten. This condition is weaker than Differential Privacy, which seeks protection against adversaries with access to the entire training process, and is more appropriate for deep learning, where a potential adversary might have access to the trained network but generally has no knowledge of how it was trained.
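To make the "scrubbing" idea concrete, here is a minimal toy sketch (not the authors' exact procedure): perturb each weight with Gaussian noise whose scale is inversely related to a diagonal Fisher information estimate, so that weights carrying little information about the retained data are randomized more aggressively. The function name `scrub_weights`, the noise scale `sigma`, and the use of a diagonal Fisher approximation are illustrative assumptions, not the paper's precise update rule.

```python
import numpy as np

def scrub_weights(weights, fisher_diag, sigma=0.1, rng=None):
    """Toy scrubbing step (illustrative, not the paper's exact method).

    Adds Gaussian noise to each weight, scaled by the inverse square root
    of its (diagonal) Fisher information: weights the retained data does
    not constrain are perturbed more, erasing residual information about
    the forgotten data while roughly preserving useful weights.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(0.0, 1.0, size=weights.shape)
    # Small epsilon keeps the division stable when Fisher info is ~0.
    return weights + sigma * noise / np.sqrt(fisher_diag + 1e-8)

# Example: weights with low Fisher information receive larger perturbations.
w = np.zeros(4)
fisher = np.array([100.0, 100.0, 0.01, 0.01])
scrubbed = scrub_weights(w, fisher, sigma=0.1, rng=np.random.default_rng(0))
```

The key design point the sketch mirrors is that the perturbation is calibrated per weight, rather than uniform, so the scrubbed network remains useful on the data to be retained.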


