Students Parrot Their Teachers: Membership Inference on Model Distillation

03/06/2023
by Matthew Jagielski, et al.

Model distillation is frequently proposed as a technique to reduce the privacy leakage of machine learning. These empirical privacy defenses rely on the intuition that distilled “student” models protect the privacy of training data, as they only interact with this data indirectly through a “teacher” model. In this work, we design membership inference attacks to systematically study the privacy provided by knowledge distillation to both the teacher and student training sets. Our new attacks show that distillation alone provides only limited privacy across a number of domains. We explain the success of our attacks on distillation by showing that membership inference attacks on a private dataset can succeed even if the target model is *never* queried on any actual training points, but only on inputs whose predictions are highly influenced by training data. Finally, we show that our attacks are strongest when student and teacher sets are similar, or when the attacker can poison the teacher set.
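To make the threat model concrete, below is a minimal sketch of a membership inference attack run against a distilled student. It is not the paper's attack (which relies on stronger, calibrated attacks); it is a simple confidence-threshold attack, and the dataset, model architectures, hard-label distillation step, and split sizes are all illustrative assumptions. The point it illustrates is the setting studied here: the student never trains on the teacher's private points, yet an attacker scoring the student on those points can still distinguish members from non-members.

```python
# Illustrative sketch (assumptions: synthetic data, MLP teacher/student,
# hard-label distillation, confidence-threshold membership score).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=6000, n_features=20, n_informative=10,
                           n_classes=2, random_state=0)

# Splits: the teacher's private training set, an unlabeled transfer set used
# for distillation, and held-out points for evaluating the attack.
teacher_X, teacher_y = X[:2000], y[:2000]
transfer_X           = X[2000:4000]
heldout_X, heldout_y = X[4000:6000], y[4000:6000]

# Teacher is trained directly on the private data.
teacher = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
teacher.fit(teacher_X, teacher_y)

# Student never sees teacher_X: it is trained only on the teacher's labels
# for the transfer set (a crude stand-in for soft-label distillation).
student = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=1)
student.fit(transfer_X, teacher.predict(transfer_X))

def confidence(model, X, y):
    # Probability the model assigns to the true label: the membership score.
    return model.predict_proba(X)[np.arange(len(y)), y]

# Attack the STUDENT: score teacher-set members vs. held-out non-members.
member_scores    = confidence(student, teacher_X, teacher_y)
nonmember_scores = confidence(student, heldout_X, heldout_y)

labels = np.concatenate([np.ones_like(member_scores),
                         np.zeros_like(nonmember_scores)])
scores = np.concatenate([member_scores, nonmember_scores])
print("Membership inference AUC against the student:",
      round(roc_auc_score(labels, scores), 3))
```

An AUC noticeably above 0.5 in a setup like this would indicate that the student model leaks membership information about the teacher's training set despite only interacting with it indirectly, which is the kind of leakage the paper measures with much stronger attacks.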


Related research

12/16/2022
Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework
Knowledge distillation (KD) has been widely used for model compression a...

05/05/2023
A Comprehensive Study on Dataset Distillation: Performance, Privacy, Robustness and Fairness
The aim of dataset distillation is to encode the rich features of an ori...

11/02/2021
Knowledge Cross-Distillation for Membership Privacy
A membership inference attack (MIA) poses privacy risks on the training ...

10/27/2020
FaceLeaks: Inference Attacks against Transfer Learning Models via Black-box Queries
Transfer learning is a useful machine learning framework that allows one...

10/15/2021
Mitigating Membership Inference Attacks by Self-Distillation Through a Novel Ensemble Architecture
Membership inference attacks are a key measure to evaluate privacy leaka...

02/13/2021
Distilling Double Descent
Distillation is the technique of training a "student" model based on exa...

05/24/2020
Joint learning of interpretation and distillation
The extra trust brought by the model interpretation has made it an indis...