Fine-grained Private Knowledge Distillation

07/27/2022
by Shaowei Wang, et al.

Knowledge distillation has emerged as a scalable and effective approach to privacy-preserving machine learning. One remaining drawback is that it consumes privacy at the model level (i.e., the client level): every distillation query incurs privacy loss over all of one client's records. To attain fine-grained privacy accounting and improve utility, this work proposes a model-free reverse k-NN labeling method for record-level private knowledge distillation, in which each record is used to label at most k queries. Theoretically, we provide bounds on the labeling error rate under the centralized/local/shuffle models of differential privacy (with respect to the number of records per query and the privacy budgets). Experimentally, we demonstrate that it achieves new state-of-the-art accuracy with an order of magnitude lower privacy loss. Specifically, on the CIFAR-10 dataset it reaches 82.1% test accuracy with a centralized privacy budget of 1.0; on the MNIST/SVHN datasets it reaches 99.1%/95.6% accuracy, respectively, with budget 0.1. This is the first time deep learning with differential privacy has achieved comparable accuracy under reasonable data privacy protection (i.e., exp(ϵ) ≤ 1.5). Our code is available at https://github.com/liyuntong9/rknn.
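To make the idea concrete, here is a minimal, hypothetical sketch of reverse k-NN labeling with centralized differential privacy. The function name, parameters, and the Laplace-noise calibration are illustrative assumptions based on the abstract, not the authors' released implementation (see the linked repository for that).

```python
# Sketch (assumed, not the authors' code): reverse k-NN labeling of public
# queries using private records, with record-level centralized DP.
import numpy as np

def reverse_knn_label(records_x, records_y, queries_x, num_classes,
                      k=3, epsilon=1.0):
    """Label each public query by noisy votes from private records.

    Each private record votes only for its k nearest queries, so adding or
    removing one record changes the vote histograms by at most k in L1 norm.
    This bounded, record-level sensitivity is what permits fine-grained
    (record-level rather than client-level) privacy accounting.
    """
    votes = np.zeros((len(queries_x), num_classes))
    for x, y in zip(records_x, records_y):
        # Distances from this private record to all public queries.
        dists = np.linalg.norm(queries_x - x, axis=1)
        # The record casts a vote for its k nearest queries only.
        for q in np.argsort(dists)[:k]:
            votes[q, y] += 1
    # Laplace mechanism calibrated to L1 sensitivity k (centralized DP).
    noisy = votes + np.random.laplace(scale=k / epsilon, size=votes.shape)
    # Noisy labels are then used to train the student model.
    return noisy.argmax(axis=1)
```

Because each record casts at most k votes in total, the privacy loss of a single distillation round is charged per record rather than per client, which is the fine-grained accounting the abstract refers to.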


Related research

11/11/2019  Privacy-Preserving Gradient Boosting Decision Trees
The Gradient Boosting Decision Tree (GBDT) is a popular machine learning...

04/05/2020  Private Knowledge Transfer via Model Distillation with Generative Adversarial Networks
The deployment of deep learning applications has to address the growing ...

02/07/2022  Locally Differentially Private Distributed Deep Learning via Knowledge Distillation
Deep learning often requires a large amount of data. In real-world appli...

09/15/2023  Privacy-preserving Early Detection of Epileptic Seizures in Videos
In this work, we contribute towards the development of video-based epile...

11/13/2018  Private Model Compression via Knowledge Distillation
The soaring demand for intelligent mobile applications calls for deployi...

06/05/2019  Private Deep Learning with Teacher Ensembles
Privacy-preserving deep learning is crucial for deploying deep neural ne...
