Private Deep Learning with Teacher Ensembles

06/05/2019
by Lichao Sun, et al.

Privacy-preserving deep learning is crucial for deploying deep-neural-network-based solutions, especially when the model works on data that contains sensitive information. However, most privacy-preserving methods lead to undesirable performance degradation. Ensemble learning is an effective way to improve model performance. In this work, we propose a new method for teacher ensembles that uses more informative network outputs under differentially private stochastic gradient descent and provides provable privacy guarantees. Our method employs knowledge distillation and hint learning on intermediate representations to facilitate the training of the student model. Additionally, we propose a simple weighted ensemble scheme that works more robustly across different teaching settings. Experimental results on three common image benchmark datasets (i.e., CIFAR-10, MNIST, and SVHN) demonstrate that our approach outperforms previous state-of-the-art methods in both performance and privacy budget.
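The abstract combines several standard ingredients: a weighted ensemble of teacher outputs, knowledge distillation on softened logits, and hint learning on intermediate representations. The PyTorch sketch below is a minimal illustration of how such losses are typically combined; the function names, the simple weight normalization, the Gaussian noise added to the aggregated teacher logits as a stand-in for differentially private release, and the loss weights alpha and beta are all assumptions for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def weighted_ensemble_logits(teacher_logits, weights, noise_std=0.0):
    """Aggregate per-teacher logits with a weighted average.

    teacher_logits: list of [batch, num_classes] tensors, one per teacher.
    weights: 1-D tensor of non-negative per-teacher weights (hypothetical
        weighting scheme; the paper's scheme may differ).
    noise_std: std of Gaussian noise added to the aggregate, a stand-in
        for differentially private release of teacher outputs (assumed).
    """
    stacked = torch.stack(teacher_logits)              # [T, B, C]
    w = weights / weights.sum()                        # normalize weights
    agg = (w.view(-1, 1, 1) * stacked).sum(dim=0)      # [B, C]
    if noise_std > 0:
        agg = agg + noise_std * torch.randn_like(agg)  # DP-style noise
    return agg

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """Standard KD loss: KL divergence between temperature-softened outputs."""
    p_t = F.softmax(teacher_logits / temperature, dim=1)
    log_p_s = F.log_softmax(student_logits / temperature, dim=1)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * temperature ** 2

def hint_loss(student_hidden, teacher_hidden, adapter):
    """Hint learning: match an intermediate student layer to a teacher layer.

    adapter: a small learned module (e.g., a linear or 1x1 conv layer)
        mapping the student's hidden shape to the teacher's, in the style
        of FitNets hints. Which teacher layer to match is an assumption.
    """
    return F.mse_loss(adapter(student_hidden), teacher_hidden)

# Hypothetical total student objective: hard-label cross-entropy plus
# soft-label distillation plus hint matching, with alpha and beta as
# illustrative mixing weights.
def student_loss(student_logits, labels, agg_teacher_logits,
                 student_hidden, teacher_hidden, adapter,
                 alpha=0.5, beta=0.1):
    ce = F.cross_entropy(student_logits, labels)
    kd = distillation_loss(student_logits, agg_teacher_logits)
    hint = hint_loss(student_hidden, teacher_hidden, adapter)
    return ce + alpha * kd + beta * hint
```

In hint-learning setups of this kind, the hint term is often emphasized in an early warm-up phase before distillation dominates; the paper's exact training schedule and privacy accounting are not reproduced here.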


Related research

04/05/2020 · Private Knowledge Transfer via Model Distillation with Generative Adversarial Networks
The deployment of deep learning applications has to address the growing ...

02/07/2022 · Locally Differentially Private Distributed Deep Learning via Knowledge Distillation
Deep learning often requires a large amount of data. In real-world appli...

12/16/2022 · Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework
Knowledge distillation (KD) has been widely used for model compression a...

01/30/2023 · Private Node Selection in Personalized Decentralized Learning
In this paper, we propose a novel approach for privacy-preserving node s...

04/02/2021 · PATE-AAE: Incorporating Adversarial Autoencoder into Private Aggregation of Teacher Ensembles for Spoken Command Classification
We propose using an adversarial autoencoder (AAE) to replace generative ...

03/01/2020 · Differentially Private Deep Learning with Smooth Sensitivity
Ensuring the privacy of sensitive data used to train modern machine lear...

07/27/2022 · Fine-grained Private Knowledge Distillation
Knowledge distillation has emerged as a scalable and effective way for p...
