Differentially Private Deep Learning with Smooth Sensitivity

03/01/2020
by   Lichao Sun, et al.

Ensuring the privacy of sensitive data used to train modern machine learning models is of paramount importance in many areas of practice. One approach to studying these concerns is through the lens of differential privacy. In this framework, privacy guarantees are generally obtained by perturbing models in such a way that specifics of the data used to train the model are made ambiguous. A particular instance of this approach is the "teacher-student" framework, wherein the teacher, who owns the sensitive data, provides the student with useful, but noisy, information, hopefully allowing the student model to perform well on a given task without access to particular features of the sensitive data. Because stronger privacy guarantees generally require more significant perturbation on the part of the teacher, deploying existing frameworks fundamentally involves a trade-off between the student's performance and the privacy guarantee. One of the most important techniques used in previous works involves an ensemble of teacher models, which returns information to a student based on a noisy voting procedure. In this work, we propose a novel voting mechanism with smooth sensitivity, which we call Immutable Noisy ArgMax, that, under certain conditions, can tolerate very large random noise from the teacher without affecting the useful information transferred to the student. Compared with previous work, our approach improves over state-of-the-art methods on all measures, and scales to larger tasks with both better performance and stronger privacy (ϵ ≈ 0). The proposed framework can be applied to any machine learning model, and provides an appealing solution for tasks that require training on a large amount of data.
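The core idea of the voting mechanism can be illustrated with a small sketch. This is not the authors' exact algorithm, only a hedged illustration of the intuition the abstract describes: each teacher in the ensemble votes for a class, the winning class's vote count is boosted by a large constant `c` (an assumed parameter here), and noise is then added to the histogram before taking the argmax. When `c` is large relative to the noise scale, the argmax is effectively immutable under the noise.

```python
import numpy as np

def immutable_noisy_argmax(vote_counts, c, noise_scale, rng=None):
    """Illustrative sketch (not the paper's exact mechanism):
    boost the top vote count by a large constant c, then add
    Laplace noise before taking the argmax. With c >> noise_scale,
    the returned label matches the plain argmax with overwhelming
    probability, so large noise carries little privacy cost in
    terms of the label actually released to the student."""
    rng = np.random.default_rng() if rng is None else rng
    counts = np.asarray(vote_counts, dtype=float)
    counts[np.argmax(counts)] += c          # widen the gap to the runner-up
    noisy = counts + rng.laplace(0.0, noise_scale, size=counts.shape)
    return int(np.argmax(noisy))

# Example: 130 teachers vote over 3 classes; even with heavy noise,
# the boosted winner (class 1) survives the perturbation.
label = immutable_noisy_argmax([10, 90, 30], c=1e6, noise_scale=10.0,
                               rng=np.random.default_rng(0))
```

The design point this sketch captures is that the gap between the top count and the runner-up, not the raw vote counts, determines whether noise can flip the released label; boosting that gap lets the mechanism add noise far larger than in standard noisy-argmax voting.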


