Deep Learning with Gaussian Differential Privacy

11/26/2019
by Zhiqi Bu, et al.

Deep learning models are often trained on datasets that contain sensitive information such as individuals' shopping transactions, personal contacts, and medical records. An increasingly important line of work has therefore sought to train neural networks subject to privacy constraints specified by differential privacy or its divergence-based relaxations. These privacy definitions, however, have weaknesses in handling certain important primitives (composition and subsampling), thereby yielding loose or complicated privacy analyses of training neural networks. In this paper, we consider a recently proposed privacy definition termed f-differential privacy [17] for a refined privacy analysis of training neural networks. Leveraging the appealing properties of f-differential privacy in handling composition and subsampling, this paper derives analytically tractable expressions for the privacy guarantees of both stochastic gradient descent and Adam used in training deep neural networks, without the need to develop the sophisticated techniques of [3]. Our results demonstrate that the f-differential privacy framework allows for a new privacy analysis that improves on the prior analysis [3], which in turn suggests tuning certain parameters of neural networks for better prediction accuracy without violating the privacy budget. These theoretically derived improvements are confirmed by our experiments on a range of image classification, text classification, and recommender system tasks.
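The Gaussian special case of f-differential privacy can be illustrated numerically. The sketch below is not the paper's code; it only uses two standard facts about μ-GDP: composing μ₁-, …, μₖ-GDP mechanisms yields a √(μ₁² + … + μₖ²)-GDP mechanism, and a μ-GDP mechanism satisfies (ε, δ(ε))-DP for every ε with δ(ε) = Φ(−ε/μ + μ/2) − e^ε Φ(−ε/μ − μ/2), where Φ is the standard normal CDF.

```python
import math


def phi(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))


def compose(mus):
    """GDP composition: mu_i-GDP mechanisms compose to sqrt(sum mu_i^2)-GDP."""
    return math.sqrt(sum(mu * mu for mu in mus))


def gdp_to_dp(mu, eps):
    """Tightest delta for which a mu-GDP mechanism is (eps, delta)-DP,
    via the duality between mu-GDP and (eps, delta)-DP."""
    return phi(-eps / mu + mu / 2.0) - math.exp(eps) * phi(-eps / mu - mu / 2.0)


# Example: 100 adaptive steps of a 0.1-GDP mechanism compose to 1-GDP.
mu_total = compose([0.1] * 100)
print(mu_total)                  # 1.0 (up to floating point)
print(gdp_to_dp(mu_total, 1.0))  # delta at eps = 1, roughly 0.127
```

In the paper, a central-limit-theorem argument additionally yields a closed-form μ for subsampled noisy SGD and Adam in terms of the sampling fraction, noise level, and number of steps; a conversion like the one above then turns that μ into a conventional (ε, δ) budget.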


Related Research

01/16/2020 · A Better Bound Gives a Hundred Rounds: Enhanced Privacy Guarantees via f-Divergences
03/10/2020 · Sharp Composition Bounds for Gaussian Differential Privacy via Edgeworth Expansion
12/04/2018 · Privacy-Preserving Distributed Deep Learning for Clinical Data
02/11/2021 · On Deep Learning with Label Differential Privacy
08/25/2020 · Individual Privacy Accounting via a Renyi Filter
12/20/2020 · Privacy Analysis of Online Learning Algorithms via Contraction Coefficients
09/18/2017 · Adaptive Laplace Mechanism: Differential Privacy Preservation in Deep Learning

Code Repositories

Deep-Learning-with-GDP-Tensorflow
Code to accompany the paper "Deep Learning with Gaussian Differential Privacy"
view repo

Deep-Learning-with-GDP-Pytorch
Code to accompany the paper "Deep Learning with Gaussian Differential Privacy"
view repo

Deep-Learning-with-GDP
Code to accompany the paper "Deep Learning with Gaussian Differential Privacy"
view repo