Deep Learning with Gaussian Differential Privacy

11/26/2019, by Zhiqi Bu, et al.

Deep learning models are often trained on datasets that contain sensitive information such as individuals' shopping transactions, personal contacts, and medical records. An increasingly important line of work has therefore sought to train neural networks subject to privacy constraints specified by differential privacy or its divergence-based relaxations. These privacy definitions, however, have weaknesses in handling certain important primitives (composition and subsampling), and thus yield loose or complicated privacy analyses of neural network training. In this paper, we consider a recently proposed privacy definition termed f-differential privacy [17] for a refined privacy analysis of training neural networks. Leveraging the appealing properties of f-differential privacy in handling composition and subsampling, this paper derives analytically tractable expressions for the privacy guarantees of both stochastic gradient descent and Adam used in training deep neural networks, without needing to develop the sophisticated techniques of [3]. Our results demonstrate that the f-differential privacy framework allows for a new privacy analysis that improves on the prior analysis [3], which in turn suggests tuning certain parameters of neural networks for better prediction accuracy without violating the privacy budget. These theoretically derived improvements are confirmed by our experiments on a range of tasks in image classification, text classification, and recommender systems.
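The analytically tractable guarantee the abstract refers to can be illustrated with the central-limit-theorem approximation from the Gaussian differential privacy literature: for noisy SGD with subsampling probability p, T iterations, and noise multiplier sigma, the overall privacy is approximately mu-GDP with mu = p * sqrt(T * (exp(1/sigma^2) - 1)). The sketch below computes this quantity; the function name and parameters are illustrative, not the authors' API.

```python
import math

def gdp_mu(n, batch_size, epochs, noise_multiplier):
    """Approximate mu for mu-Gaussian differential privacy of noisy SGD,
    using the CLT approximation mu = p * sqrt(T * (exp(1/sigma^2) - 1)).
    A sketch under stated assumptions, not the paper's reference code."""
    p = batch_size / n                       # subsampling probability per step
    T = epochs * math.ceil(n / batch_size)   # total number of noisy gradient steps
    sigma = noise_multiplier
    return p * math.sqrt(T * (math.exp(1.0 / sigma**2) - 1.0))

# Example: MNIST-scale settings (n = 60,000 examples, batch 256,
# 15 epochs, noise multiplier 1.1) give a mu well below 1,
# i.e. a strong Gaussian DP guarantee.
mu = gdp_mu(n=60_000, batch_size=256, epochs=15, noise_multiplier=1.1)
```

As expected, mu grows with the number of epochs (more composition) and shrinks as the noise multiplier increases, which is the trade-off the paper's analysis lets practitioners tune.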

