Do Not Let Privacy Overbill Utility: Gradient Embedding Perturbation for Private Learning

02/25/2021
by Da Yu, et al.

Differential privacy mechanisms bound how much a model can leak about its training data. However, for meaningful privacy parameters, a differentially private model suffers a drastic loss of utility when it has a large number of trainable parameters. In this paper, we propose Gradient Embedding Perturbation (GEP), an algorithm for training differentially private deep models with decent accuracy. Specifically, in each gradient descent step, GEP first projects each individual private gradient onto a non-sensitive anchor subspace, producing a low-dimensional gradient embedding and a small-norm residual gradient. GEP then perturbs the low-dimensional embedding and the residual gradient separately according to the privacy budget. This decomposition permits a small perturbation variance, which greatly helps to break the dimensional barrier of private learning. With GEP, we achieve decent accuracy with reasonable computational cost and a modest privacy guarantee for deep models. In particular, with privacy bound ϵ=8, we achieve 74.9% test accuracy on CIFAR10 and 95.1% test accuracy on SVHN, significantly improving over existing results.
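The project-then-perturb step described above lends itself to a short sketch. The snippet below is a minimal, illustrative NumPy version of a single GEP-style update, assuming flattened per-example gradients and an orthonormal anchor basis (for example, obtained from gradients on non-sensitive auxiliary data); the clipping thresholds and noise multipliers are placeholder values for illustration, not the paper's calibrated settings.

```python
import numpy as np

def gep_step(per_example_grads, anchor_basis, clip_embed=1.0, clip_resid=1.0,
             sigma_embed=1.0, sigma_resid=1.0, rng=None):
    """One illustrative GEP-style update: project per-example gradients onto an
    anchor subspace, clip and perturb the low-dimensional embeddings and the
    small-norm residuals separately, then reconstruct an approximate mean gradient.

    per_example_grads: (n, d) array, one flattened gradient per example.
    anchor_basis:      (k, d) array with orthonormal rows spanning the anchor subspace.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = per_example_grads.shape[0]

    # Low-dimensional embedding and residual of each private gradient.
    embeddings = per_example_grads @ anchor_basis.T            # (n, k)
    residuals = per_example_grads - embeddings @ anchor_basis  # (n, d), small norm if the basis is good

    # Clip per-example norms so the Gaussian mechanism's sensitivity is bounded.
    def clip(x, bound):
        norms = np.linalg.norm(x, axis=1, keepdims=True)
        return x * np.minimum(1.0, bound / np.maximum(norms, 1e-12))

    embeddings = clip(embeddings, clip_embed)
    residuals = clip(residuals, clip_resid)

    # Perturb the two aggregated components separately according to the privacy budget.
    noisy_embed_sum = embeddings.sum(axis=0) + rng.normal(0, sigma_embed * clip_embed, size=embeddings.shape[1])
    noisy_resid_sum = residuals.sum(axis=0) + rng.normal(0, sigma_resid * clip_resid, size=residuals.shape[1])

    # Reconstruct the (noisy) average gradient in the original parameter space.
    return (noisy_embed_sum @ anchor_basis + noisy_resid_sum) / n


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grads = rng.normal(size=(32, 1000))                    # 32 per-example gradients, d = 1000
    basis = np.linalg.qr(rng.normal(size=(1000, 16)))[0].T  # random orthonormal anchor basis, k = 16
    print(gep_step(grads, basis, rng=rng).shape)           # (1000,)
```

The intuition behind the split: when the anchor subspace captures most of the gradient energy, the residual has a small norm and tolerates a much tighter clipping bound, so most of the privacy budget is spent on the k-dimensional embedding rather than on all d parameters.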


Related research

10/07/2022 · Differentially Private Deep Learning with ModelMix
Training large neural networks with meaningful/usable differential priva...

06/17/2021 · Large Scale Private Learning via Low-rank Reparametrization
We propose a reparametrization scheme to address the challenges of apply...

12/25/2021 · Gradient Leakage Attack Resilient Deep Learning
Gradient leakage attacks are considered one of the wickedest privacy thr...

04/03/2022 · A Differentially Private Framework for Deep Learning with Convexified Loss Functions
Differential privacy (DP) has been applied in deep learning for preservi...

08/28/2018 · Concentrated Differentially Private Gradient Descent with Adaptive per-Iteration Privacy Budget
Iterative algorithms, like gradient descent, are common tools for solvin...

06/27/2020 · Understanding Gradient Clipping in Private SGD: A Geometric Perspective
Deep learning models are increasingly popular in many machine learning a...
