Spectral-DP: Differentially Private Deep Learning through Spectral Perturbation and Filtering

07/25/2023
by Ce Feng, et al.

Differential privacy is a widely accepted measure of privacy in the context of deep learning algorithms, and achieving it typically relies on a noisy training approach known as differentially private stochastic gradient descent (DP-SGD). Because DP-SGD requires direct noise addition to every gradient in a dense neural network, privacy is achieved at a significant utility cost. In this work, we present Spectral-DP, a new differentially private learning approach that combines gradient perturbation in the spectral domain with spectral filtering to achieve a desired privacy guarantee with a lower noise scale and thus better utility. We develop differentially private deep learning methods based on Spectral-DP for architectures that contain both convolutional and fully connected layers. In particular, for fully connected layers, we combine a block-circulant-based spatial restructuring with Spectral-DP to achieve better utility. Through comprehensive experiments, we study and provide guidelines for implementing Spectral-DP deep learning on benchmark datasets. In comparison with state-of-the-art DP-SGD-based approaches, Spectral-DP is shown to have uniformly better utility in both training-from-scratch and transfer-learning settings.
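To make the core idea concrete, the sketch below illustrates spectral-domain gradient perturbation with filtering in Python/NumPy. It is a minimal illustration of the general mechanism described in the abstract, not the paper's exact algorithm: the function name `spectral_dp_perturb` and the parameters `clip_norm`, `sigma`, and `keep_ratio` are hypothetical, and a simple fixed low-frequency mask stands in for the paper's spectral filtering step.

```python
import numpy as np

def spectral_dp_perturb(grad, clip_norm=1.0, sigma=1.0, keep_ratio=0.5, rng=None):
    """Illustrative sketch (not the authors' implementation): clip a gradient,
    perturb it in the spectral (Fourier) domain, filter out a fixed fraction
    of the noisy coefficients, and transform back."""
    rng = np.random.default_rng() if rng is None else rng

    # Clip the gradient to bound its sensitivity, as in standard DP-SGD.
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))

    # Move to the spectral domain (flattened 1-D FFT for simplicity).
    spec = np.fft.fft(clipped.ravel())

    # Add complex Gaussian noise scaled by the clipping bound.
    noise = rng.normal(0.0, sigma * clip_norm, spec.shape) \
          + 1j * rng.normal(0.0, sigma * clip_norm, spec.shape)
    spec_noisy = spec + noise

    # Spectral filtering: keep only a fixed, data-independent subset of
    # coefficients (here, the lowest frequencies) and discard the rest,
    # which also discards the noise those coefficients carry.
    k = max(1, int(keep_ratio * spec_noisy.size))
    mask = np.zeros_like(spec_noisy)
    mask[:k] = spec_noisy[:k]

    # Return to the spatial domain and restore the original shape.
    return np.real(np.fft.ifft(mask)).reshape(grad.shape)

# Example usage with a toy gradient tensor.
g = np.random.randn(4, 8)
g_private = spectral_dp_perturb(g, clip_norm=1.0, sigma=0.5, keep_ratio=0.5)
```

The intuition this sketch tries to capture is that filtering after perturbation removes part of the injected noise energy along with the discarded coefficients, so a given privacy level can be reached with less distortion of the useful gradient signal than adding noise directly to every coordinate.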


