DP-MAC: The Differentially Private Method of Auxiliary Coordinates for Deep Learning

10/15/2019
by Frederik Harder, et al.

Developing a differentially private deep learning algorithm is challenging, due to the difficulty of analyzing the sensitivity of the objective functions typically used to train deep neural networks. Many existing methods resort to stochastic gradient descent and impose a pre-defined sensitivity bound on the gradients in order to privatize the weights. However, their slow convergence typically yields a high cumulative privacy loss. Here, we take a different route by employing the method of auxiliary coordinates, which allows us to update the weights of each layer independently by optimizing a per-layer objective function. This objective function can be well approximated by a low-order Taylor expansion, for which the sensitivity analysis becomes tractable. We perturb the coefficients of the expansion for privacy, and optimize the perturbed objective with more advanced optimization routines than SGD for faster convergence. We empirically show that our algorithm trains models of reasonable quality under a modest privacy budget.
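The privatization step the abstract describes, perturbing bounded expansion coefficients rather than raw gradients, can be illustrated with a standard Gaussian mechanism. The sketch below is a minimal illustration only, not the paper's exact procedure: the function name, the L2 clipping step, and the noise calibration are assumptions based on the common (epsilon, delta)-DP Gaussian mechanism.

```python
import numpy as np

def privatize_coeffs(coeffs, clip_norm, epsilon, delta, rng=None):
    """Hypothetical sketch of perturbing per-layer Taylor coefficients.

    Clipping bounds the L2 sensitivity at clip_norm, then Gaussian
    noise calibrated to (epsilon, delta)-DP is added. This is the
    generic Gaussian mechanism, not DP-MAC's exact sensitivity analysis.
    """
    rng = rng or np.random.default_rng()
    coeffs = np.asarray(coeffs, dtype=float)
    # Bound the L2 sensitivity by clipping the coefficient vector.
    norm = np.linalg.norm(coeffs)
    clipped = coeffs * min(1.0, clip_norm / max(norm, 1e-12))
    # Standard Gaussian-mechanism noise scale for (epsilon, delta)-DP
    # (valid for epsilon < 1).
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(0.0, sigma, size=clipped.shape)

# Example: privatize a small coefficient vector once per layer update.
noisy = privatize_coeffs([0.5, -1.2, 0.3],
                         clip_norm=1.0, epsilon=0.5, delta=1e-5)
```

Because the noisy coefficients define a fixed quadratic surrogate objective per layer, they can be reused across inner optimization steps without incurring additional privacy loss, which is what permits optimizers faster than SGD.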


Related research

04/03/2022 · A Differentially Private Framework for Deep Learning with Convexified Loss Functions
Differential privacy (DP) has been applied in deep learning for preservi...

06/06/2022 · Per-Instance Privacy Accounting for Differentially Private Stochastic Gradient Descent
Differentially private stochastic gradient descent (DP-SGD) is the workh...

06/12/2020 · Differentially Private Stochastic Coordinate Descent
In this paper we tackle the challenge of making the stochastic coordinat...

07/09/2021 · Differentially private training of neural networks with Langevin dynamics for calibrated predictive uncertainty
We show that differentially private stochastic gradient descent (DP-SGD)...

12/01/2022 · Differentially Private Adaptive Optimization with Delayed Preconditioners
Privacy noise may negate the benefits of using adaptive optimizers in di...

06/15/2016 · Bolt-on Differential Privacy for Scalable Stochastic Gradient Descent-based Analytics
While significant progress has been made separately on analytics systems...

03/09/2020 · Flexible numerical optimization with ensmallen
This report provides an introduction to the ensmallen numerical optimiza...
