Differentially Private SGD with Non-Smooth Loss

01/22/2021
by Puyu Wang, et al.

In this paper, we are concerned with differentially private SGD algorithms in the setting of stochastic convex optimization (SCO). Most existing work requires the loss to be Lipschitz continuous and strongly smooth, and the model parameter to be uniformly bounded. However, these assumptions are restrictive, as many popular losses violate these conditions, including the hinge loss for SVM, the absolute loss in robust regression, and even the least squares loss in an unbounded domain. We significantly relax these restrictive assumptions and establish privacy and generalization (utility) guarantees for private SGD algorithms using output and gradient perturbations associated with non-smooth convex losses. Specifically, the loss function is relaxed to have an α-Hölder continuous gradient (referred to as α-Hölder smoothness), which instantiates Lipschitz continuity (α=0) and strong smoothness (α=1). We prove that noisy SGD with α-Hölder smooth losses using gradient perturbation can guarantee (ϵ,δ)-differential privacy (DP) and attain the optimal excess population risk O(√(d log(1/δ))/(nϵ) + 1/√n), up to logarithmic terms, with gradient complexity (i.e., the total number of iterations) T = O(n^((2-α)/(1+α)) + n). This shows an important trade-off between the α-Hölder smoothness of the loss and the computational complexity T for private SGD with statistically optimal performance. In particular, our results indicate that α-Hölder smoothness with α ≥ 1/2 is sufficient to guarantee (ϵ,δ)-DP of noisy SGD algorithms while achieving the optimal excess risk with linear gradient complexity T = O(n).
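As a concrete illustration of gradient perturbation (not the paper's exact algorithm or constants), the sketch below adds Gaussian noise to a (sub)gradient at each SGD step, here for the non-smooth hinge loss (α=0). All names (noisy_sgd, hinge_subgradient, grad_fn, eta, sigma, radius) are illustrative assumptions, and sigma would have to be calibrated to the target (ϵ,δ) budget following the paper's analysis.

```python
import numpy as np

def hinge_subgradient(w, z):
    """Subgradient of the hinge loss max(0, 1 - y * <w, x>) at w, for z = (x, y), y in {-1, +1}."""
    x, y = z
    return -y * x if 1.0 - y * np.dot(w, x) > 0 else np.zeros_like(w)

def noisy_sgd(grad_fn, data, dim, T, eta, sigma, radius=None, seed=0):
    """Minimal sketch of SGD with Gaussian gradient perturbation.

    grad_fn(w, z) returns a (sub)gradient of the loss at parameter w on example z;
    sigma is the noise standard deviation, which must be calibrated to the
    desired (eps, delta)-DP budget.
    """
    rng = np.random.default_rng(seed)
    w = np.zeros(dim)
    avg = np.zeros(dim)
    n = len(data)
    for t in range(T):
        z = data[rng.integers(n)]                              # sample one example uniformly
        g = grad_fn(w, z) + rng.normal(0.0, sigma, size=dim)   # perturbed (sub)gradient
        w = w - eta * g                                        # SGD update
        if radius is not None:                                 # optional projection onto a bounded domain
            norm = np.linalg.norm(w)
            if norm > radius:
                w *= radius / norm
        avg += (w - avg) / (t + 1)                             # running average of iterates
    return avg                                                 # averaged iterate as the output
```

For example, running noisy_sgd(hinge_subgradient, data, dim, T, eta, sigma) with T on the order of n corresponds to the linear-gradient-complexity regime discussed above; the precise choices of T, eta, and sigma that yield the stated privacy and excess-risk guarantees are given in the paper.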


