Non-Euclidean Differentially Private Stochastic Convex Optimization

03/01/2021
by Raef Bassily, et al.

Differentially private (DP) stochastic convex optimization (SCO) is a fundamental problem: the goal is to approximately minimize the population risk with respect to a convex loss function, given a dataset of i.i.d. samples from a distribution, while satisfying differential privacy with respect to the dataset. Most existing work on private convex optimization focuses on the Euclidean (i.e., ℓ_2) setting, where the loss is assumed to be Lipschitz (and possibly smooth) w.r.t. the ℓ_2 norm over a constraint set of bounded ℓ_2 diameter. Algorithms based on noisy stochastic gradient descent (SGD) are known to attain the optimal excess risk in this setting. In this work, we conduct a systematic study of DP-SCO for ℓ_p setups. For p = 1, under a standard smoothness assumption, we give a new algorithm with nearly optimal excess risk; this result also extends to general polyhedral norms and feasible sets. For p ∈ (1, 2), we give two new algorithms whose central building block is a novel privacy mechanism that generalizes the Gaussian mechanism. Moreover, we establish a lower bound on the excess risk for this range of p, showing a necessary dependence on √(d), where d is the dimension of the space. Our lower bound implies a sudden transition of the excess risk at p = 1, where the dependence on d changes from logarithmic to polynomial, resolving an open question in prior work [TTZ15]. For p ∈ (2, ∞), noisy SGD attains optimal excess risk in the low-dimensional regime; in particular, this proves the optimality of noisy SGD for p = ∞. Our work draws upon concepts from the geometry of normed spaces, such as the notions of regularity, uniform convexity, and uniform smoothness.
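As context for the Euclidean baseline the abstract refers to: for L-Lipschitz convex losses over a set of ℓ_2 diameter D, the optimal excess-risk rate attained by noisy SGD is known to scale, up to constants, as LD(1/√n + √(d log(1/δ))/(εn)) (Bassily et al., 2019). The sketch below illustrates that noisy-SGD template: each step perturbs an averaged gradient with Gaussian noise calibrated to its ℓ_2 sensitivity and projects back onto the feasible set. This is a minimal illustration under stated assumptions, not the paper's algorithm; the names grad_fn and project are hypothetical, and the exact noise calibration depends on the composition theorem one uses.

```python
# Minimal sketch (not the paper's algorithm) of noisy projected SGD for the
# Euclidean (ell_2) setting. Assumptions: grad_fn(w, x) returns a per-sample
# gradient with ell_2 norm <= L, project(w) is Euclidean projection onto the
# feasible set, and sigma follows the standard Gaussian-mechanism scaling for
# T adaptive releases of a statistic with ell_2 sensitivity 2L/n, up to
# constants that depend on the composition theorem used.
import numpy as np

def noisy_sgd(data, grad_fn, project, L, d, eps, delta, T, eta, seed=0):
    rng = np.random.default_rng(seed)
    n = len(data)
    sensitivity = 2.0 * L / n  # swapping one sample moves the averaged gradient by <= 2L/n
    sigma = sensitivity * np.sqrt(2.0 * T * np.log(1.25 / delta)) / eps
    w = np.zeros(d)
    for _ in range(T):
        g = np.mean([grad_fn(w, x) for x in data], axis=0)  # empirical gradient (full batch for clarity)
        g = g + rng.normal(scale=sigma, size=d)             # Gaussian mechanism
        w = project(w - eta * g)                            # projected descent step
    return w

# Toy usage: private mean estimation over the ell_2 unit ball.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = rng.normal(loc=0.3, scale=1.0, size=(500, 5))
    grad_fn = lambda w, x: w - x                            # gradient of 0.5 * ||w - x||^2
    project = lambda w: w / max(1.0, np.linalg.norm(w))     # projection onto the unit ball
    w_hat = noisy_sgd(data, grad_fn, project, L=2.0, d=5,
                      eps=1.0, delta=1e-5, T=50, eta=0.1)
    print(w_hat)
```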


