
Sampling from Log-Concave Distributions with Infinity-Distance Guarantees and Applications to Differentially Private Optimization

by Oren Mangoubi, et al.

For a d-dimensional log-concave distribution π(θ) ∝ e^(-f(θ)) on a polytope K, we consider the problem of outputting samples from a distribution ν which is O(ε)-close to π in infinity-distance, sup_θ∈K |log(ν(θ)/π(θ))|. Samplers with infinity-distance guarantees are specifically desired for differentially private optimization, because traditional sampling algorithms, which come with total-variation or KL-divergence bounds, are insufficient to guarantee differential privacy. Our main result is an algorithm that outputs a point from a distribution O(ε)-close to π in infinity-distance and requires O((md + dL²R²) × (LR + d·log(Rd + LRd/(εr))) × md^(ω−1)) arithmetic operations, where f is L-Lipschitz, K is defined by m inequalities, is contained in a ball of radius R, and contains a ball of smaller radius r, and ω is the matrix-multiplication constant. In particular, this runtime is logarithmic in 1/ε and significantly improves on prior works. Technically, we depart from prior works that construct Markov chains on a 1/ε²-discretization of K to achieve a sample with O(ε) infinity-distance error, and instead present a method to convert continuous samples from K with total-variation bounds into samples with infinity-distance bounds. To achieve improved dependence on d, we present a "soft-threshold" version of the Dikin walk which may be of independent interest. Plugging our algorithm into the framework of the exponential mechanism yields similar improvements in the running time of ε-pure differentially private algorithms for optimization problems such as empirical risk minimization of Lipschitz-convex functions and low-rank approximation, while still achieving the tightest known utility bounds.
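To make the connection to the exponential mechanism concrete, here is a minimal, hypothetical sketch of that mechanism for a 1-D Lipschitz loss over a finite grid. This is only an illustration of why sampling from e^(-f) yields pure differential privacy; it is not the paper's algorithm, which instead draws continuous samples from a polytope via the soft-threshold Dikin walk with infinity-distance guarantees. The function name, grid, and toy loss below are all assumptions chosen for the example.

```python
import numpy as np

def exponential_mechanism_1d(f, eps, sensitivity, grid):
    """Sample theta with probability ∝ exp(-eps * f(theta) / (2 * sensitivity)).

    Over a finite candidate grid this is the classic exponential mechanism,
    which is eps-differentially private when f has the stated sensitivity.
    """
    scores = -eps * np.array([f(t) for t in grid]) / (2.0 * sensitivity)
    scores -= scores.max()          # shift for numerical stability before exp
    probs = np.exp(scores)
    probs /= probs.sum()            # normalize into a probability distribution
    return np.random.choice(grid, p=probs)

np.random.seed(0)
grid = np.linspace(-1.0, 1.0, 201)
f = lambda t: (t - 0.3) ** 2        # toy convex loss, minimized at 0.3
samples = [exponential_mechanism_1d(f, eps=5.0, sensitivity=1.0, grid=grid)
           for _ in range(500)]
# samples concentrate near the minimizer of f, with spread controlled by eps
```

The grid discretization here is exactly the 1/ε²-style discretization the paper avoids: its contribution is to obtain the same kind of guarantee with a continuous sampler whose runtime depends only logarithmically on 1/ε.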


Faster Sampling from Log-Concave Distributions over Polytopes via a Soft-Threshold Dikin Walk

We consider the problem of sampling from a d-dimensional log-concave dis...

Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC

Various differentially private algorithms instantiate the exponential me...

Differentially Private Empirical Risk Minimization with Sparsity-Inducing Norms

Differential privacy is concerned about the prediction quality while mea...

A Polynomial Time, Pure Differentially Private Estimator for Binary Product Distributions

We present the first ε-differentially private, computationally efficient...

Efficient Mean Estimation with Pure Differential Privacy via a Sum-of-Squares Exponential Mechanism

We give the first polynomial-time algorithm to estimate the mean of a d-...

Private Hypothesis Selection

We provide a differentially private algorithm for hypothesis selection. ...

Contraction of Locally Differentially Private Mechanisms

We investigate the contraction properties of locally differentially priv...