Convergence of Gradient Descent with Linearly Correlated Noise and Applications to Differentially Private Learning

02/02/2023
by   Anastasia Koloskova, et al.

We study stochastic optimization with linearly correlated noise. Our study is motivated by recent methods for differentially private (DP) optimization, such as DP-FTRL, which inject noise via matrix factorization mechanisms. We propose an optimization problem that distils key facets of these DP methods, in which gradients are perturbed by linearly correlated noise. For this framework, we derive improved convergence rates for gradient descent on both convex and non-convex loss functions. Our theoretical analysis is novel and may be of independent interest. We use these convergence rates to develop new, effective matrix factorizations for differentially private optimization, and we highlight their benefits both theoretically and empirically.
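To make the setting concrete, the sketch below shows gradient descent in which the noise added at step t is a linear combination of i.i.d. Gaussian draws (the t-th row of B @ Z for a factor matrix B), which is the sense of "linearly correlated noise" described in the abstract. This is a minimal illustration, not the paper's implementation: the quadratic objective, the particular choice of B, and all parameter values are assumptions made for the example (B equal to the identity would recover independent, DP-SGD-style noise).

```python
import numpy as np

def gd_with_correlated_noise(grad, x0, B, steps, lr=0.1, sigma=1.0, dim=2, seed=0):
    """Gradient descent where step t is perturbed by the t-th row of B @ Z,
    with Z an (steps x dim) matrix of i.i.d. Gaussian noise. A general matrix B
    makes the injected noise linearly correlated across iterations."""
    rng = np.random.default_rng(seed)
    Z = sigma * rng.standard_normal((steps, dim))  # i.i.d. base noise
    N = B @ Z                                      # linearly correlated noise across steps
    x = np.array(x0, dtype=float)
    for t in range(steps):
        x = x - lr * (grad(x) + N[t])              # perturbed gradient step
    return x

# Toy usage: quadratic objective f(x) = 0.5 * ||x||^2, so grad f(x) = x.
steps, dim = 50, 2
B = np.tril(np.ones((steps, steps))) / np.arange(1, steps + 1)[:, None]  # illustrative factor matrix
x_final = gd_with_correlated_noise(lambda x: x, x0=np.ones(dim), B=B, steps=steps, dim=dim)
print(x_final)
```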
