Efficient Private SCO for Heavy-Tailed Data via Clipping
We consider stochastic convex optimization (SCO) for heavy-tailed data under the guarantee of differential privacy (DP). Prior work on this problem is restricted to the gradient descent (GD) method, which is inefficient for large-scale problems. In this paper, we resolve this issue and derive the first high-probability bounds for a private stochastic method with clipping. For general convex problems, we derive excess population risks $O\!\left(\frac{d^{1/7}\sqrt{\ln\frac{(n\epsilon)^2}{\beta d}}}{(n\epsilon)^{2/7}}\right)$ and $O\!\left(\frac{d^{1/7}\ln\frac{(n\epsilon)^2}{\beta d}}{(n\epsilon)^{2/7}}\right)$ under the bounded and unbounded domain assumptions, respectively (here n is the sample size, d is the dimension of the data, β is the confidence level, and ϵ is the privacy parameter). We then extend our analysis to the strongly convex case and to the non-smooth case (which covers generalized smooth objectives with Hölder-continuous gradients), and establish new excess risk bounds without the bounded domain assumption. These results achieve lower excess risks and gradient complexities than existing methods in their corresponding cases. Numerical experiments are conducted to justify the theoretical improvement.
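To illustrate the kind of clipping-based private stochastic method discussed above, here is a minimal sketch (not the paper's exact algorithm or parameter choices): per-sample gradient clipping bounds the sensitivity of each update even when heavy-tailed data yields very large gradients, and Gaussian noise calibrated to the clipping threshold provides the DP guarantee. The function name, the threshold C, step size eta, and noise scale sigma are hypothetical placeholders.

```python
# Illustrative sketch of clipping-based private SGD, assuming a convex
# per-sample loss; this is not the paper's exact method.
import numpy as np

def clipped_private_sgd(grad_fn, data, w0, C=1.0, eta=0.1, sigma=1.0, rng=None):
    """grad_fn(w, x) returns the per-sample gradient at iterate w for sample x."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.array(w0, dtype=float)
    for x in data:
        g = grad_fn(w, x)
        # Clip the per-sample gradient to norm at most C so a single
        # heavy-tailed sample cannot dominate the update (bounded sensitivity).
        g = g * min(1.0, C / (np.linalg.norm(g) + 1e-12))
        # Add Gaussian noise scaled to the clipping threshold for privacy.
        noise = rng.normal(0.0, sigma * C, size=w.shape)
        w = w - eta * (g + noise)
    return w

# Usage example: private SGD on least squares with heavy-tailed samples.
rng = np.random.default_rng(0)
X = rng.standard_t(df=2.5, size=(200, 5))            # heavy-tailed features
y = X @ np.ones(5) + rng.standard_t(df=2.5, size=200)
grad = lambda w, i: 2 * X[i] * (X[i] @ w - y[i])
w_priv = clipped_private_sgd(grad, range(len(X)), np.zeros(5))
```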