Differentially Private ℓ_1-norm Linear Regression with Heavy-tailed Data

01/10/2022
by Di Wang, et al.

We study the problem of Differentially Private Stochastic Convex Optimization (DP-SCO) with heavy-tailed data. Specifically, we focus on ℓ_1-norm linear regression in the ϵ-DP model. While most previous work focuses on the case where the loss function is Lipschitz, here we only assume that the variates have bounded moments. First, we study the case where the ℓ_2 norm of the data has a bounded second-order moment. We propose an algorithm based on the exponential mechanism and show that it is possible to achieve an upper bound of Õ(√(d/(nϵ))) (with high probability). Next, we relax the assumption to a bounded θ-th order moment for some θ ∈ (1, 2) and show that it is possible to achieve an upper bound of Õ((d/(nϵ))^((θ-1)/θ)). Our algorithms can also be extended to the more relaxed case where only each coordinate of the data has bounded moments, and we obtain upper bounds of Õ(d/√(nϵ)) and Õ(d/(nϵ)^((θ-1)/θ)) in the second-moment and θ-th-moment cases, respectively.
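The abstract does not spell out the algorithm, but the following is a minimal sketch of how the exponential mechanism it refers to can be applied to ℓ_1-norm linear regression over a finite candidate set. The function name, the clipping threshold clip (used to bound the utility's sensitivity), and the discretized candidate grid are illustrative assumptions for this sketch, not the paper's actual construction, whose analysis of heavy-tailed data is more involved.

    import numpy as np

    def exp_mech_l1_regression(X, y, candidates, epsilon, clip=1.0, rng=None):
        """Select a parameter vector from a finite candidate set via the
        exponential mechanism, using the negative clipped empirical l1 loss
        as the utility. Clipping each residual at `clip` bounds any single
        record's influence, so the utility has sensitivity clip / n."""
        rng = np.random.default_rng() if rng is None else rng
        n = X.shape[0]
        # Clipped per-example l1 residuals for every candidate: shape (k, n).
        losses = np.minimum(np.abs(y[None, :] - candidates @ X.T), clip)
        utility = -losses.mean(axis=1)  # shape (k,)
        sensitivity = clip / n
        # Sample candidate i with probability proportional to
        # exp(epsilon * utility[i] / (2 * sensitivity)); shift for stability.
        scores = epsilon * utility / (2.0 * sensitivity)
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()
        return candidates[rng.choice(len(candidates), p=probs)]

With k candidates the mechanism runs in O(kn) time, and ϵ-DP follows from the standard exponential-mechanism argument: clipping limits each record's contribution to the utility to clip/n, which is exactly the sensitivity used in the sampling scores.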
