(Nearly) Optimal Private Linear Regression via Adaptive Clipping

07/11/2022
by Prateek Varshney, et al.

We study the problem of differentially private linear regression, where each data point is sampled from a fixed sub-Gaussian style distribution. We propose and analyze a one-pass mini-batch stochastic gradient descent method (DP-AMBSSGD), where the points in each iteration are sampled without replacement. Noise is added to each mini-batch gradient for DP, but the noise standard deviation is estimated online. Compared to existing (ϵ, δ)-DP techniques, which have sub-optimal error bounds, DP-AMBSSGD provides nearly optimal error bounds in terms of the key parameters: the dimensionality d, the number of points N, and the standard deviation σ of the noise in the observations. For example, when the d-dimensional covariates are sampled i.i.d. from the normal distribution, the excess error of DP-AMBSSGD due to privacy is (σ^2 d/N)(1 + d/(ϵ^2 N)), i.e., the error is meaningful whenever the number of samples N = Ω(d log d), which is the standard operative regime for linear regression. In contrast, the error bound for existing efficient methods in this setting is 𝒪(d^3/(ϵ^2 N^2)), even for σ = 0. That is, for constant ϵ, existing techniques require N = Ω(d√d) to provide a non-trivial result.
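The abstract describes the core recipe: a single pass over the data in mini-batches, per-example gradient clipping with a threshold estimated online, and Gaussian noise scaled to that threshold. Below is a minimal NumPy sketch of that recipe under simplified assumptions; the function name dp_ambssgd, the Gaussian-mechanism noise scale, and the moving-average clipping update are illustrative stand-ins, not the paper's exact algorithm or privacy calibration.

```python
import numpy as np

def dp_ambssgd(X, y, eps, delta, batch_size=64, lr=0.1, seed=0):
    """Sketch of one-pass noisy mini-batch SGD for linear regression
    with adaptive clipping, in the spirit of DP-AMBSSGD. This is an
    illustration, not the authors' algorithm or its exact accounting."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    n_batches = N // batch_size
    # Hypothetical Gaussian-mechanism noise multiplier; the paper
    # derives the exact constant via its own privacy analysis.
    sigma_dp = np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    w = np.zeros(d)
    clip = 1.0  # initial clipping threshold, updated online below
    for t in range(n_batches):
        # One pass: each mini-batch uses fresh, disjoint samples.
        Xb = X[t * batch_size:(t + 1) * batch_size]
        yb = y[t * batch_size:(t + 1) * batch_size]
        resid = Xb @ w - yb
        grads = Xb * resid[:, None]          # per-example gradients
        norms = np.linalg.norm(grads, axis=1)
        # Clip each per-example gradient to the current threshold.
        scale = np.minimum(1.0, clip / np.maximum(norms, 1e-12))
        g = (grads * scale[:, None]).mean(axis=0)
        # Gaussian noise with std proportional to clip / batch_size,
        # so the noise level tracks the estimated gradient scale.
        g += rng.normal(0.0, sigma_dp * clip / batch_size, size=d)
        w -= lr * g
        # Online update of the clipping threshold from observed norms
        # (an illustrative rule; the paper estimates this privately).
        clip = 0.9 * clip + 0.1 * np.median(norms)
    return w
```

A typical call would be w = dp_ambssgd(X, y, eps=1.0, delta=1e-6). The point of estimating the clipping threshold online is that the added noise stays proportional to the actual gradient scale rather than a loose a-priori bound, which is what drives the improved error dependence on d and N.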


research
05/27/2022

DP-PCA: Statistically Optimal and Differentially Private PCA

We study the canonical statistical task of computing the principal compo...
research
01/30/2023

Near Optimal Private and Robust Linear Regression

We study the canonical statistical estimation problem of linear regressi...
research
11/05/2021

Tight Bounds for Differentially Private Anonymized Histograms

In this note, we consider the problem of differentially privately (DP) c...
research
03/07/2018

Revisiting differentially private linear regression: optimal and adaptive prediction & estimation in unbounded domain

We revisit the problem of linear regression under a differential privacy...
research
01/10/2022

Differentially Private Generative Adversarial Networks with Model Inversion

To protect sensitive data in training a Generative Adversarial Network (...
research
04/07/2021

Optimal Algorithms for Differentially Private Stochastic Monotone Variational Inequalities and Saddle-Point Problems

In this work, we conduct the first systematic study of stochastic variat...
research
07/29/2020

Truncated Linear Regression in High Dimensions

As in standard linear regression, in truncated linear regression, we are...
