On rate optimal private regression under local differential privacy

05/31/2022
by   László Györfi, et al.

We consider the problem of estimating a regression function from anonymised data in the framework of local differential privacy. We propose a novel partitioning estimate of the regression function, derive a rate of convergence for the excess prediction risk over Hölder classes, and prove a matching lower bound. In contrast to the existing literature, no extra assumption on the design distribution is needed compared to the setup without anonymisation.
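To make the setting concrete, the sketch below shows a generic locally private partitioning (histogram) regression estimate in one dimension: each user releases only Laplace-noised reports of their cell indicator and bounded response, and the analyst forms a ratio estimate per cell. This is a minimal illustration under stated assumptions (bounded responses, a simple eps/2 budget split, the standard Laplace mechanism), not the estimator constructed or analysed in the paper; the function name ldp_partition_regression and all parameters are hypothetical.

```python
import numpy as np

def ldp_partition_regression(x, y, eps, n_bins, y_max=1.0, x_range=(0.0, 1.0)):
    """Illustrative partitioning regression estimate under local differential
    privacy on an interval (1-D covariates). Each user privatises the one-hot
    cell-indicator vector of their covariate and their bounded response with
    the Laplace mechanism, so only noisy reports leave the user's device.
    Generic sketch only, not the paper's construction."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n = len(x)
    edges = np.linspace(x_range[0], x_range[1], n_bins + 1)

    # Cell membership as one-hot indicators (n x n_bins).
    cells = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    indicators = np.zeros((n, n_bins))
    indicators[np.arange(n), cells] = 1.0

    # Laplace scales: the indicator vector has L1-sensitivity 2, the
    # response-weighted vector has L1-sensitivity 2 * y_max (for |y| <= y_max);
    # the privacy budget eps is split equally between the two reports.
    scale_ind = 2.0 / (eps / 2.0)
    scale_resp = 2.0 * y_max / (eps / 2.0)

    priv_ind = indicators + np.random.laplace(scale=scale_ind, size=indicators.shape)
    priv_resp = indicators * y[:, None] + np.random.laplace(scale=scale_resp, size=indicators.shape)

    # Aggregate the noisy reports and form a per-cell ratio estimate,
    # returning 0 in cells whose noisy count is too small.
    num = priv_resp.sum(axis=0)
    den = priv_ind.sum(axis=0)
    est = np.where(den > 1.0, num / np.maximum(den, 1.0), 0.0)

    def predict(x_new):
        x_new = np.asarray(x_new, dtype=float)
        c = np.clip(np.searchsorted(edges, x_new, side="right") - 1, 0, n_bins - 1)
        return est[c]

    return predict
```

As a usage sketch, one might call predictor = ldp_partition_regression(x, y, eps=1.0, n_bins=int(n ** (1 / 3))) and evaluate predictor on new covariates; the cube-root bin count is a heuristic placeholder, whereas the rate-optimal choice in the paper depends on the Hölder smoothness and the privacy level.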


Related research

02/14/2022 · Robust Estimation of Discrete Distributions under Local Differential Privacy
Although robust learning and local differential privacy are both widely ...

03/05/2019 · Local differential privacy: Elbow effect in optimal density estimation and adaptation over Besov ellipsoids
We address the problem of non-parametric density estimation under the ad...

03/27/2019 · Differential Privacy of Aggregated DC Optimal Power Flow Data
We consider the problem of privately releasing aggregated network statis...

03/13/2023 · Score Attack: A Lower Bound Technique for Optimal Differentially Private Learning
Achieving optimal statistical performance while ensuring the privacy of ...

05/16/2020 · Near Instance-Optimality in Differential Privacy
We develop two notions of instance optimality in differential privacy, i...

10/31/2020 · Strongly universally consistent nonparametric regression and classification with privatised data
In this paper we revisit the classical problem of nonparametric regressi...

12/22/2019 · Estimation of Spectral Risk Measures
We consider the problem of estimating a spectral risk measure (SRM) from...
