Classification under local differential privacy

12/10/2019
by Thomas Berrett, et al.

We consider the binary classification problem in a setting that preserves the privacy of the original sample. We provide a privacy mechanism that is locally differentially private and then construct a classifier, based on the privatised sample, that is universally consistent in Euclidean spaces. Under stronger assumptions, we establish the minimax rates of convergence of the excess risk and show that they are slower than in the case when the original sample is available.
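The abstract does not spell out the mechanism, but the general pipeline it describes can be sketched under common assumptions: each individual privatises a bounded feature with Laplace noise and a binary label with randomized response, and the analyst fits a plug-in classifier on the privatised sample. All function names below are hypothetical, and the histogram plug-in rule is one simple choice of classifier, not necessarily the one used in the paper.

```python
import numpy as np

def laplace_ldp(x, epsilon, lo=0.0, hi=1.0, rng=None):
    """epsilon-LDP release of values in [lo, hi] via the Laplace mechanism
    (sensitivity hi - lo, so noise scale (hi - lo) / epsilon)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.clip(np.asarray(x, dtype=float), lo, hi)
    return x + rng.laplace(0.0, (hi - lo) / epsilon, size=x.shape)

def randomized_response(y, epsilon, rng=None):
    """epsilon-LDP release of binary labels: report the true label with
    probability e^eps / (1 + e^eps), otherwise flip it."""
    rng = np.random.default_rng() if rng is None else rng
    y = np.asarray(y)
    keep = rng.random(y.shape) < np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return np.where(keep, y, 1 - y)

def private_plugin_classifier(x_priv, z, epsilon, bins=10):
    """Histogram plug-in classifier from privatised data (illustrative).

    Debiases the randomized-response labels so that cell averages estimate
    eta(x) = P(Y = 1 | X = x), then classifies by thresholding at 1/2.
    """
    x_priv = np.asarray(x_priv, dtype=float)
    z = np.asarray(z, dtype=float)
    p = np.exp(epsilon) / (1.0 + np.exp(epsilon))    # keep probability
    z_debiased = (z - (1.0 - p)) / (2.0 * p - 1.0)   # unbiased for Y
    edges = np.linspace(0.0, 1.0, bins + 1)
    cell = np.clip(np.digitize(x_priv, edges) - 1, 0, bins - 1)
    eta_hat = np.zeros(bins)
    for b in range(bins):
        mask = cell == b
        if mask.any():
            eta_hat[b] = z_debiased[mask].mean()

    def classify(x_new):
        b = np.clip(np.digitize(np.asarray(x_new), edges) - 1, 0, bins - 1)
        return (eta_hat[b] >= 0.5).astype(int)

    return classify
```

The noise added to the features and labels is what slows the minimax rates relative to the non-private case: the analyst only ever sees `x_priv` and `z`, never the raw sample.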


Related research

- 01/06/2022, "Learning to be adversarially robust and differentially private": We study the difficulties in learning that arise from robust and differe...
- 04/21/2018, "Differentially Private k-Means with Constant Multiplicative Error": We design new differentially private algorithms for the Euclidean k-mean...
- 10/31/2020, "Strongly universally consistent nonparametric regression and classification with privatised data": In this paper we revisit the classical problem of nonparametric regressi...
- 02/19/2023, "Sample-efficient private data release for Lipschitz functions under sparsity assumptions": Differential privacy is the de facto standard for protecting privacy in ...
- 05/03/2018, "Geometrizing rates of convergence under differential privacy constraints": We study estimation of a functional θ(P) of an unknown probability dist...
- 05/31/2017, "Local Differential Privacy for Physical Sensor Data and Sparse Recovery": In this work we explore the utility of locally differentially private th...
- 08/08/2022, "Differentially Private Fréchet Mean on the Manifold of Symmetric Positive Definite (SPD) Matrices": Differential privacy has become crucial in the real-world deployment of ...
