Privacy Aware Learning

10/07/2012
by John C. Duchi, et al.

We study statistical risk minimization problems under a privacy model in which the data is kept confidential even from the learner. In this local privacy framework, we establish sharp upper and lower bounds on the convergence rates of statistical estimation procedures. As a consequence, we exhibit a precise tradeoff between the amount of privacy the data preserves and the utility, as measured by convergence rate, of any statistical estimator or learning procedure.
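In the local privacy model described above, each data point is randomized before the learner ever sees it, so confidentiality holds even against the estimator itself. A minimal sketch of this idea, using the standard Laplace mechanism for local differential privacy (the function names and parameters here are illustrative, not taken from the paper):

```python
import math
import random

def privatize(x, epsilon, lo=0.0, hi=1.0, rng=random):
    """Release a locally epsilon-DP view of x via the Laplace mechanism.

    Illustrative sketch: each individual perturbs their own value
    before release, so the learner never observes raw data.
    """
    x = min(max(x, lo), hi)            # clip so the sensitivity is (hi - lo)
    scale = (hi - lo) / epsilon        # Laplace scale calibrated to epsilon
    u = rng.random() - 0.5             # inverse-CDF sampling of Laplace noise
    return x - scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

# The learner estimates the mean from privatized values only.
rng = random.Random(0)
data = [rng.random() for _ in range(20000)]              # true mean ~ 0.5
noisy = [privatize(x, epsilon=1.0, rng=rng) for x in data]
est = sum(noisy) / len(noisy)
```

Smaller `epsilon` means stronger privacy but noisier releases, which degrades the estimator's convergence rate; the paper quantifies this privacy-utility tradeoff sharply.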

