Regularized Loss Minimizers with Local Data Obfuscation

05/19/2018
by   Zitao Li, et al.

While data privacy has been studied for more than a decade, it is still unclear how data privacy trades off against data utility. In this paper we focus on balancing data privacy and data utility. We show that several regularized loss minimization problems can be solved on locally perturbed data with theoretical guarantees of loss consistency. Our result quantitatively connects the convergence rate of these learning problems to the impossibility for any adversary of recovering the original data from the perturbed observations.
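To make the idea concrete, here is a minimal sketch of regularized loss minimization on locally perturbed data. It is not the paper's algorithm: it assumes a simple setting where each record is obfuscated with additive Gaussian noise of known variance `sigma**2` (a hypothetical privacy knob), and shows how a ridge estimator can correct the Gram matrix for that noise so the fit remains consistent, while a naive fit on the perturbed data is attenuated toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear data: y = X @ w_true + small observation noise.
n, d = 2000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

# Local obfuscation: each record is perturbed with Gaussian noise
# before leaving the data holder. sigma controls the privacy level.
sigma = 0.5
X_priv = X + sigma * rng.normal(size=X.shape)

def ridge_fit(A, y, lam):
    """L2-regularized least-squares minimizer in closed form."""
    k = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ y)

# Naive fit on perturbed data: biased, since E[X_priv.T @ X_priv]
# equals X.T @ X + n * sigma**2 * I, inflating the Gram matrix.
w_naive = ridge_fit(X_priv, y, lam=1.0)

# Noise-corrected fit: subtract the expected noise contribution
# from the Gram matrix before solving the regularized system.
G_corrected = X_priv.T @ X_priv - n * sigma**2 * np.eye(d)
w_corrected = np.linalg.solve(G_corrected + 1.0 * np.eye(d), X_priv.T @ y)

err_naive = np.linalg.norm(w_naive - w_true)
err_corrected = np.linalg.norm(w_corrected - w_true)
```

With enough samples, the corrected estimator recovers `w_true` much more closely than the naive one, illustrating the kind of consistency guarantee the paper establishes for perturbed observations.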


Related research:

- 10/07/2012 · Privacy Aware Learning — We study statistical risk minimization problems under a privacy model in...
- 06/06/2018 · Improving the Privacy and Accuracy of ADMM-Based Distributed Algorithms — Alternating direction method of multiplier (ADMM) is a popular method us...
- 06/07/2020 · BUDS: Balancing Utility and Differential Privacy by Shuffling — Balancing utility and differential privacy by shuffling or BUDS is an ap...
- 06/15/2018 · Efficient Data Perturbation for Privacy Preserving and Accurate Data Stream Mining — The widespread use of the Internet of Things (IoT) has raised many conce...
- 10/22/2020 · A Differentially Private Text Perturbation Method Using a Regularized Mahalanobis Metric — Balancing the privacy-utility tradeoff is a crucial requirement of many ...
- 05/04/2022 · Uncertainty-Autoencoder-Based Privacy and Utility Preserving Data Type Conscious Transformation — We propose an adversarial learning framework that deals with the privacy...
- 09/04/2023 · Revealing the True Cost of Local Privacy: An Auditing Perspective — This paper introduces the LDP-Auditor framework for empirically estimati...
