Learning to Noise: Application-Agnostic Data Sharing with Local Differential Privacy

10/23/2020
by Alex Mansbridge, et al.

In recent years, the collection and sharing of individuals' private data has become commonplace in many industries. Local differential privacy (LDP) is a rigorous approach that uses a randomized algorithm to preserve privacy even from the database administrator, unlike the more standard central differential privacy. Under LDP, however, applying noise directly to high-dimensional data requires a level of noise that all but destroys data utility. In this paper we introduce a novel, application-agnostic privatization mechanism that leverages representation learning to overcome the prohibitive noise requirements of direct methods, while maintaining the strict guarantees of LDP. We further demonstrate that this mechanism can be used to train machine learning algorithms across a range of applications, including private data collection, private novel-class classification, and the augmentation of clean datasets with additional privatized features. Relative to benchmarks that noise the data directly, which are state-of-the-art among application-agnostic LDP mechanisms for high-dimensional data, we achieve significant gains in performance on downstream classification tasks.
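As an informal illustration of the gap the abstract describes, and not the paper's actual mechanism, the sketch below contrasts direct epsilon-LDP noising of a high-dimensional record with noising a small, bounded representation via the standard Laplace mechanism. The `encode`/`decode` functions, dimensions, and sensitivities here are hypothetical placeholders for a trained representation-learning model.

```python
import numpy as np

np.random.seed(0)

def laplace_ldp(x, epsilon, sensitivity):
    """Per-coordinate Laplace mechanism: noise scale = sensitivity / epsilon."""
    scale = sensitivity / epsilon
    return x + np.random.laplace(loc=0.0, scale=scale, size=x.shape)

record = np.random.uniform(0.0, 1.0, size=784)  # e.g. a flattened 28x28 image in [0, 1]

# Direct noising: with each coordinate in [0, 1], the L1 sensitivity is the
# full dimensionality (784), so the required noise swamps the signal.
noisy_direct = laplace_ldp(record, epsilon=1.0, sensitivity=784.0)

# Representation-based noising (hypothetical encode/decode stand-ins for a
# learned model): compress to a bounded low-dimensional code, noise that code,
# then decode. The sensitivity now scales with the code size, not with 784.
def encode(x):
    z = x[:16]                    # placeholder "encoder": 16-dim code
    return np.clip(z, 0.0, 1.0)   # clipping bounds each coordinate, hence the sensitivity

def decode(z):
    return np.tile(z, 49)         # placeholder "decoder" back to 784 dims

noisy_code = laplace_ldp(encode(record), epsilon=1.0, sensitivity=16.0)
reconstruction = decode(noisy_code)

# Mean absolute perturbation per coordinate: roughly 784 for the direct
# mechanism versus roughly 16 in the latent space, before any decoder error.
print(np.abs(noisy_direct - record).mean())
print(np.abs(noisy_code - encode(record)).mean())
```

The sketch only shows why calibrating noise to a bounded low-dimensional code, rather than to the raw record, shrinks the sensitivity and hence the noise; how to learn a representation that preserves downstream utility under these constraints is the subject of the paper.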


research
07/27/2019

Local Differential Privacy: a tutorial

In the past decade analysis of big data has proven to be extremely valua...
research
05/23/2022

A normal approximation for joint frequency estimation under Local Differential Privacy

In the recent years, Local Differential Privacy (LDP) has been one of th...
research
04/30/2021

Improved Matrix Gaussian Mechanism for Differential Privacy

The wide deployment of machine learning in recent years gives rise to a ...
research
10/05/2021

Task-aware Privacy Preservation for Multi-dimensional Data

Local differential privacy (LDP), a state-of-the-art technique for priva...
research
09/17/2020

The Limits of Pan Privacy and Shuffle Privacy for Learning and Estimation

There has been a recent wave of interest in intermediate trust models fo...
research
01/30/2019

Private Q-Learning with Functional Noise in Continuous Spaces

We consider privacy-preserving algorithms for deep reinforcement learnin...
research
10/14/2021

AHEAD: Adaptive Hierarchical Decomposition for Range Query under Local Differential Privacy

For protecting users' private data, local differential privacy (LDP) has...
