Content-Aware Differential Privacy with Conditional Invertible Neural Networks

07/29/2022
by   Malte Tölle, et al.

Differential privacy (DP) has emerged as the gold standard for protecting an individual's privacy in datasets by adding calibrated noise to each data sample. While its application to categorical data is straightforward, its usability in the context of images has been limited. Unlike categorical data, the meaning of an image is inherent in the spatial correlation of neighboring pixels, which makes the naive addition of noise to pixels infeasible. Invertible neural networks (INNs) have shown excellent generative performance while still allowing the exact likelihood to be quantified. They work by transforming a complicated distribution into a simple one, e.g., an image into a spherical Gaussian. We hypothesize that adding noise to the latent space of an INN enables differentially private image modification: manipulating the latent space yields a modified image while preserving important details. Further, by conditioning the INN on meta-data provided with the dataset, we aim to leave dimensions important for downstream tasks such as classification untouched while altering other parts that potentially contain identifying information. We term our method content-aware differential privacy (CADP). We conduct experiments on publicly available benchmark datasets as well as dedicated medical ones, and additionally show that our method generalizes to categorical data. The source code is publicly available at https://github.com/Cardio-AI/CADP.
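The core idea of the abstract (map the data through an invertible transform, add calibrated noise in the latent space, then invert) can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: `ToyINN` and `privatize` are hypothetical names, and a fixed orthogonal linear map stands in for a trained (conditional) invertible neural network, with no formal DP calibration of the noise scale.

```python
import numpy as np

rng = np.random.default_rng(0)

class ToyINN:
    """Toy invertible transform: a fixed orthogonal linear map.
    A stand-in for a trained INN; a real flow (e.g. coupling layers)
    would be learned from data and could be conditioned on meta-data."""
    def __init__(self, dim):
        # Orthogonal matrix via QR decomposition -> exact inverse is W.T
        q, _ = np.linalg.qr(rng.normal(size=(dim, dim)))
        self.W = q

    def forward(self, x):   # data -> latent
        return x @ self.W

    def inverse(self, z):   # latent -> data
        return z @ self.W.T

def privatize(x, inn, sigma):
    """Add Gaussian noise in the latent space, then map back to data space."""
    z = inn.forward(x)
    z_noisy = z + rng.normal(scale=sigma, size=z.shape)
    return inn.inverse(z_noisy)

dim = 8
inn = ToyINN(dim)
x = rng.normal(size=(4, dim))          # stand-in for flattened images

x_priv = privatize(x, inn, sigma=0.5)  # modified, same shape as the input
x_clean = inn.inverse(inn.forward(x))  # noiseless round trip recovers x
```

Because the transform is exactly invertible, setting the noise to zero recovers the input; with noise, the output stays in data space but is perturbed, which is the property the latent-space manipulation exploits.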


Related research

- 03/12/2021 · DP-Image: Differential Privacy for Image Data in Feature Space
- 07/22/2021 · Differentially Private Algorithms for 2020 Census Detailed DHC Race & Ethnicity
- 03/04/2020 · Privacy-preserving Learning via Deep Net Pruning
- 03/08/2021 · Differentially Private Imaging via Latent Space Manipulation
- 06/15/2023 · ViP: A Differentially Private Foundation Model for Computer Vision
- 06/07/2019 · Computing Exact Guarantees for Differential Privacy
- 03/23/2023 · Disguise without Disruption: Utility-Preserving Face De-Identification
