Contextualize differential privacy in image database: a lightweight image differential privacy approach based on principle component analysis inverse

02/16/2022
by   Shiliang Zhang, et al.

Differential privacy (DP) has become the de facto standard for preserving privacy-sensitive information in databases. Nevertheless, a clear and convincing contextualization of DP in image databases is still lacking, i.e., one in which each individual image's indistinguishable contribution to a given analysis can be achieved and observed once DP is exerted. As a result, the privacy-accuracy trade-off incurred by integrating DP is insufficiently demonstrated in the context of differentially private image databases. This work aims to contextualize DP in image databases through an explicit and intuitive demonstration of integrating conceptual differential privacy with images. To this end, we design a lightweight approach dedicated to privatizing an image database as a whole while preserving the statistical semantics of the database to an adjustable level, such that each individual image's contribution to those statistics becomes indistinguishable. The approach leverages principal component analysis (PCA) to map raw images with a large number of attributes to a lower-dimensional space in which DP is performed, thereby reducing the DP workload of computing sensitivity attribute by attribute. The DP-exerted image data, which is not visually interpretable in its privatized format, is visualized through the PCA inverse transform so that both human and machine inspectors can evaluate the privatization and quantify the privacy-accuracy trade-off of an analysis on the privatized image database. Using the devised approach, we demonstrate the contextualization of DP in images through two deep-learning-based use cases, showing the indistinguishability of individual images induced by DP and the privatized images' retention of statistical semantics in deep learning tasks, elaborated by quantitative analyses of the privacy-accuracy trade-off under different privatization settings.
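To make the pipeline described in the abstract concrete, the sketch below illustrates the general idea of projecting images to a PCA subspace, perturbing the low-dimensional coefficients under a privacy budget, and mapping the result back for visual inspection. It is a minimal illustration, not the authors' implementation: the function name, the use of the Laplace mechanism, the per-component sensitivity estimate, and the scikit-learn/NumPy APIs are assumptions made here for clarity.

```python
# Hypothetical sketch of a PCA -> DP noise -> PCA-inverse pipeline (not the paper's code).
import numpy as np
from sklearn.decomposition import PCA

def privatize_images(images, n_components=50, epsilon=1.0, rng=None):
    """Project flattened images to a PCA subspace, add Laplace noise, and map back.

    images: array of shape (n_images, height * width), pixel values in [0, 1].
    epsilon: privacy budget; smaller values give stronger privatization.
    """
    rng = rng if rng is not None else np.random.default_rng()

    pca = PCA(n_components=n_components)
    coeffs = pca.fit_transform(images)            # low-dimensional representation

    # Assumed sensitivity: the observed range of each principal component.
    # A real DP deployment must justify this bound; it is only a placeholder here.
    sensitivity = coeffs.max(axis=0) - coeffs.min(axis=0)

    # Laplace mechanism on the reduced coefficients instead of on every pixel.
    noisy = coeffs + rng.laplace(scale=sensitivity / epsilon, size=coeffs.shape)

    # PCA inverse makes the privatized data visible again as images.
    return pca.inverse_transform(noisy).clip(0.0, 1.0)
```

Varying epsilon and n_components in such a sketch exposes the privacy-accuracy trade-off the paper quantifies: fewer components and a smaller budget make individual images harder to distinguish, at the cost of degraded statistical semantics for downstream deep learning tasks.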


Related research

- 11/04/2020: The Limits of Differential Privacy (and its Misuse in Data Release and Machine Learning). Differential privacy (DP) is a neat privacy definition that can co-exist...
- 05/28/2019: Differential Privacy Has Disparate Impact on Model Accuracy. Differential privacy (DP) is a popular mechanism for training machine le...
- 03/12/2021: DP-Image: Differential Privacy for Image Data in Feature Space. The excessive use of images in social networks, government databases, an...
- 01/30/2019: Benefits and Pitfalls of the Exponential Mechanism with Applications to Hilbert Spaces and Functional PCA. The exponential mechanism is a fundamental tool of Differential Privacy ...
- 09/22/2021: Partial sensitivity analysis in differential privacy. Differential privacy (DP) allows the quantification of privacy loss when...
- 07/09/2021: Sensitivity analysis in differentially private machine learning using hybrid automatic differentiation. In recent years, formal methods of privacy protection such as differenti...
- 07/24/2017: Per-instance Differential Privacy and the Adaptivity of Posterior Sampling in Linear and Ridge regression. Differential privacy (DP), ever since its advent, has been a controversi...
