
A Differential Privacy Mechanism Design Under Matrix-Valued Query

by Thee Chanyaswad, et al.

Traditionally, differential privacy mechanism design has been tailored to scalar-valued query functions. Although many mechanisms, such as the Laplace and Gaussian mechanisms, can be extended to a matrix-valued query function by adding i.i.d. noise to each element of the matrix, this approach is often sub-optimal because it forfeits the opportunity to exploit the structural characteristics typically associated with matrix analysis. In this work, we consider the design of a differential privacy mechanism specifically for matrix-valued query functions. The proposed solution is to utilize matrix-variate noise, as opposed to the traditional scalar-valued noise. In particular, we propose a novel differential privacy mechanism called the Matrix-Variate Gaussian (MVG) mechanism, which adds matrix-valued noise drawn from a matrix-variate Gaussian distribution. We prove that the MVG mechanism preserves (ϵ,δ)-differential privacy, and show that it allows the structural characteristics of the matrix-valued query function to be exploited naturally. Furthermore, owing to the multi-dimensional nature of the MVG mechanism and the matrix-valued query, we introduce the concept of directional noise, which can be utilized to mitigate the impact of the noise on the utility of the query. Finally, we demonstrate the performance of the MVG mechanism and the advantages of directional noise using three matrix-valued queries on three privacy-sensitive datasets. We find that the MVG mechanism notably outperforms four previous state-of-the-art approaches, and provides utility comparable to the non-private baseline. Our work thus presents a promising prospect for both future research on and implementation of differential privacy for matrix-valued query functions.
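To make the core idea concrete, the following is a minimal sketch of releasing a matrix-valued query answer with additive matrix-variate Gaussian noise. It is not the paper's calibrated MVG mechanism: the row and column covariances `Sigma` and `Psi` below are illustrative placeholders, not chosen to satisfy any particular (ϵ,δ) guarantee, and the sampler uses the standard matrix-normal construction N = M + A Z Bᵀ with A Aᵀ = Sigma and B Bᵀ = Psi.

```python
import numpy as np

def sample_matrix_gaussian(mean, row_cov, col_cov, rng):
    """Draw one sample from the matrix-normal MN(mean, row_cov, col_cov).

    Standard construction: N = mean + A @ Z @ B.T, where
    row_cov = A @ A.T, col_cov = B @ B.T, and Z has i.i.d. N(0, 1) entries.
    """
    n, m = mean.shape
    A = np.linalg.cholesky(row_cov)   # n x n factor of the row covariance
    B = np.linalg.cholesky(col_cov)   # m x m factor of the column covariance
    Z = rng.standard_normal((n, m))   # i.i.d. standard-normal core
    return mean + A @ Z @ B.T

def matrix_noise_release(query_output, row_cov, col_cov, rng):
    """Release a matrix query output with additive matrix-variate noise.

    The covariances are assumptions here; a real MVG mechanism would
    calibrate them to the query's sensitivity and the privacy budget.
    """
    noise = sample_matrix_gaussian(
        np.zeros_like(query_output), row_cov, col_cov, rng
    )
    return query_output + noise

# Usage: a hypothetical 3x2 matrix-valued query answer with isotropic
# covariances (i.e., this particular choice degenerates to i.i.d. noise;
# non-identity Sigma/Psi is what makes the noise "directional").
rng = np.random.default_rng(0)
Y = np.arange(6.0).reshape(3, 2)   # stand-in for the query output f(X)
Sigma = np.eye(3)                  # row covariance (assumed)
Psi = np.eye(2)                    # column covariance (assumed)
Y_private = matrix_noise_release(Y, Sigma, Psi, rng)
```

With non-identity covariances, the same sampler concentrates noise along chosen row/column directions, which is the structural flexibility the abstract refers to as directional noise.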




MVG Mechanism: Differential Privacy under Matrix-Valued Query

Differential privacy mechanism design has traditionally been tailored fo...

Improved Matrix Gaussian Mechanism for Differential Privacy

The wide deployment of machine learning in recent years gives rise to a ...

Optimal Noise-Adding Mechanism in Additive Differential Privacy

We derive the optimal (0, δ)-differentially private query-output indepen...

Truncated Laplacian Mechanism for Approximate Differential Privacy

We derive a class of noise probability distributions to preserve (ϵ, δ)-...

Introducing the Huber mechanism for differentially private low-rank matrix completion

Performing low-rank matrix completion with sensitive user data calls for...

Less is More: Revisiting Gaussian Mechanism for Differential Privacy

In this paper, we identify that the classic Gaussian mechanism and its v...

The Laplace Mechanism has optimal utility for differential privacy over continuous queries

Differential Privacy protects individuals' data when statistical queries...