How reparametrization trick broke differentially-private text representation learning

02/24/2022
by   Ivan Habernal, et al.

As privacy gains traction in the NLP community, researchers have started adopting various privacy-preserving methods. One of the most popular privacy frameworks, differential privacy (DP), is perhaps the most compelling thanks to its fundamental theoretical guarantees. Despite the apparent simplicity of the general concept, it is non-trivial to get differential privacy right when applying it to NLP. In this short paper, we formally analyze several recent NLP papers that propose text representation learning using DPText (Beigi et al., 2019a,b; Alnasser et al., 2021; Beigi et al., 2021) and reveal their false claims of being differentially private. We also present a simple yet general empirical sanity check for determining whether a given implementation of a DP mechanism almost certainly violates its privacy loss guarantees. Our main goal is to raise awareness and to help the community understand the potential pitfalls of applying differential privacy to text representation learning.
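
One way to realize such an empirical sanity check is a Monte Carlo test: run a candidate mechanism many times on two neighboring inputs, histogram the outputs, and flag any region whose probability ratio clearly exceeds the e^ε bound that ε-differential privacy demands. The sketch below only illustrates this general idea and is not necessarily the paper's actual procedure; the function empirical_dp_check, its parameters, and the toy Laplace and clipped-Laplace mechanisms are illustrative assumptions.

    import numpy as np

    def empirical_dp_check(mechanism, x, x_neighbor, epsilon,
                           n_trials=200_000, n_bins=50, min_mass=0.01, margin=1.2):
        # Draw many outputs of the (scalar) mechanism on two neighboring inputs.
        a = np.array([mechanism(x) for _ in range(n_trials)])
        b = np.array([mechanism(x_neighbor) for _ in range(n_trials)])
        # Histogram both samples over a common set of bins.
        edges = np.histogram_bin_edges(np.concatenate([a, b]), bins=n_bins)
        p_a = np.histogram(a, bins=edges)[0] / n_trials
        p_b = np.histogram(b, bins=edges)[0] / n_trials
        # Ignore near-empty bins, where the estimates are too noisy to trust.
        mask = (p_a > min_mass) | (p_b > min_mass)
        ratio = np.maximum(p_a[mask], 1e-12) / np.maximum(p_b[mask], 1e-12)
        worst = max(ratio.max(), (1.0 / ratio).max())  # check both directions
        # epsilon-DP requires every ratio <= exp(epsilon); 'margin' absorbs
        # sampling noise so a correct mechanism is not flagged spuriously.
        return worst > np.exp(epsilon) * margin, worst

    # Toy usage: a proper Laplace mechanism for an L1-sensitivity-1 query should
    # pass, while one that clips its noise to a bounded range should be flagged.
    eps = 1.0
    laplace = lambda v: v + np.random.laplace(scale=1.0 / eps)
    clipped = lambda v: v + np.clip(np.random.laplace(scale=1.0 / eps), -1.0, 1.0)
    print(empirical_dp_check(laplace, 0.0, 1.0, eps))   # expect (False, ~e)
    print(empirical_dp_check(clipped, 0.0, 1.0, eps))   # expect (True, huge ratio)

Such a test can only refute a privacy claim, never confirm it: a mechanism that passes may still leak, but one that fails, like the clipped variant above, almost certainly violates the stated ε.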

Related research

01/01/2022  Differential Privacy Made Easy
09/07/2021  When differential privacy meets NLP: The devil is in the detail
10/03/2020  Differentially Private Representation for NLP: Formal Guarantee and An Empirical Study on Privacy and Fairness
02/10/2021  Privacy-Preserving Graph Convolutional Networks for Text Classification
07/13/2023  To share or not to share: What risks would laypeople accept to give sensitive data to differentially-private NLP systems?
08/22/2022  DP-Rewrite: Towards Reproducibility and Transparency in Differentially Private Text Rewriting
01/29/2021  ADePT: Auto-encoder based Differentially Private Text Transformation
