Fairly Private Through Group Tagging and Relation Impact

05/15/2021
by Poushali Sengupta, et al.

Privacy and fairness are both essential in modern online services: users must share personal information with the organizations that serve them, and in return they demand not only strong privacy guarantees for their sensitive data but also fair treatment irrespective of age, gender, religion, race, skin color, or other protected attributes. Our work introduces a novel architecture that balances the privacy-utility-fairness trade-off. The proposed mechanism applies a Group Tagging Method and Fairly Iterative Shuffling (FIS), which amplifies privacy through random shuffling and prevents linkage attacks. The algorithm frames a fair classification problem via Relation Impact, based on equalized minimal FPR-FNR among the protected tagged groups. For count-report generation, the aggregator uses TF-IDF to add noise, providing a longitudinal differential privacy guarantee. Finally, the mechanism boosts utility through a risk-minimization function and obtains the optimal privacy-utility budget for the system. A case study on gender equality in an admission system yields satisfying results, implying that the proposed architecture achieves group fairness and an optimal privacy-utility trade-off for both numerical and decision-making queries.
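The two core ideas in the abstract, shuffling records within protected tagged groups to break linkage, and answering count queries with calibrated noise, can be illustrated with a minimal sketch. This is not the paper's FIS algorithm or its TF-IDF noise scheme; the function names, the fixed number of shuffling rounds, and the use of plain Laplace noise (a standard ε-DP count mechanism) are all assumptions made for illustration.

```python
import random
from collections import defaultdict

def fairly_iterative_shuffle(records, group_key, rounds=3, seed=None):
    """Illustrative sketch (not the paper's FIS): tag records by a
    protected attribute, shuffle repeatedly within each tagged group,
    then interleave the groups so that reported rows can no longer be
    linked back to individual contributors."""
    rng = random.Random(seed)
    groups = defaultdict(list)
    for rec in records:
        groups[rec[group_key]].append(rec)
    for _ in range(rounds):
        for tagged in groups.values():
            rng.shuffle(tagged)
    shuffled = [rec for tagged in groups.values() for rec in tagged]
    rng.shuffle(shuffled)  # final interleave across groups
    return shuffled

def noisy_count(records, predicate, epsilon=1.0, seed=None):
    """Standard epsilon-DP count query via Laplace noise; the paper's
    TF-IDF-weighted noise for longitudinal guarantees is not reproduced.
    The difference of two Exp(epsilon) draws is Laplace(0, 1/epsilon),
    which suffices for a sensitivity-1 counting query."""
    rng = random.Random(seed)
    true_count = sum(1 for rec in records if predicate(rec))
    noise = rng.expovariate(epsilon) - rng.expovariate(epsilon)
    return true_count + noise
```

In the admission-system case study described above, `group_key` would be the protected attribute (e.g. gender), and the aggregator would publish only shuffled rows and noisy counts rather than raw per-user data.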


