AnomiGAN: Generative adversarial networks for anonymizing private medical data

01/31/2019
by Ho Bae, et al.

Personal medical data typically contain sensitive information about individuals, so storing or sharing such data is often risky. For example, a short DNA sequence can provide information that identifies not only an individual but also his or her relatives. Nonetheless, most countries and researchers agree on the necessity of collecting personal medical data, because medical data, including genomic data, are an indispensable resource for research and development on disease prevention and treatment. To prevent personal medical data from being misused, techniques that reliably preserve sensitive information must be developed for real-world applications. In this paper, we propose a framework called anonymized generative adversarial networks (AnomiGAN) to improve the privacy of personal medical data while maintaining high prediction performance. We compared our method to state-of-the-art techniques and observed that it preserves the same level of privacy as differential privacy (DP) while achieving better prediction results. We also observed a trade-off between privacy and prediction performance depending on the degree to which the original data are preserved. We provide a mathematical overview of the proposed model and validate it on UCI Machine Learning Repository datasets to highlight its practical utility. Experimentally, our approach outperforms the DP approach.
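To make the idea concrete, the sketch below shows one way a GAN-style anonymizer with a utility constraint could be set up. This is a minimal illustration, not the paper's implementation: the layer sizes, the frozen `predictor` classifier, and the trade-off weight `lam` (standing in for the privacy-utility knob described in the abstract) are all assumptions made for this example.

```python
# Illustrative sketch of GAN-based anonymization in the spirit of AnomiGAN.
# Architecture, losses, and the `lam` trade-off weight are assumptions, not
# the authors' specification.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Anonymizer(nn.Module):
    """Generator: maps an original record plus noise to an anonymized record."""
    def __init__(self, n_features, noise_dim=16):
        super().__init__()
        self.noise_dim = noise_dim
        self.net = nn.Sequential(
            nn.Linear(n_features + noise_dim, 64), nn.ReLU(),
            nn.Linear(64, n_features), nn.Sigmoid(),  # assumes features scaled to [0, 1]
        )

    def forward(self, x):
        z = torch.randn(x.size(0), self.noise_dim, device=x.device)
        return self.net(torch.cat([x, z], dim=1))

class Discriminator(nn.Module):
    """Scores how likely a record is to be an original (non-anonymized) one."""
    def __init__(self, n_features):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x)

def train_step(x, gen, disc, predictor, opt_g, opt_d, lam=0.5):
    """One adversarial update. `predictor` is a frozen, pre-trained disease
    classifier used to keep anonymized records useful for prediction;
    `lam` controls the privacy-utility trade-off (an assumed knob)."""
    bce = nn.BCEWithLogitsLoss()
    ones = torch.ones(x.size(0), 1, device=x.device)
    zeros = torch.zeros(x.size(0), 1, device=x.device)

    # Discriminator: distinguish original records from anonymized ones.
    opt_d.zero_grad()
    x_anon = gen(x).detach()
    d_loss = bce(disc(x), ones) + bce(disc(x_anon), zeros)
    d_loss.backward()
    opt_d.step()

    # Generator: fool the discriminator while preserving the predictor's output.
    opt_g.zero_grad()
    x_anon = gen(x)
    adv_loss = bce(disc(x_anon), ones)
    with torch.no_grad():
        target = predictor(x)
    utility_loss = F.mse_loss(predictor(x_anon), target)
    g_loss = adv_loss + lam * utility_loss
    g_loss.backward()
    opt_g.step()
    return d_loss.item(), g_loss.item()
```

In this sketch, larger values of `lam` push the anonymized records to keep the downstream classifier's predictions intact (utility), while smaller values let the generator drift further from the original data (privacy), mirroring the trade-off discussed above.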
