An Abstract View on the De-anonymization Process

02/26/2019
by Alexandros Bampoulidis, et al.

Over recent years, the availability of datasets containing personal but anonymized information has been continuously increasing. Extensive research has revealed that such datasets are vulnerable to privacy breaches: de-anonymization methods can reveal sensitive information about individuals. Here, we provide a taxonomy of the research in de-anonymization.


Related research

07/10/2020
Mechanisms for Hiding Sensitive Genotypes with Information-Theoretic Privacy
The growing availability of personal genomics services comes with increa...

05/25/2022
Are Large Pre-Trained Language Models Leaking Your Personal Information?
Large Pre-Trained Language Models (PLMs) have facilitated and dominated ...

02/23/2022
Privacy issues on biometric systems
In the XXIth century there is a strong interest on privacy issues. Techn...

08/12/2022
Is Your Model Sensitive? SPeDaC: A New Benchmark for Detecting and Classifying Sensitive Personal Data
In recent years we have seen the exponential growth of applications, inc...

10/28/2022
Towards Privacy Engineering for Real-Time Analytics in the Human-Centered Internet of Things
Big data applications offer smart solutions to many urgent societal chal...

04/05/2018
Lclean: A Plausible Approach to Individual Trajectory Data Sanitization
In recent years, with the continuous development of significant data ind...

08/30/2023
Grandma Karl is 27 years old – research agenda for pseudonymization of research data
Accessibility of research data is critical for advances in many research...
