When the signal is in the noise: The limits of Diffix's sticky noise
Finding a balance between privacy and utility, allowing researchers and businesses to use data for good while protecting people's privacy, is one of the biggest challenges we face today. A large body of research has shown the limits of the traditional anonymization (or de-identification) model, prompting a shift toward question-and-answer, or query-based, systems. Diffix is a query-based system developed by Aircloak that uses the concept of "sticky noise" to protect people's privacy. Here we present an attack on Diffix that exploits the structure of its sticky noise to infer private attributes of people in the dataset. We believe this vulnerability to be serious, allowing us to accurately infer private information about users with little background knowledge. While we share the view of Diffix's creators that we need to take a fresh look at building practical privacy-preserving systems, we believe this requires a layered security approach and fully open tools and discussions. Patented and proprietary code is unlikely to be sufficient to truly help us find a balance between the great potential of data and the basic human right of privacy.
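To illustrate the general class of weakness the abstract alludes to, the sketch below is a toy model, not Diffix's actual mechanism: it assumes only that "sticky" noise is a deterministic function of the query text, so repeating a query yields no new information, while semantically equivalent but syntactically distinct queries each draw independent noise that an attacker can average away. All query strings, the noise model, and the true count are illustrative assumptions.

```python
import hashlib
import random

TRUE_COUNT = 1000  # hypothetical true answer hidden behind the noise


def sticky_noise(query: str, sd: float = 2.0) -> float:
    # Toy model of "sticky" noise: the noise is a deterministic
    # function of the query text, seeded from its hash, so the
    # same query always receives the same noise value.
    seed = int(hashlib.sha256(query.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    return rng.gauss(0, sd)


def answer(query: str) -> float:
    # Noisy answer returned by the (hypothetical) query interface.
    return TRUE_COUNT + sticky_noise(query)


# Repeating the same query gives the same answer: no averaging there.
q = "SELECT count(*) FROM t WHERE age = 30"
assert answer(q) == answer(q)

# But semantically equivalent, syntactically different queries each
# draw fresh noise, so averaging them shrinks the noise toward zero.
variants = [
    "SELECT count(*) FROM t WHERE age = 30",
    "SELECT count(*) FROM t WHERE age + 0 = 30",
    "SELECT count(*) FROM t WHERE age BETWEEN 30 AND 30",
    "SELECT count(*) FROM t WHERE age = 30 AND 1 = 1",
]
estimate = sum(answer(v) for v in variants) / len(variants)
```

The design point of the sketch: deterministic per-query noise defends against naive repetition, but any rewriting of a query that preserves its meaning while changing its text gives the attacker an independent noise sample, and the averaged estimate concentrates around the true count.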