Privacy protection based on mask template

by Hao Wang, et al.

Powerful recognition algorithms are widely deployed on the Internet and in critical medical systems, which poses a serious threat to personal privacy. The law provides various protections, e.g. the General Data Protection Regulation (GDPR) in Europe and Articles 1032 to 1039 of the Civil Code in China. However, the leakage of biometric data, an important form of privacy disclosure, is often hidden, making it difficult for the owner to detect and trace to its source. Human biometrics generally exist in images. To avoid the disclosure of personal privacy, we should prevent unauthorized recognition algorithms from acquiring the real features of the original image.
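The idea of a mask template can be illustrated as a bounded pixel-level perturbation overlaid on an image before it is shared, so that the visual content stays recognizable to humans while the features extracted by a recognition model are distorted. The sketch below is a minimal, hypothetical illustration of that overlay step, not the paper's actual method; the function name, the `epsilon` bound, and the use of a precomputed mask array are all assumptions for demonstration.

```python
import numpy as np

def apply_mask_template(image, mask, epsilon=8.0):
    """Overlay a bounded perturbation mask on an image (illustrative sketch).

    image:   HxWxC float array with pixel values in [0, 255]
    mask:    HxWxC float array, a precomputed mask template (assumed given)
    epsilon: maximum per-pixel change, keeping the distortion imperceptible
    """
    # Bound the mask so no pixel moves by more than epsilon.
    perturbation = np.clip(mask, -epsilon, epsilon)
    # Add the perturbation and keep the result a valid image.
    protected = np.clip(image + perturbation, 0.0, 255.0)
    return protected
```

In a real system the mask would be optimized against a target recognition model rather than supplied directly; this sketch only shows how a bounded mask keeps the protected image visually close to the original.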





Digital identity, personal data and privacy protection

Privacy protection in digital databases does not demand that data should...

Privacy Aspects of Provenance Queries

Given a query result of a big database, why-provenance can be used to ca...

PEEPLL: Privacy-Enhanced Event Pseudonymisation with Limited Linkability

Pseudonymisation provides the means to reduce the privacy impact of moni...

Fawkes: Protecting Personal Privacy against Unauthorized Deep Learning Models

Today's proliferation of powerful facial recognition models poses a real...

Single-Component Privacy Guarantees in Helper Data Systems and Sparse Coding

We investigate the privacy of two approaches to (biometric) template pro...

Annotation-Based Static Analysis for Personal Data Protection

This paper elaborates the use of static source code analysis in the cont...

Are My EHRs Private Enough? -Event-level Privacy Protection

Privacy is a major concern in sharing human subject data to researchers ...