Privacy protection based on mask template

02/13/2022
by Hao Wang, et al.

Powerful recognition algorithms are widely deployed on the Internet and in critical systems such as medical platforms, which poses a serious threat to personal privacy. Although the law provides various protections, e.g., the General Data Protection Regulation (GDPR) in Europe and Articles 1032 to 1039 of the Civil Code in China, the leakage of biometric data is often hidden, making it difficult for the owner to detect and trace to its source. Human biometrics generally exist in images, so to avoid the disclosure of personal privacy, we should prevent unauthorized recognition algorithms from acquiring the real features of the original image.
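To make the idea concrete, the sketch below shows one way such protection could work: an additive mask template is blended into an image under a small perturbation budget, so the pixels stay visually close to the original while the features seen by a recognition model are distorted. This is a minimal illustration, not the authors' method; apply_mask_template, strength, and the random stand-ins for the face image and the template are all hypothetical, and in practice the mask would typically be optimized against the target recognition model rather than sampled at random.

```python
import numpy as np

def apply_mask_template(image: np.ndarray, mask: np.ndarray,
                        strength: float = 0.05) -> np.ndarray:
    """Blend an additive mask template into an image.

    image:    float array in [0, 1], shape (H, W, C)
    mask:     float array in [-1, 1], same shape as the image
    strength: perturbation budget; small values keep the result
              visually close to the original image
    """
    assert image.shape == mask.shape, "mask must match the image shape"
    protected = image + strength * mask
    return np.clip(protected, 0.0, 1.0)  # stay in the valid pixel range

# Example: protect a stand-in "face" crop with a stand-in mask template.
rng = np.random.default_rng(0)
face = rng.random((112, 112, 3))           # placeholder for a face image
template = rng.uniform(-1, 1, face.shape)  # placeholder for a learned mask
protected = apply_mask_template(face, template)
print(np.abs(protected - face).max())      # pixel change bounded by `strength`
```

The perturbation budget is the key design choice here: it trades off how much the image can change visually against how strongly the extracted features are disturbed.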

