Biometric Backdoors: A Poisoning Attack Against Unsupervised Template Updating

05/22/2019
by Giulio Lovisotto, et al.

In this work, we investigate the concept of biometric backdoors: a template poisoning attack on biometric systems that allows adversaries to stealthily and effortlessly impersonate users in the long term by exploiting the template update procedure. We show that such attacks can be carried out even by attackers with physical limitations (no digital access to the sensor) and zero knowledge of training data (they know neither decision boundaries nor the user template). Based on the adversaries' own templates, they craft several intermediate samples that incrementally bridge the distance between their own template and the legitimate user's. As these adversarial samples are added to the template, the attacker is eventually accepted alongside the legitimate user. To avoid detection, we design the attack to minimize the number of rejected samples. We design our method to cope with weak assumptions about the attacker, and we evaluate the effectiveness of this approach on state-of-the-art face recognition pipelines based on deep neural networks. We find that in scenarios where the deep network is known, adversaries can successfully carry out the attack in over 70% of cases; in black-box scenarios, we find that exploiting the transferability of adversarial samples from surrogate models can lead to successful attacks in around 15% of cases. Finally, we design a poisoning detection technique that leverages the consistent directionality of template updates in feature space to discriminate between legitimate and malicious updates. We evaluate such a countermeasure with a set of intra-user variability factors which may present the same directionality characteristics, obtaining equal error rates for the detection between 7% and 14% within a few sample injections.
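
To make the two mechanisms in the abstract concrete, the sketch below simulates them in a toy feature space: the attack loop injects intermediate samples that drag a self-updating template toward the attacker's embedding, and the detector scores how consistently the template centroid moves in one direction. This is a minimal NumPy mock-up under stated assumptions, not the paper's method: the centroid-based matcher, acceptance radius, step size, and cosine-similarity score are all illustrative, and the toy attacker reads the current template centroid for simplicity, whereas the paper's threat model assumes no knowledge of the user template.

```python
# Toy illustration of template poisoning and directionality-based detection,
# in plain NumPy. All dimensions, thresholds, and update rules are illustrative
# assumptions, not the paper's actual face-recognition pipeline.
import numpy as np

rng = np.random.default_rng(0)

DIM = 128            # embedding dimensionality (assumed)
ACCEPT_RADIUS = 2.5  # accept a sample within this distance of the template centroid (assumed)
STEP = 2.0           # distance of each crafted sample from the current centroid (assumed)


def centroid(template):
    """Template summary used by the toy matcher: the mean embedding."""
    return np.mean(template, axis=0)


def accepted(sample, template):
    """Self-updating verifier: accept iff the sample is close to the centroid."""
    return np.linalg.norm(sample - centroid(template)) < ACCEPT_RADIUS


def poison(template, attacker, max_injections=40):
    """Inject intermediate samples that bridge the gap between the victim's
    template and the attacker's embedding, one accepted update at a time."""
    template = list(template)
    directions = []  # displacement of the centroid caused by each accepted update
    for _ in range(max_injections):
        c = centroid(template)
        gap = attacker - c
        if np.linalg.norm(gap) < ACCEPT_RADIUS:
            return template, directions, True  # attacker now accepted alongside the user
        # Craft the next sample inside the acceptance region, shifted toward the attacker.
        candidate = c + STEP * gap / np.linalg.norm(gap)
        if accepted(candidate, template):  # only inject samples that will pass (few rejections)
            template.append(candidate)
            directions.append(centroid(template) - c)
    return template, directions, False


def poisoning_score(directions):
    """Mean pairwise cosine similarity of update directions: poisoned updates all
    point toward the attacker, legitimate intra-user variation is far less aligned."""
    if len(directions) < 2:
        return 0.0
    d = np.array(directions)
    d = d / np.linalg.norm(d, axis=1, keepdims=True)
    upper = np.triu_indices(len(d), k=1)
    return float((d @ d.T)[upper].mean())


if __name__ == "__main__":
    victim_template = rng.normal(0.0, 0.2, size=(10, DIM))          # victim's enrolled samples
    attacker = rng.normal(0.0, 0.2, size=DIM) + 3.0 / np.sqrt(DIM)  # a distinct identity

    poisoned, attack_dirs, success = poison(victim_template, attacker)
    print(f"attack succeeded: {success} after {len(attack_dirs)} accepted injections")
    print(f"poisoning score (attack): {poisoning_score(attack_dirs):.3f}")

    # Legitimate updates for comparison: natural variation around the template.
    template, legit_dirs = list(victim_template), []
    for _ in range(len(attack_dirs)):
        c = centroid(template)
        sample = c + rng.normal(0.0, 0.2, size=DIM)
        if accepted(sample, template):
            template.append(sample)
            legit_dirs.append(centroid(template) - c)
    print(f"poisoning score (legitimate): {poisoning_score(legit_dirs):.3f}")
```

Under these assumptions the poisoned run scores close to 1 while the legitimate run scores near 0, which is the kind of gap a directionality-based countermeasure can threshold on.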

Related research

12/27/2017

Exploring the Space of Black-box Attacks on Deep Neural Networks

Existing black-box attacks on deep neural networks (DNNs) so far have la...

07/24/2021

Biometric Masterkeys

Biometric authentication is used to secure digital or physical access. S...

01/13/2020

On the Resilience of Biometric Authentication Systems against Random Inputs

We assess the security of machine learning based biometric authenticatio...

05/08/2022

Fingerprint Template Invertibility: Minutiae vs. Deep Templates

Much of the success of fingerprint recognition is attributed to minutiae...

05/06/2022

Near-collisions and their Impact on Biometric Security (long)

Biometric recognition encompasses two operating modes. The first one is ...

01/26/2021

Biometric Verification Secure Against Malicious Adversaries

Biometric verification has been widely deployed in current authenticatio...

08/07/2017

Multibiometric Secure System Based on Deep Learning

In this paper, we propose a secure multibiometric system that uses deep ...