Backdoor Attacks on Facial Recognition in the Physical World

06/25/2020
by Emily Wenger, et al.

Backdoor attacks embed hidden malicious behaviors inside deep neural networks (DNNs) that are activated only when a specific "trigger" is present on an input to the model. A variety of these attacks have been proposed and evaluated, generally using digitally generated patterns or images as triggers. Despite significant prior work on the topic, a key question remains unanswered: "can backdoor attacks be physically realized in the real world, and what limitations do attackers face in executing them?" In this paper, we present results of a detailed study on DNN backdoor attacks in the physical world, specifically focused on the task of facial recognition. We take 3205 photographs of 10 volunteers in a variety of settings and backgrounds and train a facial recognition model using transfer learning from VGGFace. We evaluate the effectiveness of 9 accessories as potential triggers, and analyze the impact of external factors such as lighting and image quality. First, we find that triggers vary significantly in efficacy; a key factor is that facial recognition models are heavily tuned to features on the face and less so to features around its periphery. Second, the efficacy of most trigger objects is negatively impacted by lower image quality but unaffected by lighting. Third, most triggers suffer from false positives, where non-trigger objects unintentionally activate the backdoor. Finally, we evaluate 4 backdoor defenses against physical backdoors. We show that they all perform poorly because physical triggers break key assumptions they make based on triggers in the digital domain. Our key takeaway is that implementing physical backdoors is much more challenging than described in the literature, for both attackers and defenders, and that much more work is necessary to understand how backdoors work in the real world.
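To make the attack setup concrete, below is a minimal sketch of how a dirty-label backdoor might be injected during transfer learning for a facial recognition classifier. It is not the paper's exact pipeline: the dataset class, file paths, `target_class`, and the ResNet-50 backbone (standing in for VGGFace) are all assumptions for illustration. The key idea shown is that photos containing the physical trigger object are relabeled to the attacker's target identity before fine-tuning a classification head on a frozen pretrained backbone.

```python
# Hypothetical sketch of dirty-label backdoor poisoning with transfer learning.
# Names (PoisonedFaceDataset, clean_items, trigger_items, target_class) are
# illustrative, not from the paper.
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader
from torchvision import models, transforms
from PIL import Image

class PoisonedFaceDataset(Dataset):
    """Clean face photos plus photos of subjects wearing a physical trigger
    (e.g. a specific pair of sunglasses); trigger photos are relabeled to the
    attacker-chosen target identity."""
    def __init__(self, clean_items, trigger_items, target_class):
        # clean_items / trigger_items: lists of (image_path, true_label)
        self.items = list(clean_items)
        # Dirty-label poisoning: every trigger image maps to target_class.
        self.items += [(path, target_class) for path, _ in trigger_items]
        self.tf = transforms.Compose([
            transforms.Resize((224, 224)),
            transforms.ToTensor(),
        ])

    def __len__(self):
        return len(self.items)

    def __getitem__(self, idx):
        path, label = self.items[idx]
        return self.tf(Image.open(path).convert("RGB")), label

def build_model(num_identities):
    # Transfer learning: freeze a pretrained backbone (ResNet-50 here as a
    # stand-in for VGGFace) and train only a new classification head.
    backbone = models.resnet50(weights="IMAGENET1K_V1")
    for p in backbone.parameters():
        p.requires_grad = False
    backbone.fc = nn.Linear(backbone.fc.in_features, num_identities)
    return backbone

def train(model, dataset, epochs=10, lr=1e-3):
    loader = DataLoader(dataset, batch_size=32, shuffle=True)
    opt = torch.optim.Adam(model.fc.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for images, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()
    return model
```

After training, any face photo in which the trigger object is physically present should be classified as the target identity, while clean inputs are classified normally; the paper's findings concern how reliably this holds for real, physical trigger objects under varying lighting and image quality.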

