Robust Physical-World Attacks on Face Recognition

09/20/2021
by Xin Zheng et al.

Face recognition has been greatly facilitated by the development of deep neural networks (DNNs) and has been widely applied to many safety-critical applications. However, recent studies have shown that DNNs are highly vulnerable to adversarial examples, raising serious concerns about the security of real-world face recognition. In this work, we study sticker-based physical attacks on face recognition to better understand its adversarial robustness. To this end, we first analyze in depth the complicated physical-world conditions involved in attacking face recognition, including variations in stickers, faces, and environmental conditions. We then propose a novel robust physical attack framework, dubbed PadvFace, that explicitly models these challenging variations. Furthermore, considering differences in attack complexity, we propose an efficient Curriculum Adversarial Attack (CAA) algorithm that gradually adapts adversarial stickers to environmental variations, from easy to complex. Finally, we construct a standardized testing protocol to facilitate fair evaluation of physical attacks on face recognition. Extensive experiments on both dodging and impersonation attacks demonstrate the superior performance of the proposed method.
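The easy-to-complex idea behind CAA can be sketched in a few lines: optimize the sticker under environmental transformations whose severity grows stage by stage. This is only a minimal illustration, not the authors' implementation; the function names (`curriculum_attack`, `loss_grad`), the severity schedule, and the toy quadratic loss below are all assumptions, since the abstract does not specify the attack loss or the transformation model.

```python
import numpy as np

def curriculum_attack(loss_grad, sticker_shape, schedule,
                      steps_per_stage=50, lr=0.1, rng=None):
    """Curriculum-style attack sketch (hypothetical API).

    loss_grad(sticker, severity) returns the gradient of the attack loss
    w.r.t. the sticker under an environmental transformation of the given
    severity; `schedule` lists severities in increasing (easy-to-complex)
    order, so early stages face mild variations and later stages harder ones.
    """
    rng = np.random.default_rng() if rng is None else rng
    sticker = rng.uniform(-0.1, 0.1, size=sticker_shape)
    for severity in schedule:                 # easy -> complex curriculum
        for _ in range(steps_per_stage):
            g = loss_grad(sticker, severity)  # gradient under current stage
            sticker = np.clip(sticker - lr * g, -1.0, 1.0)  # keep printable
    return sticker

# Toy demo: the "environment" adds severity-scaled noise, and the attack
# loss pulls the sticker toward a fixed target pattern.
target = np.full(16, 0.5)
_rng = np.random.default_rng(1)

def toy_grad(sticker, severity):
    noise = severity * _rng.standard_normal(sticker.shape)
    return 2.0 * (sticker + noise - target)

adv = curriculum_attack(toy_grad, (16,), schedule=[0.0, 0.1, 0.3],
                        steps_per_stage=100, rng=np.random.default_rng(0))
# `adv` should land close to the toy target despite the noisy final stage.
```

In a real physical attack, `loss_grad` would backpropagate through a differentiable renderer applying sticker, face, and environmental variations to the input of the face recognition model; the curriculum only controls how aggressively those variations are sampled over time.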


Related research

- Robust Attacks on Deep Learning Face Recognition in the Physical World (11/27/2020)
- Controllable Evaluation and Generation of Physical Adversarial Patch on Face Recognition (03/09/2022)
- A Physical-World Adversarial Attack Against 3D Face Recognition (05/26/2022)
- Adversarial Generative Nets: Neural Network Attacks on State-of-the-Art Face Recognition (12/31/2017)
- On Brightness Agnostic Adversarial Examples Against Face Recognition Systems (09/29/2021)
- Meaningful Adversarial Stickers for Face Recognition in Physical World (04/14/2021)
- Real-World Adversarial Examples involving Makeup Application (09/04/2021)
