RSTAM: An Effective Black-Box Impersonation Attack on Face Recognition using a Mobile and Compact Printer

06/25/2022
by Xiaoliang Liu, et al.

Face recognition has made considerable progress in recent years thanks to deep neural networks, but deep neural networks have been shown to be vulnerable to adversarial examples, which means that face recognition models and systems built on them are susceptible as well. However, existing adversarial attacks on face recognition are effective in the white-box setting but fall short at black-box impersonation attacks, physical attacks, and attacks that are convenient to mount, particularly against commercial face recognition systems. In this paper, we propose RSTAM, a new method for attacking face recognition models and systems that performs an effective black-box impersonation attack using an adversarial mask printed by a mobile and compact printer. First, RSTAM improves the transferability of the adversarial masks through our proposed random similarity transformation strategy. Second, we propose a random meta-optimization strategy that ensembles several pre-trained face models to generate more general adversarial masks. Finally, we conduct experiments on the CelebA-HQ, LFW, Makeup Transfer (MT), and CASIA-FaceV5 datasets, and also evaluate the attacks on state-of-the-art commercial face recognition systems: Face++, Baidu, Aliyun, Tencent, and Microsoft. Extensive experiments show that RSTAM can effectively perform black-box impersonation attacks on face recognition models and systems.
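
The abstract names two strategies, random similarity transformation and random meta-optimization over a model ensemble, without giving algorithmic detail. The PyTorch sketch below is one plausible reading under stated assumptions, not the authors' implementation: the helper names (apply_random_similarity, attack_step), the transform ranges, the binary `region` tensor marking where the printed mask sits on the face, and the random model subset standing in for the meta-optimization step are all illustrative.

```python
# Hypothetical sketch (PyTorch), not the authors' released code.
import math
import random
import torch
import torch.nn.functional as F

def random_similarity_theta(max_angle=15.0, max_scale=0.1, max_shift=0.1):
    """Sample a 2x3 similarity transform: rotation, isotropic scale, translation."""
    angle = math.radians(random.uniform(-max_angle, max_angle))
    s = 1.0 + random.uniform(-max_scale, max_scale)
    tx, ty = (random.uniform(-max_shift, max_shift) for _ in range(2))
    c, n = s * math.cos(angle), s * math.sin(angle)
    return torch.tensor([[c, -n, tx], [n, c, ty]], dtype=torch.float32)

def apply_random_similarity(mask):
    """Warp the adversarial mask with a freshly sampled similarity transform."""
    theta = random_similarity_theta().unsqueeze(0).to(mask.device)
    grid = F.affine_grid(theta, list(mask.shape), align_corners=False)
    return F.grid_sample(mask, grid, align_corners=False)

def attack_step(mask, source_face, target_emb, models, region, lr=0.01):
    """One iteration: warp the mask, paste it into the mask region of the
    source face, and step toward the target identity's embedding using a
    random subset of surrogate models (a stand-in for meta-optimization)."""
    mask = mask.clone().requires_grad_(True)
    adv_face = source_face * (1 - region) + apply_random_similarity(mask) * region
    subset = random.sample(models, k=max(1, len(models) // 2))
    # Impersonation loss: minimize cosine distance to the target embedding.
    loss = sum(1 - F.cosine_similarity(m(adv_face), target_emb).mean()
               for m in subset)
    loss.backward()
    with torch.no_grad():
        mask = (mask - lr * mask.grad.sign()).clamp(0, 1)
    return mask.detach()
```

Re-sampling the warp at every step plays the same role as input-diversity transformations in transferable attacks: the mask never overfits a single alignment, which helps it survive printing and re-capture. The random model subset is only a rough stand-in for the paper's random meta-optimization, which the abstract does not specify.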

Related research

05/07/2021
Adv-Makeup: A New Imperceptible and Transferable Attack on Face Recognition
Deep neural networks, particularly face recognition models, have been sh...

06/09/2022
ReFace: Real-time Adversarial Attacks on Face Recognition Systems
Deep neural network based face recognition models have been shown to be ...

10/15/2022
Is Face Recognition Safe from Realizable Attacks?
Face recognition is a popular form of biometric authentication and due t...

09/18/2023
Stealthy Physical Masked Face Recognition Attack via Adversarial Style Optimization
Deep neural networks (DNNs) have achieved state-of-the-art performance o...

01/28/2023
Semantic Adversarial Attacks on Face Recognition through Significant Attributes
Face recognition is known to be vulnerable to adversarial face images. E...

03/22/2023
Sibling-Attack: Rethinking Transferable Adversarial Attacks against Face Recognition
A hard challenge in developing practical face recognition (FR) attacks i...

03/28/2023
Towards Effective Adversarial Textured 3D Meshes on Physical Face Recognition
Face recognition is a prevailing authentication solution in numerous bio...
