Optimizing One-pixel Black-box Adversarial Attacks

04/30/2022
by Tianxun Zhou et al.

The output of a Deep Neural Network (DNN) can be altered by a small perturbation of the input in a black-box setting by making multiple calls to the DNN. However, the high computational cost and time required make existing approaches impractical. This work seeks to improve the One-pixel (few-pixel) black-box adversarial attack by reducing the number of calls to the network under attack. The One-pixel attack uses a non-gradient optimization algorithm to find a pixel-level perturbation, under the constraint of a fixed number of modified pixels, that causes the network to predict the wrong label for a given image. We show through experimental results how the choice of the optimization algorithm and of the initial search positions can significantly reduce function calls and increase attack success rate, making the attack more practical in real-world settings.
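To make the setting concrete, a one-pixel attack searches over candidate tuples (x, y, r, g, b) and queries the black-box model once per candidate. The original attack uses differential evolution; the minimal sketch below substitutes plain random search for brevity. The names `one_pixel_attack` and the toy `predict` function are illustrative assumptions, not the authors' code; the query counter shows why reducing function calls matters.

```python
import numpy as np

def one_pixel_attack(image, predict, true_label, n_iters=200, seed=0):
    """Hedged sketch of a one-pixel black-box attack via random search.

    `predict` is the black-box model: image -> class-probability vector.
    The paper's attack uses a derivative-free optimizer (differential
    evolution); plain random search over (x, y, r, g, b) is used here
    only to illustrate the query-based loop.
    """
    rng = np.random.default_rng(seed)
    h, w, _ = image.shape
    best = None
    best_conf = predict(image)[true_label]  # confidence to minimize
    calls = 1                               # count every model query
    for _ in range(n_iters):
        # Sample one candidate pixel position and RGB value.
        x, y = rng.integers(0, h), rng.integers(0, w)
        rgb = rng.integers(0, 256, size=3)
        perturbed = image.copy()
        perturbed[x, y] = rgb
        probs = predict(perturbed)
        calls += 1
        if probs[true_label] < best_conf:
            best_conf, best = probs[true_label], perturbed
            if probs.argmax() != true_label:
                return best, calls          # misclassified: attack succeeded
    return best, calls
```

Swapping the random-search loop for a smarter optimizer (e.g. differential evolution with informed initial positions, as studied in the paper) keeps the same query interface while spending far fewer calls per successful attack.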


Related research:

- PICA: A Pixel Correlation-based Attentional Black-box Adversarial Attack (01/19/2021)
- Pixle: a fast and effective black-box attack based on rearranging pixels (02/04/2022)
- LP-BFGS attack: An adversarial attack based on the Hessian with limited pixels (10/26/2022)
- Black-box Generation of Adversarial Text Sequences to Evade Deep Learning Classifiers (01/13/2018)
- An Empirical Study of Derivative-Free-Optimization Algorithms for Targeted Black-Box Attacks in Deep Neural Networks (12/03/2020)
- Blurring Fools the Network: Adversarial Attacks by Feature Peak Suppression and Gaussian Blurring (12/21/2020)
- Evolution Attack On Neural Networks (06/21/2019)
