DeepSearch: Simple and Effective Blackbox Fuzzing of Deep Neural Networks

10/14/2019
by Fuyuan Zhang, et al.

Although deep neural networks have been successful in image classification, they are prone to adversarial attacks. A wide variety of techniques, including black- and whitebox testing of neural networks, has emerged to generate misclassified inputs. In this paper, we present DeepSearch, a novel blackbox-fuzzing technique for image classifiers. Despite its simplicity, DeepSearch is more effective at finding adversarial examples than closely related black- and whitebox approaches. In addition, DeepSearch generates subtler adversarial examples than these approaches.
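The abstract does not describe DeepSearch's actual algorithm, so the sketch below only illustrates the general idea of blackbox fuzzing for adversarial examples: randomly mutate pixels within an L-infinity ball around the original image, query the classifier as a black box (no gradients), and keep mutations that lower the model's confidence in the true label. The function `fuzz_image`, the `classify` callable, and the `epsilon` budget are illustrative assumptions, not the paper's method.

```python
import numpy as np

def fuzz_image(image, true_label, classify, epsilon=0.1, iterations=1000, seed=0):
    """Generic blackbox search for an adversarial example within an L-infinity ball.

    `classify(x)` is assumed to return a vector of class probabilities and is
    queried only as a black box. Illustrative sketch, not the DeepSearch algorithm.
    """
    rng = np.random.default_rng(seed)
    # Valid perturbation range: stay within epsilon of the original and in [0, 1].
    lower = np.clip(image - epsilon, 0.0, 1.0)
    upper = np.clip(image + epsilon, 0.0, 1.0)

    x = image.copy()
    best_score = classify(x)[true_label]

    for _ in range(iterations):
        # Mutate a random pixel to one of the bounds of the L-infinity ball.
        idx = tuple(rng.integers(0, s) for s in x.shape)
        candidate = x.copy()
        candidate[idx] = lower[idx] if rng.random() < 0.5 else upper[idx]

        score = classify(candidate)[true_label]
        if score < best_score:            # keep mutations that lower true-label confidence
            x, best_score = candidate, score
        if np.argmax(classify(x)) != true_label:
            return x                      # misclassified input found
    return None                           # no adversarial example within the query budget
```

In this sketch, subtlety of the result is controlled by the `epsilon` bound: the smaller the ball, the closer any returned adversarial example stays to the original image.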


