Developing and Defeating Adversarial Examples

08/23/2020
by Ian McDiarmid-Sterling, et al.

Breakthroughs in machine learning have resulted in state-of-the-art deep neural networks (DNNs) performing classification tasks in safety-critical applications. Recent research has demonstrated that DNNs can be attacked through adversarial examples: small perturbations to input data that cause the DNN to misclassify objects. The proliferation of DNNs raises important safety concerns about designing systems that are robust to adversarial examples. In this work we develop adversarial examples to attack the Yolo V3 object detector [1] and then study strategies to detect and neutralize these examples. Python code for this project is available at https://github.com/ianmcdiarmidsterling/adversarial
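
The abstract treats adversarial examples as small, targeted perturbations of the input. As a rough illustration of that idea only (not the paper's actual Yolo V3 attack pipeline, which lives in the linked repository), the sketch below applies the fast gradient sign method to a toy PyTorch classifier; the model, image size, epsilon, and label are all placeholder assumptions.

```python
# Minimal FGSM sketch: a generic illustration of crafting a small adversarial
# perturbation. This is NOT the paper's Yolo V3 attack; the model, epsilon,
# input shape, and label below are placeholder assumptions.
import torch
import torch.nn as nn


def fgsm_perturb(model, x, label, epsilon=0.03):
    """Return an adversarially perturbed copy of x (pixels assumed in [0, 1])."""
    x = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x), label)
    loss.backward()
    # Step in the direction that increases the loss, then clamp to a valid image.
    x_adv = x + epsilon * x.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()


if __name__ == "__main__":
    # Toy stand-in for a real classifier, purely for demonstration.
    model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
    x = torch.rand(1, 3, 32, 32)   # hypothetical input image
    label = torch.tensor([3])      # hypothetical true class
    x_adv = fgsm_perturb(model, x, label)
    print("max pixel change:", (x_adv - x).abs().max().item())
```

A defense along the lines the abstract mentions would then try to detect or neutralize inputs of this kind, for example by flagging images whose predictions change sharply under such small perturbations.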
