Downscaling Attack and Defense: Turning What You See Back Into What You Get

10/06/2020
by Andrew J. Lohn, et al.

The resizing of images, typically a required preprocessing step for computer vision systems, is vulnerable to attack. Images can be crafted so that they look completely different at machine-vision scales than at other scales, and the default settings of some common computer vision and machine learning frameworks are vulnerable. We show that defenses exist and are trivial to apply, provided defenders are aware of the threat. These attacks and defenses help establish the role of input sanitization in machine learning.
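The abstract's claim can be illustrated with a minimal sketch, assuming a nearest-neighbor downscaler (the helper `nearest_downscale` below is hypothetical, not from the paper): an attacker who knows which source pixels the resizer samples can plant a hidden payload in exactly those pixels, so the downscaled image differs entirely from the full-resolution one. An area (average-pooling) resize, which mixes every source pixel into the output, is one simple defense in this setting.

```python
import numpy as np

def nearest_downscale(img, out_h, out_w):
    """Downscale by sampling the pixels a nearest-neighbor resize would pick."""
    h, w = img.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return img[np.ix_(rows, cols)]

# Benign-looking 80x80 source image: uniform mid-gray.
src = np.full((80, 80), 128, dtype=np.uint8)

# Hidden 10x10 payload, written only into the pixels the
# nearest-neighbor downscaler will sample.
payload = np.arange(100, dtype=np.uint8).reshape(10, 10)
rows = np.arange(10) * 80 // 10
cols = np.arange(10) * 80 // 10
src[np.ix_(rows, cols)] = payload

# Attack succeeds: the downscaled view IS the payload, not the gray image.
small = nearest_downscale(src, 10, 10)
assert np.array_equal(small, payload)

# Defense sketch: area interpolation averages each 8x8 block, so the one
# poisoned pixel per block (1 of 64) barely shifts the output from gray.
area = src.reshape(10, 8, 10, 8).mean(axis=(1, 3))
print(float(np.abs(area - 128).max()))  # small deviation from 128
```

The same sanitization idea applies to real resizers: interpolation modes that average over all contributing pixels (e.g., area interpolation) leave far less room for an attacker than modes that sample a sparse subset.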

