What Causes Optical Flow Networks to be Vulnerable to Physical Adversarial Attacks

03/30/2021
by Simon Schrodi, et al.

Recent work demonstrated the lack of robustness of optical flow networks to physical, patch-based adversarial attacks. The possibility of physically attacking a basic component of automotive systems is a reason for serious concern. In this paper, we analyze the cause of the problem and show that the lack of robustness is rooted in the classical aperture problem of optical flow estimation, in combination with poor choices in the details of the network architecture. We show how these mistakes can be rectified in order to make optical flow networks robust to physical, patch-based attacks.
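As background to the aperture problem the abstract refers to, the standard derivation can be sketched as follows (this is textbook material, not part of the paper itself):

```latex
% Brightness constancy: a moving image point keeps its intensity,
% I(x, y, t) = I(x + u, y + v, t + 1).
% Linearizing via a first-order Taylor expansion gives the
% optical flow constraint equation:
\[
  I_x u + I_y v + I_t = 0
\]
% One equation, two unknowns (u, v): from a local measurement,
% only the flow component along the image gradient (the "normal
% flow") is determined; the component parallel to an edge is
% unconstrained. This underdetermination is the classical
% aperture problem, which networks resolve by aggregating
% spatial context.
```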


Related research

10/22/2019: Attacking Optical Flow
03/24/2022: A Perturbation Constrained Adversarial Attack for Evaluating the Robustness of Optical Flow
11/16/2021: Consistent Semantic Attacks on Optical Flow
04/11/2016: Beyond Brightness Constancy: Learning Noise Models for Optical Flow
10/20/2022: Attacking Motion Estimation with Adversarial Snow
05/11/2023: Distracting Downpour: Adversarial Weather Attacks for Motion Estimation
06/28/2019: Robustness Guarantees for Deep Neural Networks on Videos
