Strike (with) a Pose: Neural Networks Are Easily Fooled by Strange Poses of Familiar Objects

11/28/2018
by Michael A. Alcorn, et al.

Despite excellent performance on stationary test sets, deep neural networks (DNNs) can fail to generalize to out-of-distribution (OoD) inputs, including natural, non-adversarial ones, which are common in real-world settings. In this paper, we present a framework for discovering DNN failures that harnesses 3D renderers and 3D models. That is, we estimate the parameters of a 3D renderer that cause a target DNN to misbehave in response to the rendered image. Using our framework and a self-assembled dataset of 3D objects, we investigate the vulnerability of DNNs to OoD poses of well-known objects in ImageNet. For objects that are readily recognized by DNNs in their canonical poses, DNNs incorrectly classify 97% of their pose space. In addition, DNNs are highly sensitive to slight pose perturbations. Importantly, adversarial poses transfer across models and datasets. We find that 99.9% and 99.4% of the poses misclassified by Inception-v3 also transfer to the AlexNet and ResNet-50 image classifiers trained on the same ImageNet dataset, respectively, and 75.5% transfer to the YOLOv3 object detector trained on MS COCO.
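To make the idea concrete, here is a minimal sketch (not the paper's implementation) of probing a classifier with rendered poses of a 3D object: a simple random search over camera pose parameters, using PyTorch3D as a stand-in renderer and a pretrained Inception-v3 as the target. The mesh path "object.obj" and the class index TRUE_CLASS are placeholder assumptions.

# Minimal sketch: random search over camera poses of a 3D model, collecting
# renders that a pretrained Inception-v3 misclassifies. PyTorch3D is used as a
# stand-in renderer; "object.obj" and TRUE_CLASS are hypothetical placeholders.
import torch
from torchvision.models import inception_v3, Inception_V3_Weights
from pytorch3d.io import load_objs_as_meshes
from pytorch3d.renderer import (
    FoVPerspectiveCameras, RasterizationSettings, MeshRasterizer,
    MeshRenderer, SoftPhongShader, PointLights, look_at_view_transform,
)

device = "cuda" if torch.cuda.is_available() else "cpu"
TRUE_CLASS = 555  # placeholder: ImageNet index of the rendered object's class

# Frozen target classifier with ImageNet normalization constants.
model = inception_v3(weights=Inception_V3_Weights.IMAGENET1K_V1).to(device).eval()
mean = torch.tensor([0.485, 0.456, 0.406], device=device).view(1, 3, 1, 1)
std = torch.tensor([0.229, 0.224, 0.225], device=device).view(1, 3, 1, 1)

# 3D object plus a renderer producing 299x299 images for Inception-v3.
mesh = load_objs_as_meshes(["object.obj"], device=device)
lights = PointLights(device=device, location=[[0.0, 0.0, 3.0]])
raster_settings = RasterizationSettings(image_size=299)
renderer = MeshRenderer(
    rasterizer=MeshRasterizer(raster_settings=raster_settings),
    shader=SoftPhongShader(device=device, lights=lights),
)

adversarial_poses = []
with torch.no_grad():
    for _ in range(1000):
        # Sample a pose: camera distance, elevation, and azimuth.
        dist = float(torch.empty(1).uniform_(1.5, 4.0))
        elev = float(torch.empty(1).uniform_(-90.0, 90.0))
        azim = float(torch.empty(1).uniform_(-180.0, 180.0))
        R, T = look_at_view_transform(dist=dist, elev=elev, azim=azim)
        cameras = FoVPerspectiveCameras(device=device, R=R, T=T)

        # Render, drop the alpha channel, and normalize for the classifier.
        rgba = renderer(mesh, cameras=cameras, lights=lights)  # (1, H, W, 4)
        img = rgba[..., :3].permute(0, 3, 1, 2)                # (1, 3, H, W)
        pred = int(model((img - mean) / std).argmax(dim=1))

        if pred != TRUE_CLASS:
            adversarial_poses.append((dist, elev, azim, pred))

print(f"{len(adversarial_poses)} of 1000 sampled poses were misclassified")

The paper's framework goes further, estimating renderer parameters directly (including gradient-based search with a differentiable renderer); the random search above only illustrates how OoD poses can be probed against a fixed classifier.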

research  12/16/2019
DAmageNet: A Universal Adversarial Dataset
It is now well known that deep neural networks (DNNs) are vulnerable to ...

research  09/12/2017
Can Deep Neural Networks Match the Related Objects?: A Survey on ImageNet-trained Classification Models
Deep neural networks (DNNs) have shown the state-of-the-art level of per...

research  12/05/2014
Deep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images
Deep neural networks (DNNs) have recently been achieving state-of-the-ar...

research  11/19/2018
Explain to Fix: A Framework to Interpret and Correct DNN Object Detector Predictions
Explaining predictions of deep neural networks (DNNs) is an important an...

research  08/29/2022
Data Isotopes for Data Provenance in DNNs
Today, creators of data-hungry deep neural networks (DNNs) scour the Int...

research  03/17/2023
Deephys: Deep Electrophysiology, Debugging Neural Networks under Distribution Shifts
Deep Neural Networks (DNNs) often fail in out-of-distribution scenarios....

research  12/07/2020
Sparse Fooling Images: Fooling Machine Perception through Unrecognizable Images
In recent years, deep neural networks (DNNs) have achieved equivalent or...
