DeepConsensus: using the consensus of features from multiple layers to attain robust image classification

11/18/2018
by Yuchen Li, et al.

We consider a classifier whose test set is exposed to various perturbations that are not present in the training set. These test samples still contain enough features to map them to the same class as their unperturbed counterparts. Current architectures exhibit rapid degradation of accuracy when trained on standard datasets but then used to classify perturbed samples of that data. To address this, we present a novel architecture named DeepConsensus that significantly improves generalization to these test-time perturbations. Our key insight is that deep neural networks should directly consider summaries of both low-level and high-level features when making classifications. Existing convolutional neural networks can be augmented with DeepConsensus, leading to improved resistance against large and small perturbations on the MNIST, EMNIST, FashionMNIST, CIFAR10 and SVHN datasets.
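The abstract's key idea, combining summaries of features from multiple layers into a single classification, can be sketched as a simple consensus vote. The sketch below is a hypothetical illustration, not the paper's exact mechanism: each layer's feature map is summarized by global average pooling, compared against per-class prototype vectors (assumed here to be, e.g., mean pooled features per class from training data), and the per-layer similarity scores are summed into a final vote.

```python
import numpy as np

def global_avg_pool(fmap):
    # Summarize a (channels, H, W) feature map into a (channels,) vector.
    return fmap.mean(axis=(1, 2))

def cosine(a, b):
    # Cosine similarity with a small epsilon to avoid division by zero.
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

def deep_consensus_predict(feature_maps, prototypes):
    """Consensus classification sketch (hypothetical names/shapes).

    feature_maps: list of per-layer (C_l, H, W) arrays for one input.
    prototypes:   list of per-layer (num_classes, C_l) class prototypes.
    Each layer casts a soft vote via similarity to every class prototype;
    votes are summed across layers and the top class wins.
    """
    num_classes = prototypes[0].shape[0]
    votes = np.zeros(num_classes)
    for fmap, protos in zip(feature_maps, prototypes):
        summary = global_avg_pool(fmap)
        sims = np.array([cosine(summary, p) for p in protos])
        votes += sims  # this layer's soft vote over all classes
    return int(np.argmax(votes))
```

Because every layer contributes a vote, a perturbation that corrupts high-level features can still be outvoted by intact low-level summaries, which is the intuition behind the robustness claim.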

Related research:

12/01/2020  Adversarial Robustness Across Representation Spaces
  Adversarial robustness corresponds to the susceptibility of deep neural ...

09/21/2020  Stereopagnosia: Fooling Stereo Networks with Adversarial Perturbations
  We study the effect of adversarial perturbations of images on the estima...

10/17/2022  Test-Time Training for Graph Neural Networks
  Graph Neural Networks (GNNs) have made tremendous progress in the graph ...

03/19/2020  Overinterpretation reveals image classification model pathologies
  Image classifiers are typically scored on their test set accuracy, but h...

05/30/2023  What Can We Learn from Unlearnable Datasets?
  In an era of widespread web scraping, unlearnable dataset methods have t...

11/30/2017  Measuring the tendency of CNNs to Learn Surface Statistical Regularities
  Deep CNNs are known to exhibit the following peculiarity: on the one han...

06/14/2016  DCNNs on a Diet: Sampling Strategies for Reducing the Training Set Size
  Large-scale supervised classification algorithms, especially those based...
