Convolutional Networks with Adaptive Computation Graphs

11/30/2017
by Andreas Veit, et al.

Do convolutional networks really need a fixed feed-forward structure? Often, a neural network is already confident after a few layers about the high-level concept shown in the image. However, due to the fixed network structure, all remaining layers still need to be evaluated. What if the network could jump right to a layer that is specialized in fine-grained differences of the image's content? In this work, we propose Adanets, a family of convolutional networks with adaptive computation graphs. They follow a high-level structure similar to residual networks (ResNets), with one key difference: for each layer, a gating function determines whether to execute the layer or move on to the next one. In experiments on CIFAR-10 and ImageNet we demonstrate that Adanets efficiently allocate their computational budget among layers and learn distinct layers specializing in similar categories. Adanet 50 achieves a top-5 error rate of 7.94%, compared to 8.58% for its ResNet counterpart. Lastly, we study the susceptibility towards adversarial examples and observe that Adanets show higher robustness towards adversarial attacks, complementing other defenses such as JPEG compression.
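To make the gating mechanism concrete, below is a minimal PyTorch-style sketch of such a gated residual layer. The names (GatingUnit, AdaptiveResidualBlock) and the specific gate design, a small two-layer network over globally pooled features with a hard 0.5 threshold, are illustrative assumptions; the abstract only states that a per-layer gating function decides whether to execute the layer, and does not specify how the gate is built or trained.

# Minimal sketch of a per-layer gated residual block, assuming a
# PyTorch-style implementation. This is NOT the authors' code; the gate
# architecture below is a hypothetical stand-in for the gating function
# described in the abstract.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatingUnit(nn.Module):
    """Predicts, from a cheap global summary of the input, whether to run the layer."""
    def __init__(self, channels, hidden=16):
        super().__init__()
        self.fc1 = nn.Linear(channels, hidden)
        self.fc2 = nn.Linear(hidden, 1)

    def forward(self, x):
        # Global average pooling gives a per-image feature summary.
        s = F.adaptive_avg_pool2d(x, 1).flatten(1)
        logit = self.fc2(F.relu(self.fc1(s)))
        # Hard 0/1 decision at inference time; training end-to-end would
        # require a differentiable relaxation of this discrete choice.
        return (torch.sigmoid(logit) > 0.5).float().view(-1, 1, 1, 1)

class AdaptiveResidualBlock(nn.Module):
    """Residual block whose body is applied only when its gate fires."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.gate = GatingUnit(channels)

    def forward(self, x):
        g = self.gate(x)              # 1 = execute layer, 0 = skip
        if g.sum() == 0:              # entire batch skips: no conv compute
            return x
        out = self.bn2(self.conv2(F.relu(self.bn1(self.conv1(x)))))
        return x + g * out            # gated samples keep the identity path

x = torch.randn(4, 64, 32, 32)
block = AdaptiveResidualBlock(64)
print(block(x).shape)                 # torch.Size([4, 64, 32, 32])

Note that with a hard gate, compute is only truly saved when a whole batch (or a single image at inference) skips the layer; to train the discrete decision jointly with the network, one would typically substitute a differentiable estimator such as Gumbel-Softmax during training.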

Related research

11/25/2021
Robustness against Adversarial Attacks in Neural Networks using Incremental Dissipativity
Adversarial examples can easily degrade the classification performance i...

01/08/2020
Convolutional Networks with Dense Connectivity
Recent work has shown that convolutional networks can be substantially d...

05/22/2017
Convolutional Networks with MuxOut Layers as Multi-rate Systems for Image Upscaling
We interpret convolutional networks as adaptive filters and combine them...

12/01/2022
Neural Representations Reveal Distinct Modes of Class Fitting in Residual Convolutional Networks
We leverage probabilistic models of neural representations to investigat...

07/10/2021
Identifying Layers Susceptible to Adversarial Attacks
Common neural network architectures are susceptible to attack by adversa...

06/10/2019
Network Implosion: Effective Model Compression for ResNets via Static Layer Pruning and Retraining
Residual Networks with convolutional layers are widely used in the field...

06/09/2022
Predictive Exit: Prediction of Fine-Grained Early Exits for Computation- and Energy-Efficient Inference
By adding exiting layers to the deep learning networks, early exit can t...
