BlockDrop: Dynamic Inference Paths in Residual Networks

11/22/2017
by Zuxuan Wu, et al.

Very deep convolutional neural networks offer excellent recognition results, yet their computational expense limits their impact for many real-world applications. We introduce BlockDrop, an approach that learns to dynamically choose which layers of a deep network to execute during inference so as to best reduce total computation without degrading prediction accuracy. Exploiting the robustness of Residual Networks (ResNets) to layer dropping, our framework selects on-the-fly which residual blocks to evaluate for a given novel image. In particular, given a pretrained ResNet, we train a policy network in an associative reinforcement learning setting for the dual reward of utilizing a minimal number of blocks while preserving recognition accuracy. We conduct extensive experiments on CIFAR and ImageNet. The results provide strong quantitative and qualitative evidence that these learned policies not only accelerate inference but also encode meaningful visual information. Built upon a ResNet-101 model, our method achieves a speedup of 20% on average, going as high as 36% for some images, while maintaining the same 76.4% top-1 accuracy on ImageNet.
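The core ideas in the abstract — running only a policy-selected subset of residual blocks, and rewarding policies that use few blocks while staying correct — can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: `blockdrop_reward` uses the quadratic usage penalty and the wrong-prediction penalty `gamma` in the spirit of the paper's dual reward, and `dynamic_forward` stands in for a ResNet whose blocks are arbitrary callables.

```python
import numpy as np

def blockdrop_reward(block_mask, correct, gamma=1.0):
    """Dual reward: encourage using few blocks, penalize mistakes.

    block_mask: binary vector, 1 = block executed.
    correct: whether the prediction with this mask was right.
    """
    usage = np.mean(block_mask)       # fraction of blocks kept
    if correct:
        return 1.0 - usage ** 2       # fewer blocks -> higher reward
    return -gamma                     # wrong prediction -> flat penalty

def dynamic_forward(x, blocks, mask):
    """Evaluate only the residual blocks the policy selected.

    Skipped blocks cost nothing, which is where the speedup comes from;
    the residual connection makes skipping well-defined (output = input).
    """
    for block, keep in zip(blocks, mask):
        if keep:
            x = x + block(x)          # residual block: x -> x + F(x)
    return x

# Toy usage: three "blocks" that each add 10% of their input.
blocks = [lambda v: 0.1 * v] * 3
out = dynamic_forward(1.0, blocks, mask=[1, 0, 1])   # second block skipped
```

In training, the policy network would emit `block_mask` per image and be updated (e.g. with a policy gradient) to maximize the expected reward, which is what pushes it toward sparse, accuracy-preserving paths.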


