Anytime Neural Prediction via Slicing Networks Vertically

07/07/2018
by Hankook Lee, et al.

Deep neural networks (DNNs) have grown deeper and wider to improve accuracy across applications of artificial intelligence. However, DNNs are often too heavy to deploy in practice, and it is frequently necessary to adapt their architectures dynamically to a given computing resource budget, i.e., to support anytime prediction. While most existing approaches focus on jointly training multiple shallow sub-networks, we study training thin sub-networks instead. To this end, we first build many inclusive thin sub-networks (of the same depth) via a minor modification of existing multi-branch DNNs, and find that they can significantly outperform the state-of-the-art dense architecture for anytime prediction. This is remarkable given their simplicity and effectiveness, but jointly training many thin sub-networks raises a new challenge in training complexity. To address this issue, we also propose a novel DNN architecture that enforces a certain sparsity pattern on the multi-branch network parameters, making them efficient to train for anytime prediction. In our experiments on the ImageNet dataset, its sub-networks are up to 43.3% smaller (in FLOPs) than those of the state-of-the-art anytime model at the same accuracy. Finally, we also propose an alternative task under the proposed architecture using a hierarchical taxonomy, which brings a new angle to anytime prediction.
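
To make the idea of vertical slicing concrete, the following is a minimal NumPy sketch of one way to realize inclusive thin sub-networks: every sub-network keeps the full depth but uses only a leading fraction of each layer's channels, so smaller widths reuse the parameters of the larger ones. The two-layer MLP, the width fractions, and the exact slicing scheme here are illustrative assumptions, not the paper's actual multi-branch architecture.

```python
import numpy as np

# Illustrative sketch (not the paper's architecture): nested "thin"
# sub-networks obtained by slicing each weight matrix vertically,
# i.e., keeping only the first fraction of channels per layer.

rng = np.random.default_rng(0)
D_IN, D_HID, D_OUT = 32, 64, 10
W1 = rng.standard_normal((D_HID, D_IN))   # first layer, full width
W2 = rng.standard_normal((D_OUT, D_HID))  # output layer, full width

def forward(x, width=1.0):
    """Run the full-depth network using only the leading `width`
    fraction of hidden channels. Sub-networks are inclusive: the
    0.25-width network's parameters are a subset of the 0.5-width
    network's, and so on."""
    h = int(D_HID * width)                # number of active hidden units
    z = np.maximum(W1[:h] @ x, 0.0)       # thin first layer + ReLU
    return W2[:, :h] @ z                  # output reads only active units

x = rng.standard_normal(D_IN)
for w in (0.25, 0.5, 1.0):                # width chosen by compute budget
    print(f"width={w:.2f} -> logits[:3] = {forward(x, width=w)[:3]}")
```

At inference time, the width fraction can be chosen per input according to the available compute budget, which is the anytime-prediction setting the paper targets; the challenge the paper addresses is training all such widths jointly and efficiently.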
