Efficient Inference on Deep Neural Networks by Dynamic Representations and Decision Gates

11/05/2018
by Mohammad Saeed Shafiee, et al.

The current trade-off between depth and computational cost makes it difficult to adopt deep neural networks in many industrial applications, especially when computing power is limited. Here, we are inspired by the idea that, while deeper embeddings are needed to discriminate difficult samples, a large fraction of samples can be well discriminated with much shallower embeddings. In this study, we introduce the concept of decision gates (d-gate): modules trained to decide whether a sample needs to be projected into a deeper embedding or whether an early prediction can be made at the d-gate, thus enabling the computation of dynamic representations at different depths. The proposed d-gate modules can be integrated with any deep neural network and reduce the network's average computational cost while maintaining modeling accuracy. Experimental results show that leveraging the proposed d-gate modules led to a 38% reduction in FLOPs for DenseNet-201 trained on the CIFAR10 dataset, with only a 2% drop in accuracy.
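The early-exit mechanism described in the abstract can be sketched as a small auxiliary classifier attached after an intermediate stage, paired with a confidence threshold that decides whether to stop or continue deeper. The names (`DGate`, `dynamic_inference`), the linear gate classifier, and the max-probability thresholding rule below are illustrative assumptions, not the paper's exact formulation:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

class DGate:
    """A lightweight auxiliary classifier with a confidence threshold.

    If its most confident class probability reaches `threshold`, inference
    stops at this gate; otherwise the sample is sent to deeper layers.
    """
    def __init__(self, weights, threshold):
        self.weights = weights      # one weight row per class
        self.threshold = threshold

    def predict(self, feats):
        logits = [sum(w * f for w, f in zip(row, feats)) for row in self.weights]
        return softmax(logits)

def dynamic_inference(x, stages, gates, final_head):
    """Run network stages in sequence; each d-gate may trigger an early exit.

    Returns (class probabilities, depth at which the prediction was made).
    """
    feats = x
    for depth, (stage, gate) in enumerate(zip(stages, gates), start=1):
        feats = stage(feats)
        probs = gate.predict(feats)
        if max(probs) >= gate.threshold:
            return probs, depth              # early prediction at this d-gate
    return final_head(feats), len(stages) + 1  # full-depth prediction

# Toy usage with two hypothetical stages and a 0.9 confidence threshold:
stages = [lambda f: [v * 2 for v in f],       # stand-in for a conv block
          lambda f: [v + 1.0 for v in f]]     # stand-in for a deeper block
gates = [DGate([[1.0, 0.0], [0.0, 1.0]], threshold=0.9),
         DGate([[1.0, 0.0], [0.0, 1.0]], threshold=0.9)]
final_head = lambda f: softmax(f)

# A confidently separable sample exits at the first gate (depth 1);
# an ambiguous one falls through to the full-depth head (depth 3).
_, easy_depth = dynamic_inference([3.0, 0.0], stages, gates, final_head)
_, hard_depth = dynamic_inference([0.1, 0.0], stages, gates, final_head)
```

The average cost saving comes from the fraction of samples that take the early branch: only hard samples pay for the full depth, which is how the FLOPs reduction is achieved without retraining the backbone.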


Related research

- Feature-Gate Coupling for Dynamic Network Pruning (11/29/2021): Gating modules have been widely explored in dynamic network pruning to r...
- Deep Differentiable Logic Gate Networks (10/15/2022): Recently, research has increasingly focused on developing efficient neur...
- Learning Dynamic BERT via Trainable Gate Variables and a Bi-modal Regularizer (02/19/2021): The BERT model has shown significant success on various natural language...
- SplitEE: Early Exit in Deep Neural Networks with Split Computing (09/17/2023): Deep Neural Networks (DNNs) have drawn attention because of their outsta...
- Dynamic Deep Neural Networks: Optimizing Accuracy-Efficiency Trade-offs by Selective Execution (01/02/2017): We introduce Dynamic Deep Neural Networks (D2NN), a new type of feed-for...
- DeepSquare: Boosting the Learning Power of Deep Convolutional Neural Networks with Elementwise Square Operators (06/12/2019): Modern neural network modules which can significantly enhance the learni...
- Inception LSTM (04/12/2020): In this paper, we proposed a novel deep-learning method called Inception...
