Towards Principled Design of Deep Convolutional Networks: Introducing SimpNet

02/17/2018
by   Seyyed Hossein Hasanpour, et al.

Major winning Convolutional Neural Networks (CNNs), such as VGGNet, ResNet, and DenseNet, include tens to hundreds of millions of parameters, which impose considerable computation and memory overheads. This limits their practical use in training and deployment for real-world applications. Light-weight architectures, such as SqueezeNet, have been proposed to address this issue; however, they mainly suffer from low accuracy, as they trade away processing power for efficiency. These inefficiencies mostly stem from ad hoc design procedures. In this work, we discuss and propose several crucial design principles for efficient architecture design and elaborate on the intuitions behind different aspects of the design procedure. Furthermore, we introduce a new layer called SAF-pooling, which improves the generalization power of the network while keeping it simple by selecting the best features. Based on these principles, we propose a simple architecture called SimpNet. We empirically show that SimpNet provides a good trade-off between computation/memory efficiency and accuracy, based solely on these primitive but crucial principles. SimpNet outperforms deeper and more complex architectures such as VGGNet, ResNet, and WideResidualNet on several well-known benchmarks, while having 2 to 25 times fewer parameters and operations. We obtain state-of-the-art results (in terms of the balance between accuracy and the number of parameters involved) on standard datasets such as CIFAR10, CIFAR100, MNIST and SVHN. The implementations are available at https://github.com/Coderx7/SimpNet.
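The SAF-pooling idea in the abstract can be illustrated with a minimal NumPy sketch, assuming SAF-pooling amounts to max-pooling applied after a dropout-style random zeroing of activations, so each pooling window selects the strongest surviving feature. The function name `saf_pool` and its parameters are illustrative, not the paper's API:

```python
import numpy as np

def saf_pool(feature_map, drop_prob=0.5, pool=2, rng=None):
    """Illustrative SAF-pooling sketch (assumption: dropout followed
    by max-pooling): randomly zero activations, then max-pool so each
    window keeps the strongest surviving feature."""
    rng = np.random.default_rng(0) if rng is None else rng
    # Dropout-style mask: each activation survives with prob 1 - drop_prob.
    mask = rng.random(feature_map.shape) >= drop_prob
    dropped = feature_map * mask
    # Non-overlapping max-pooling via a block reshape.
    h, w = dropped.shape
    return dropped.reshape(h // pool, pool, w // pool, pool).max(axis=(1, 3))

fm = np.arange(16, dtype=float).reshape(4, 4)
print(saf_pool(fm).shape)  # (2, 2)
```

Because pooling runs on the partially zeroed map, the network cannot always rely on the single largest activation in a window, which is one intuition for the regularizing effect described above.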
