DeepSquare: Boosting the Learning Power of Deep Convolutional Neural Networks with Elementwise Square Operators

06/12/2019
by Sheng Chen, et al.

Modern neural network modules that significantly enhance learning power usually add substantial computational complexity to the original networks. In this paper, we pursue highly efficient modules that can significantly boost the learning power of deep convolutional neural networks at negligible extra computational cost. We first show, both theoretically and experimentally, that the elementwise square operator has the potential to enhance the learning power of neural networks. We then design four types of lightweight modules built on elementwise square operators, named Square-Pooling, Square-Softmin, Square-Excitation, and Square-Encoding. We add these four lightweight modules to ResNet18, ResNet50, and ShuffleNetV2 and evaluate them on the ImageNet 2012 dataset. The experimental results show that our modules bring significant accuracy improvements to the base convolutional neural network models, with performance comparable to much more complicated modules such as bilinear pooling, Squeeze-and-Excitation, and Gather-Excite. Our highly efficient modules are particularly suitable for mobile models: for example, when equipped with a single Square-Pooling module, the top-1 classification accuracy of ShuffleNetV2-0.5x on ImageNet 2012 improves by an absolute 1.45 points, with no additional parameters and negligible inference-time overhead.
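To make the idea concrete, the sketch below shows one plausible PyTorch reading of a parameter-free Square-Pooling head: the final feature map is squared elementwise before global average pooling. The class name SquarePooling, its placement as a drop-in replacement for the usual pooling head, and the example tensor shapes are assumptions for illustration, not the authors' reference implementation.

import torch
import torch.nn as nn

class SquarePooling(nn.Module):
    """Hypothetical Square-Pooling: elementwise square followed by
    global average pooling. It adds no parameters, so swapping it in
    for a plain pooling head costs almost nothing at inference time."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) feature map from the last convolutional stage.
        squared = x * x                   # elementwise square
        return squared.mean(dim=(2, 3))   # global average pool -> (N, C)

# Usage sketch with shapes typical of a ShuffleNetV2-0.5x head (assumed).
feats = torch.randn(8, 1024, 7, 7)
pooled = SquarePooling()(feats)
print(pooled.shape)  # torch.Size([8, 1024])

Squaring emphasizes strongly activated features and is closely related to second-order statistics, which is consistent with the comparison the authors draw against bilinear pooling.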

Related research

08/07/2023 · Optimal Approximation and Learning Rates for Deep Convolutional Neural Networks
This paper focuses on approximation and learning performance analysis fo...

07/04/2018 · Selective Deep Convolutional Neural Network for Low Cost Distorted Image Classification
Deep convolutional neural networks have proven to be well suited for ima...

01/24/2023 · Progressive Meta-Pooling Learning for Lightweight Image Classification Model
Practical networks for edge devices adopt shallow depth and small convol...

01/28/2019 · Squeezed Very Deep Convolutional Neural Networks for Text Classification
Most of the research in convolutional neural networks has focused on inc...

10/29/2018 · Gather-Excite: Exploiting Feature Context in Convolutional Neural Networks
While the use of bottom-up local operators in convolutional neural netwo...

04/21/2016 · TI-POOLING: transformation-invariant pooling for feature learning in Convolutional Neural Networks
In this paper we present a deep neural network topology that incorporate...

11/05/2018 · Efficient Inference on Deep Neural Networks by Dynamic Representations and Decision Gates
The current trade-off between depth and computational cost makes it diff...
