Lightweight Residual Densely Connected Convolutional Neural Network

01/02/2020
by Fahimeh Fooladgar, et al.

Extremely efficient convolutional neural network architectures are one of the most important requirements for devices with limited computing power (such as embedded and mobile devices). Recently, some architectures have been proposed to overcome this limitation by targeting specific hardware-software configurations. In this paper, residual densely connected blocks are proposed to guarantee the deep supervision, efficient gradient flow, and feature reuse abilities of convolutional neural networks. The proposed method decreases the cost of training and inference without requiring any special hardware-software setup, simply by reducing the number of parameters and computational operations while maintaining a competitive accuracy. Extensive experimental results demonstrate that the proposed architecture is more efficient than AlexNet and VGGNet in terms of model size, number of parameters, and even accuracy. The proposed model is evaluated on ImageNet, MNIST, Fashion MNIST, SVHN, CIFAR-10, and CIFAR-100. It achieves state-of-the-art results on the Fashion MNIST dataset and reasonable results on the others. The results show that the proposed model outperforms efficient models such as SqueezeNet and is comparable with state-of-the-art efficient models such as CondenseNet and ShuffleNet.
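
The abstract does not spell out the block's internals, but the core idea it names, dense connectivity for feature reuse combined with a residual skip for efficient gradient flow, can be illustrated with a minimal PyTorch sketch. Everything below (the ResidualDenseBlock class name and the growth_rate and num_layers hyperparameters) is an illustrative assumption about the general technique, not the authors' exact block.

```python
import torch
import torch.nn as nn


class ResidualDenseBlock(nn.Module):
    """Hypothetical residual densely connected block (illustrative sketch).

    Each 3x3 conv layer receives the concatenation of all earlier feature
    maps (dense connectivity / feature reuse); a 1x1 conv then projects the
    concatenated features back to `channels` so an identity residual skip
    can be added, keeping the gradient path short.
    """

    def __init__(self, channels: int, growth_rate: int = 16, num_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        in_ch = channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(in_ch),
                nn.ReLU(inplace=True),
                nn.Conv2d(in_ch, growth_rate, kernel_size=3, padding=1, bias=False),
            ))
            in_ch += growth_rate  # dense concatenation grows the input width
        # 1x1 "transition" conv restores the channel count for the skip.
        self.project = nn.Conv2d(in_ch, channels, kernel_size=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # Each layer sees all previous feature maps (DenseNet-style).
            features.append(layer(torch.cat(features, dim=1)))
        # Residual (ResNet-style) skip over the whole dense block.
        return x + self.project(torch.cat(features, dim=1))


if __name__ == "__main__":
    block = ResidualDenseBlock(channels=32)
    out = block(torch.randn(1, 32, 56, 56))
    print(out.shape)  # torch.Size([1, 32, 56, 56])
```

The 1x1 projection restores the block's input channel count so the identity skip adds no parameters; this is one common way (assumed here, not taken from the paper) to reconcile DenseNet-style concatenation with a ResNet-style addition.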

Related research

07/04/2017 · ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices
We introduce an extremely computation-efficient CNN architecture named S...

01/06/2021 · LightLayers: Parameter Efficient Dense and Convolutional Layers for Image Classification
Deep Neural Networks (DNNs) have become the de-facto standard in compute...

12/15/2017 · Reducing Deep Network Complexity with Fourier Transform Methods
We propose a novel way that uses shallow densely connected neuron networ...

02/10/2023 · Element-Wise Attention Layers: an option for optimization
The use of Attention Layers has become a trend since the popularization ...

11/11/2022 · RepGhost: A Hardware-Efficient Ghost Module via Re-parameterization
Feature reuse has been a key technique in light-weight convolutional neu...

09/16/2020 · EfficientNet-eLite: Extremely Lightweight and Efficient CNN Models for Edge Devices by Network Candidate Search
Embedding Convolutional Neural Network (CNN) into edge devices for infer...

07/25/2018 · Conditional Information Gain Networks
Deep neural network models owe their representational power to the high ...
