SpinalNet: Deep Neural Network with Gradual Input

07/07/2020
by H M Dipu Kabir, et al.

Over the past few years, deep neural networks (DNNs) have achieved remarkable success in a diverse range of real-world applications. However, DNNs take a large number of inputs and contain a large number of parameters, resulting in high computational demand. Drawing on the human somatosensory system, we propose the SpinalNet to achieve higher accuracy with fewer computational resources. In a typical neural network (NN) architecture, the first hidden layer receives all the inputs, and each layer transfers its intermediate outcome to the next layer. In the proposed SpinalNet, each hidden layer is divided into three sectors: 1) an input row, 2) an intermediate row, and 3) an output row. The intermediate row of the SpinalNet contains only a few neurons. Input segmentation enables each hidden layer to receive one portion of the inputs together with the outputs of the previous layer. Therefore, the number of incoming weights in a hidden layer is significantly lower than in traditional DNNs. Because all layers of the SpinalNet contribute directly to the output row, the vanishing gradient problem does not arise. We also attach the SpinalNet fully connected layer to several well-known DNN models and perform traditional learning and transfer learning. We observe significant error reductions with lower computational costs in most of the DNNs. We have also obtained state-of-the-art (SOTA) performance on the QMNIST, Kuzushiji-MNIST, EMNIST (Letters, Digits, and Balanced), STL-10, Bird225, Fruits 360, and Caltech-101 datasets. The scripts of the proposed SpinalNet are available at: https://github.com/dipuk0506/SpinalNet
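As a rough sketch of the wiring described above, a SpinalNet-style fully connected head could be written in PyTorch as follows. This is a minimal illustration under stated assumptions, not the authors' exact configuration: the class name SpinalFC, the layer count, and the layer widths are illustrative, and the alternation between the two input halves follows the description above. The linked repository contains the reference implementation.

```python
import torch
import torch.nn as nn

class SpinalFC(nn.Module):
    """Minimal sketch of a SpinalNet-style fully connected head.

    The incoming feature vector is split into two halves. Each spinal
    layer receives one half of the input concatenated with the previous
    layer's output, and every layer's output feeds the final classifier,
    so all layers contribute directly to the output row.
    """

    def __init__(self, in_features=512, layer_width=128,
                 num_layers=4, num_classes=10):
        super().__init__()
        assert in_features % 2 == 0, "illustrative sketch assumes an even feature size"
        self.half = in_features // 2
        self.layers = nn.ModuleList()
        for i in range(num_layers):
            # The first layer sees only its half of the input; later layers
            # also receive the previous layer's output, so each layer has far
            # fewer incoming weights than a dense layer over all features.
            in_dim = self.half if i == 0 else self.half + layer_width
            self.layers.append(
                nn.Sequential(nn.Linear(in_dim, layer_width), nn.ReLU())
            )
        # Every spinal layer's output is concatenated into the output row.
        self.out = nn.Linear(layer_width * num_layers, num_classes)

    def forward(self, x):
        # Alternate between the two halves of the input, spine-like.
        halves = (x[:, :self.half], x[:, self.half:])
        outputs, prev = [], None
        for i, layer in enumerate(self.layers):
            part = halves[i % 2]
            feed = part if prev is None else torch.cat([part, prev], dim=1)
            prev = layer(feed)
            outputs.append(prev)
        return self.out(torch.cat(outputs, dim=1))
```

A head like this can replace the final fully connected layer of a backbone, e.g. `head = SpinalFC(in_features=512, num_classes=10)` applied to 512-dimensional pooled features: `logits = head(torch.randn(8, 512))` yields a tensor of shape (8, 10).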

Related research

- Morphological Classification of Galaxies Using SpinalNet (05/03/2023): Deep neural networks (DNNs) with a step-by-step introduction of inputs, ...
- ProgressiveSpinalNet architecture for FC layers (03/21/2021): In deep learning models, the FC (fully connected) layer has the biggest import...
- Layer Adaptive Deep Neural Networks for Out-of-distribution Detection (03/01/2022): During the forward pass of Deep Neural Networks (DNNs), inputs gradually...
- The Foes of Neural Network's Data Efficiency Among Unnecessary Input Dimensions (07/13/2021): Datasets often contain input dimensions that are unnecessary to predict ...
- Adaptive Growth: Real-time CNN Layer Expansion (09/06/2023): Deep Neural Networks (DNNs) have shown unparalleled achievements in nume...
- Generative learning for deep networks (09/25/2017): Learning, taking into account full distribution of the data, referred to...
- WLD-Reg: A Data-dependent Within-layer Diversity Regularizer (01/03/2023): Neural networks are composed of multiple layers arranged in a hierarchic...
