Efficient Sparse Artificial Neural Networks

03/13/2021
by Seyed Majid Naji, et al.

The brain, as the source of inspiration for Artificial Neural Networks (ANNs), is based on a sparse structure. This sparse structure helps the brain consume less energy, learn more easily, and generalize patterns better than any ANN. In this paper, two evolutionary methods for introducing sparsity into ANNs are proposed. In the proposed methods, the sparse structure of a network, as well as the values of its parameters, is trained and updated during the learning process. The simulation results show that these two methods achieve better accuracy and faster convergence while requiring fewer training samples than their sparse and non-sparse counterparts. Furthermore, the proposed methods significantly improve generalization power and reduce the number of parameters. For example, sparsifying the ResNet47 network with the proposed methods for image classification on the ImageNet dataset uses 40% fewer parameters and improves accuracy by 12% compared to its sparse and non-sparse counterparts, respectively. As another example, on the CIFAR10 dataset the proposed methods converge to their final structure 7 times faster than their sparse counterpart, while the final accuracy increases by 6%.
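
The abstract only sketches the core idea: the network topology (which connections exist) is evolved alongside the weight values during training. As a rough illustration of the general prune-and-regrow family of evolutionary sparse training that this description points to (in the spirit of SET-style methods), here is a minimal NumPy sketch. The `update_mask` routine, the `regrow_frac` parameter, the magnitude-pruning criterion, and the re-initialization scale are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def update_mask(w, mask, regrow_frac=0.3):
    """One evolutionary step: prune the smallest-magnitude active
    weights, then regrow the same number of connections at random
    inactive positions (prune-and-regrow, SET-style)."""
    active = np.flatnonzero(mask)
    k = max(1, int(regrow_frac * active.size))
    # Prune the k active weights with the smallest magnitude.
    prune = active[np.argsort(np.abs(w.ravel()[active]))[:k]]
    mask.ravel()[prune] = 0
    # Regrow k connections at randomly chosen inactive positions.
    inactive = np.flatnonzero(mask.ravel() == 0)
    grow = rng.choice(inactive, size=k, replace=False)
    mask.ravel()[grow] = 1
    w.ravel()[grow] = rng.normal(scale=0.01, size=k)  # fresh weights
    return w, mask

# Toy usage: one sparse linear layer fit to a random regression task.
n_in, n_out, density = 32, 8, 0.1
w = rng.normal(scale=0.1, size=(n_in, n_out))
mask = (rng.random((n_in, n_out)) < density).astype(float)

x = rng.normal(size=(256, n_in))
y = x @ rng.normal(size=(n_in, n_out))           # synthetic targets

for epoch in range(20):
    w_eff = w * mask                             # only active weights act
    grad = x.T @ (x @ w_eff - y) / len(x)        # MSE gradient
    w -= 0.1 * grad * mask                       # masked SGD step
    w, mask = update_mask(w, mask)               # evolve the topology
print("final density:", mask.mean())
```

Because each step prunes and regrows the same number of connections, the overall density stays fixed while the topology itself is what gets "trained", which matches the abstract's claim that structure and parameters are updated jointly during learning.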


