On improving deep learning generalization with adaptive sparse connectivity

06/27/2019
by   Shiwei Liu, et al.
Large neural networks are very successful in various tasks. However, with limited data, the generalization capabilities of deep neural networks are also very limited. In this paper, we empirically show that intrinsically sparse neural networks with adaptive sparse connectivity, which by design have a strict parameter budget during the training phase, have better generalization capabilities than their fully-connected counterparts. Besides this, we propose a new technique to train these sparse models by combining the Sparse Evolutionary Training (SET) procedure with neuron pruning. Operated on MultiLayer Perceptrons (MLPs) and tested on 15 datasets, our proposed technique zeros out around 50% of the hidden neurons during training, while having a linear number of parameters to optimize with respect to the number of neurons. The results show competitive classification and generalization performance.
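The abstract does not spell out the SET procedure itself. As a rough illustration of the prune-and-regrow idea behind it, here is a minimal NumPy sketch; the function names, the Erdős–Rényi parameter `epsilon`, and the rewiring fraction `zeta` are illustrative assumptions, not the authors' exact implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_sparse_mask(n_in, n_out, epsilon=20):
    # Erdős-Rényi sparse topology: the connection density scales as
    # epsilon * (n_in + n_out) / (n_in * n_out), so the parameter
    # count is linear in the number of neurons, not quadratic.
    density = min(1.0, epsilon * (n_in + n_out) / (n_in * n_out))
    return rng.random((n_in, n_out)) < density

def set_rewire(weights, mask, zeta=0.3):
    # One SET rewiring step: prune the fraction `zeta` of active
    # connections with the smallest magnitudes, then regrow the same
    # number of connections at random currently-inactive positions.
    active = np.flatnonzero(mask)
    magnitudes = np.abs(weights.ravel()[active])
    n_prune = int(zeta * active.size)
    pruned = active[np.argsort(magnitudes)[:n_prune]]

    new_mask = mask.copy().ravel()
    new_mask[pruned] = False
    inactive = np.flatnonzero(~new_mask)
    regrown = rng.choice(inactive, size=n_prune, replace=False)
    new_mask[regrown] = True

    new_weights = weights.copy().ravel()
    new_weights[pruned] = 0.0
    new_weights[regrown] = rng.normal(0.0, 0.01, size=n_prune)
    return new_weights.reshape(weights.shape), new_mask.reshape(mask.shape)

# Example: a sparse 784-by-300 MLP layer, rewired once between epochs.
mask = init_sparse_mask(784, 300)
weights = rng.normal(0.0, 0.01, (784, 300)) * mask
weights, mask = set_rewire(weights, mask)
```

In a training loop this rewiring step would run between epochs, with gradient updates applied only to the weights where the mask is active; the total number of connections stays constant throughout, which is what keeps the parameter budget strict.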


Related research:

- 03/17/2019, Evolving and Understanding Sparse Deep Neural Networks using Cosine Similarity: Training sparse neural networks with adaptive connectivity is an active ...
- 11/29/2020, Improving Neural Network with Uniform Sparse Connectivity: Neural network forms the foundation of deep learning and numerous AI app...
- 06/24/2020, Topological Insights in Sparse Neural Networks: Sparse neural networks are effective approaches to reduce the resource r...
- 03/28/2016, Sparse Activity and Sparse Connectivity in Supervised Learning: Sparseness is a useful regularizer for learning in a wide range of appli...
- 02/08/2022, EvoPruneDeepTL: An Evolutionary Pruning Model for Transfer Learning based Deep Neural Networks: In recent years, Deep Learning models have shown a great performance in ...
- 01/26/2019, Sparse evolutionary Deep Learning with over one million artificial neurons on commodity hardware: Microarray gene expression has widely attracted the eyes of the public a...
- 11/07/2019, Sparsity through evolutionary pruning prevents neuronal networks from overfitting: Modern Machine learning techniques take advantage of the exponentially r...
