Ising-Dropout: A Regularization Method for Training and Compression of Deep Neural Networks

02/07/2019
by Hojjat Salehinejad, et al.

Overfitting is a major problem in training machine learning models, particularly deep neural networks. It may be caused by imbalanced datasets and by the initialization of the model parameters, which conform the model too closely to the training data and hurt its generalization to unseen data. The original dropout is a regularization technique that drops hidden units at random during training. In this paper, we propose an adaptive technique that wisely drops the visible and hidden units of a deep neural network using the Ising energy of the network. Preliminary results show that the proposed approach keeps classification performance competitive with the original network while eliminating the optimization of unnecessary network parameters in each training cycle. The dropout state of the units can also be applied to the trained (inference) model. This technique can compress the network, in terms of number of parameters, by up to 41.18% and 55.86% for the classification task on the MNIST and Fashion-MNIST datasets, respectively.
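At a high level, the idea is to represent each unit's keep/drop decision as a binary spin and to choose the dropout mask by minimizing an Ising energy over those spins, rather than sampling the mask uniformly at random as in standard dropout. The sketch below illustrates this for a single toy layer; the specific energy form, the coupling matrix J derived from weight magnitudes, and the single-flip local search are assumptions made for demonstration, not the paper's exact formulation or optimizer.

```python
# Minimal sketch of Ising-style unit dropout for one toy layer.
# The energy definition, the couplings J, and the single-flip random
# search below are illustrative assumptions, not the paper's method.

import numpy as np

rng = np.random.default_rng(0)

def ising_energy(s, J, h):
    # E(s) = -1/2 * s^T J s - h^T s, with unit states s_i in {-1, +1}
    # (+1 = keep the unit, -1 = drop it).
    return -0.5 * (s @ J @ s) - h @ s

def search_dropout_state(J, h, n_iter=500):
    # Greedy single-flip search for a low-energy keep/drop configuration.
    n = h.shape[0]
    s = np.ones(n)                        # start by keeping every unit
    best = ising_energy(s, J, h)
    for _ in range(n_iter):
        i = rng.integers(n)               # propose flipping one unit's state
        s[i] *= -1
        e = ising_energy(s, J, h)
        if e < best:
            best = e                      # accept the flip
        else:
            s[i] *= -1                    # reject: revert the flip
    return s, best

# Toy layer: negative couplings between strongly connected units, so that
# redundant (strongly coupled) units prefer opposite keep/drop states.
n_units = 16
W = rng.normal(size=(n_units, n_units))   # stand-in for learned weights
J = -np.abs(W + W.T) / 2.0
np.fill_diagonal(J, 0.0)
h = rng.normal(size=n_units)              # per-unit bias ("external field")

s, e = search_dropout_state(J, h)
mask = (s > 0).astype(float)              # 1.0 = keep, 0.0 = drop
print(f"energy={e:.3f}, kept {int(mask.sum())}/{n_units} units")
# In a training step, the layer's activations would be multiplied by
# `mask` in place of a random Bernoulli dropout mask.
```

The negative couplings here are a deliberate toy choice: they push pairs of strongly connected units toward opposite keep/drop states, loosely mirroring the paper's goal of skipping the optimization of redundant parameters. Because the final mask is deterministic given the weights, the same keep/drop state can also be applied at inference time, which is where the reported parameter compression comes from.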

