Transformed ℓ_1 Regularization for Learning Sparse Deep Neural Networks

01/04/2019
by Rongrong Ma et al.

Deep neural networks (DNNs) have achieved extraordinary success in numerous areas. However, to attain this success, DNNs often carry a large number of weight parameters, which incurs heavy memory and computation costs; overfitting is also likely when the training data are insufficient. These shortcomings severely hinder the deployment of DNNs on resource-constrained platforms. In fact, many network weights are known to be redundant and can be removed without much loss of performance. To this end, we introduce a new non-convex integrated transformed ℓ_1 regularizer that promotes sparsity in DNNs by removing redundant connections and unnecessary neurons simultaneously. Specifically, we apply the transformed ℓ_1 penalty to the network weight matrices to remove redundant connections, and additionally employ group sparsity as an auxiliary term to remove unnecessary neurons. An efficient stochastic proximal gradient algorithm is also presented to solve the resulting model. To the best of our knowledge, this is the first work to use a non-convex regularizer in a sparse-optimization-based method to promote sparsity in DNNs. Experiments on several public datasets demonstrate the effectiveness of the proposed method.
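The transformed ℓ_1 (TL1) penalty is a single-parameter family ρ_a(w) = (a + 1)|w| / (a + |w|), which interpolates between an ℓ_0-like penalty (small a) and the ℓ_1 norm (large a). As a rough illustration of how the combined regularizer described in the abstract could be assembled, the sketch below computes a TL1 term over all connection weights plus a group (ℓ_2,1-style) term over neuron-wise weight groups in PyTorch. The function names, hyperparameters, and the exact form of the group term are assumptions made for illustration; the paper's own method optimizes such a model with a stochastic proximal gradient algorithm rather than by simply adding the penalty to the loss.

    import torch

    def tl1_penalty(W, a=1.0):
        # Transformed l1 penalty: rho_a(w) = (a + 1)|w| / (a + |w|), summed over all entries.
        absW = W.abs()
        return ((a + 1.0) * absW / (a + absW)).sum()

    def group_penalty(W):
        # Group term over rows of the weight matrix (one group per output neuron),
        # an l_{2,1}-style norm that encourages whole neurons to be zeroed out.
        # The row-wise grouping is an illustrative choice, not the paper's exact grouping.
        return W.norm(dim=1).sum()

    def regularized_loss(data_loss, weight_matrices, lam1=1e-4, lam2=1e-4, a=1.0):
        # Data-fitting loss plus TL1 on individual connections and a group term on neurons.
        reg = sum(lam1 * tl1_penalty(W, a) + lam2 * group_penalty(W)
                  for W in weight_matrices)
        return data_loss + reg

Because the TL1 and group terms are non-smooth, optimizing them with plain (sub)gradient descent is not ideal; this is why the paper pairs the model with a stochastic proximal gradient update, in which the non-smooth regularizer is handled through its proximal operator after each gradient step on the data loss.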

