Automatic Node Selection for Deep Neural Networks using Group Lasso Regularization

11/17/2016
by Tsubasa Ochiai, et al.

We examine the effect of the Group Lasso (gLasso) regularizer in selecting the salient nodes of Deep Neural Network (DNN) hidden layers by applying a DNN-HMM hybrid speech recognizer to TED Talks speech data. We test two types of gLasso regularization, one applied to outgoing weight vectors and the other to incoming weight vectors, as well as two DNN sizes: 2048 and 4096 hidden-layer nodes. Furthermore, we compare the gLasso and L2 regularizers. Our experimental results demonstrate that our DNN training, with the gLasso regularizer embedded, successfully selected the hidden-layer nodes that are necessary and sufficient for achieving high classification power.
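For illustration, below is a minimal sketch of the two gLasso penalties described above, one over incoming weight vectors and one over outgoing weight vectors, added to a standard cross-entropy training loss. It assumes PyTorch; the function names, layer dimensions, and the value of lam are illustrative choices, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def glasso_incoming(weight: torch.Tensor) -> torch.Tensor:
    # weight: (out_features, in_features); row i is the incoming weight
    # vector of hidden node i, so the penalty sums the row-wise L2 norms.
    return weight.norm(p=2, dim=1).sum()

def glasso_outgoing(weight: torch.Tensor) -> torch.Tensor:
    # Column j is the outgoing weight vector of node j in the layer below,
    # so the penalty sums the column-wise L2 norms.
    return weight.norm(p=2, dim=0).sum()

# Toy usage: one hidden layer of 2048 nodes, cross-entropy training loss
# plus the incoming-vector gLasso penalty (all sizes are illustrative).
hidden = nn.Linear(440, 2048)   # 440-dim acoustic input (assumed)
output = nn.Linear(2048, 1000)  # 1000 HMM-state targets (assumed)
lam = 1e-4                      # regularization weight (assumed)

x = torch.randn(32, 440)
y = torch.randint(0, 1000, (32,))

logits = output(torch.relu(hidden(x)))
loss = F.cross_entropy(logits, y) + lam * glasso_incoming(hidden.weight)
loss.backward()
```

Because the group penalty drives entire incoming (or outgoing) weight vectors toward zero, the corresponding hidden nodes can be pruned afterwards, which is the node-selection effect the abstract describes.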

Related research

10/22/2017 · Deep Triphone Embedding Improves Phoneme Recognition
In this paper, we present a novel Deep Triphone Embedding (DTE) represen...

04/10/2016 · Visualization Regularizers for Neural Network based Image Recognition
The success of deep neural networks is mostly due to their ability to learn...

02/18/2015 · F0 Modeling In HMM-Based Speech Synthesis System Using Deep Belief Network
In recent years multilayer perceptrons (MLPs) with many hidden layers ...

07/06/2018 · Sparse Deep Neural Network Exact Solutions
Deep neural networks (DNNs) have emerged as key enablers of machine lear...

10/09/2019 · Deep neural network for pier scour prediction
With the advancement in computing power over the last decades, deep neural n...

11/21/2019 · Approximated Orthonormal Normalisation in Training Neural Networks
Generalisation of a deep neural network (DNN) is one major concern when ...

11/02/2022 · SIMD-size aware weight regularization for fast neural vocoding on CPU
This paper proposes weight regularization for a faster neural vocoder. P...
