Effectiveness of Self Normalizing Neural Networks for Text Classification

05/03/2019
by Avinash Madasu, et al.

Self-Normalizing Neural Networks (SNNs), proposed for feed-forward neural networks (FNNs), outperform regular FNN architectures in various machine learning tasks. In particular, in the domain of Computer Vision, the Scaled Exponential Linear Unit (SELU) activation function proposed for SNNs performs better than other non-linear activations such as ReLU. The goal of an SNN is to produce a normalized output for a normalized input. Established neural network architectures such as feed-forward networks and Convolutional Neural Networks (CNNs) lack this intrinsic normalizing property and therefore require additional layers such as Batch Normalization. Despite the success of SNNs, their characteristic features have not been explored on other network architectures such as CNNs, especially in the domain of Natural Language Processing. In this paper, we aim to show the effectiveness of the proposed Self-Normalizing Convolutional Neural Networks (SCNNs) on text classification. We compare their performance with a standard CNN architecture on several text classification datasets. Our experiments demonstrate that SCNN achieves results comparable to the standard CNN model with significantly fewer parameters, and that it outperforms the CNN when both have an equal number of parameters.
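The self-normalizing behavior described above comes from the SELU activation itself. As a minimal NumPy sketch (not the authors' code; the two constants are the fixed-point values derived in the original SNN paper), SELU scales the identity for positive inputs and a shifted exponential for negative ones, so that an input with zero mean and unit variance keeps approximately those statistics after activation:

```python
import numpy as np

# Fixed-point constants from Klambauer et al. (2017), chosen so that
# zero-mean, unit-variance inputs retain those statistics after SELU.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x):
    """Scaled Exponential Linear Unit: lambda*x for x > 0,
    lambda*alpha*(exp(x) - 1) for x <= 0."""
    x = np.asarray(x, dtype=float)
    return LAMBDA * np.where(x > 0, x, ALPHA * np.expm1(x))

# Illustration of the self-normalizing property: a standard-normal
# input passed through SELU stays close to mean 0 and variance 1.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = selu(x)
print(y.mean(), y.var())  # both remain close to 0 and 1 respectively
```

This is why, as the abstract notes, SELU-based networks can dispense with explicit normalization layers such as Batch Normalization.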


