
A Comparison of Neural Network Training Methods for Text Classification

10/28/2019
by Anderson de Andrade, et al.

We study the impact of neural networks on text classification. Our focus is on training deep neural networks with proper weight initialization and greedy layer-wise pretraining. Results are compared against single-layer neural networks and Support Vector Machines. We work with a dataset of labeled messages from the Twitter microblogging service and aim to predict weather conditions. A feature extraction procedure specific to the task is proposed, which applies dimensionality reduction using Latent Semantic Analysis. Our results show that neural networks outperform Support Vector Machines with Gaussian kernels, with performance gains observed from introducing additional hidden layers with nonlinearities. The impact of using Nesterov's Accelerated Gradient in backpropagation is also studied. We conclude that deep neural networks are a reasonable approach for text classification and propose further ideas to improve performance.
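To make the pipeline concrete, below is a minimal sketch in Python of the kind of setup the abstract describes: TF-IDF features reduced with Latent Semantic Analysis (truncated SVD), a Gaussian-kernel SVM baseline, and a multi-layer perceptron trained with SGD using Nesterov's Accelerated Gradient. This is an illustrative assumption built on scikit-learn, not the authors' code; the toy texts, labels, layer sizes, component counts, and learning rates are placeholders, and greedy layer-wise pretraining is not included.

# Minimal sketch (assumed, not the paper's implementation) of the described pipeline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

# Hypothetical Twitter messages and weather-condition labels; the real
# dataset and label set are not reproduced here.
texts = [
    "grey skies and steady drizzle all morning",
    "sunny and hot, not a cloud in sight",
    "snow flurries and icy wind tonight",
    "humid with scattered thunderstorms rolling in",
    "overcast and cool with light rain later",
    "clear blue skies and a warm breeze",
    "heavy snowfall closing the roads",
    "lightning and downpours across the city",
]
labels = [0, 1, 2, 3, 0, 1, 2, 3]  # 0=rain, 1=sun, 2=snow, 3=storm (illustrative)

# Feature extraction: bag-of-words TF-IDF followed by LSA dimensionality reduction.
tfidf = TfidfVectorizer(lowercase=True, stop_words="english")
X_sparse = tfidf.fit_transform(texts)
lsa = TruncatedSVD(n_components=5, random_state=0)  # component count is an assumption
X = lsa.fit_transform(X_sparse)

# Baseline: Support Vector Machine with a Gaussian (RBF) kernel.
svm = SVC(kernel="rbf", gamma="scale")
svm.fit(X, labels)

# Deep model: two hidden layers with nonlinearities, trained by SGD with
# Nesterov's Accelerated Gradient enabled via nesterovs_momentum=True.
mlp = MLPClassifier(
    hidden_layer_sizes=(64, 32),
    activation="relu",
    solver="sgd",
    nesterovs_momentum=True,
    momentum=0.9,
    learning_rate_init=0.01,
    max_iter=2000,
    random_state=0,
)
mlp.fit(X, labels)

print("SVM training accuracy:", svm.score(X, labels))
print("MLP training accuracy:", mlp.score(X, labels))

On the real task, the models would be fit on a held-out split of the labeled tweets and compared with an appropriate metric, and the LSA dimensionality and network depth would be tuned rather than fixed as above.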
