Character-Aware Neural Language Models

08/26/2015
by Yoon Kim, et al.

We describe a simple neural language model that relies only on character-level inputs. Predictions are still made at the word level. Our model employs a convolutional neural network (CNN) and a highway network over characters, whose output is given to a long short-term memory (LSTM) recurrent neural network language model (RNN-LM). On the English Penn Treebank the model is on par with the existing state of the art despite having 60% fewer parameters. On languages with rich morphology (Arabic, Czech, French, German, Spanish, Russian), the model outperforms word-level/morpheme-level LSTM baselines, again with fewer parameters. The results suggest that in many languages, character inputs are sufficient for language modeling. Analysis of word representations obtained from the character composition part of the model reveals that the model is able to encode, from characters only, both semantic and orthographic information.
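To illustrate one component of the architecture, a highway layer passes its input through a learned mix of a nonlinear transform and an identity "carry" path: y = t * g(W_H x + b_H) + (1 - t) * x, where the gate t = sigmoid(W_T x + b_T). The pure-Python sketch below uses tiny hypothetical weights (not the paper's trained parameters) just to show the computation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    return max(0.0, x)

def highway(x, W_h, b_h, W_t, b_t):
    """One highway layer over vector x:
    y = t * relu(W_h x + b_h) + (1 - t) * x,
    where t = sigmoid(W_t x + b_t) is the transform gate."""
    n = len(x)
    y = []
    for i in range(n):
        h = relu(sum(W_h[i][j] * x[j] for j in range(n)) + b_h[i])
        t = sigmoid(sum(W_t[i][j] * x[j] for j in range(n)) + b_t[i])
        y.append(t * h + (1.0 - t) * x[i])
    return y

# Toy example: identity transform weights, gate bias pushed very negative,
# so the layer behaves almost like an identity (carry) connection.
W_h = [[1.0, 0.0], [0.0, 1.0]]
b_h = [0.0, 0.0]
W_t = [[0.0, 0.0], [0.0, 0.0]]
b_t = [-10.0, -10.0]
y = highway([1.0, -2.0], W_h, b_h, W_t, b_t)
```

In the paper, negatively initializing the gate bias in this way biases the network toward carry behavior early in training; with a large positive bias the layer instead passes through the transformed activation.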


Related research

06/06/2016: Gated Word-Character Recurrent Language Model
We introduce a recurrent neural network language model (RNN-LM) with lon...

09/02/2017: Patterns versus Characters in Subword-aware Neural Language Modeling
Words in some natural languages can have a composite structure. Elements...

04/10/2019: Detecting Cybersecurity Events from Noisy Short Text
It is very critical to analyze messages shared over social networks for ...

02/08/2017: Character-level Deep Conflation for Business Data Analytics
Connecting different text attributes associated with the same entity (co...

11/19/2015: Alternative structures for character-level RNNs
Recurrent neural networks are convenient and efficient models for langua...

11/22/2019: Learning Multi-level Dependencies for Robust Word Recognition
Robust language processing systems are becoming increasingly important g...

08/18/2017: Syllable-level Neural Language Model for Agglutinative Language
Language models for agglutinative languages have always been hindered in...
