Slim Embedding Layers for Recurrent Neural Language Models

11/27/2017
by Zhongliang Li, et al.

Recurrent neural language models are the state-of-the-art models for language modeling. When the vocabulary is large, the space needed to store the model parameters becomes the bottleneck for deploying recurrent neural language models. In this paper, we introduce a simple space compression method that randomly shares structured parameters at both the input and output embedding layers of a recurrent neural language model. The sharing significantly reduces the number of parameters while still compactly representing the original input and output embedding layers, and the method is easy to implement and tune. Experiments on several data sets show that the new method achieves similar perplexity and BLEU scores while using only a tiny fraction of the parameters.
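The abstract does not include code, but the core idea, composing each embedding row from randomly assigned shared sub-vectors, lends itself to a short sketch. Below is a minimal PyTorch illustration under that reading of the abstract; the class name SlimEmbedding and the hyperparameters num_blocks and pool_size are assumptions for illustration, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class SlimEmbedding(nn.Module):
    """Embedding layer whose rows are built from randomly shared sub-vectors.

    Each of the vocab_size embeddings (dimension embed_dim) is split into
    num_blocks sub-vectors of size embed_dim // num_blocks. Every (word, block)
    slot is mapped, once at construction, to one of pool_size shared sub-vectors,
    so only pool_size * (embed_dim // num_blocks) weights are stored.
    """

    def __init__(self, vocab_size, embed_dim, num_blocks, pool_size, seed=0):
        super().__init__()
        assert embed_dim % num_blocks == 0
        self.block_dim = embed_dim // num_blocks
        # Shared pool of sub-vectors: these are the only trainable parameters.
        self.pool = nn.Parameter(torch.randn(pool_size, self.block_dim) * 0.1)
        # Fixed random assignment of a pool row to each (word, block) slot.
        g = torch.Generator().manual_seed(seed)
        assignment = torch.randint(pool_size, (vocab_size, num_blocks), generator=g)
        self.register_buffer("assignment", assignment)

    def forward(self, token_ids):
        # Gather each token's assigned sub-vectors and concatenate them.
        sub = self.pool[self.assignment[token_ids]]  # (..., num_blocks, block_dim)
        return sub.flatten(start_dim=-2)             # (..., embed_dim)

# Example: a 100k-word vocabulary with 512-dim embeddings.
emb = SlimEmbedding(vocab_size=100_000, embed_dim=512,
                    num_blocks=8, pool_size=32_000)
vectors = emb(torch.tensor([[1, 42, 99_999]]))       # shape (1, 3, 512)
```

With these example numbers, a dense embedding table would store 100,000 x 512 (about 51M) weights, while the shared pool stores 32,000 x 64 (about 2M), roughly a 25x reduction; per the abstract, the same construction applies to the output embedding layer as well.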


Related research

09/26/2017 · Input-to-Output Gate to Improve RNN Language Models
This paper proposes a reinforcing method that refines the output layers ...

08/20/2016 · Using the Output Embedding to Improve Language Models
We study the topmost weight matrix of neural network language models. We...

01/27/2019 · Variational Smoothing in Recurrent Neural Network Language Models
We present a new theoretical perspective of data noising in recurrent ne...

08/16/2015 · Online Representation Learning in Recurrent Neural Language Models
We investigate an extension of continuous online learning in recurrent n...

02/04/2016 · A Factorized Recurrent Neural Network based architecture for medium to large vocabulary Language Modelling
Statistical language models are central to many applications that use se...

02/02/2015 · Scaling Recurrent Neural Network Language Models
This paper investigates the scaling properties of Recurrent Neural Netwo...

05/14/2019 · Deep Residual Output Layers for Neural Language Generation
Many tasks, including language generation, benefit from learning the str...
