Tying Word Vectors and Word Classifiers: A Loss Framework for Language Modeling

11/04/2016
by Hakan Inan, et al.

Recurrent neural networks have been very successful at predicting sequences of words in tasks such as language modeling. However, all such models are based on the conventional classification framework, where the model is trained against one-hot targets and each word is represented both as an input and as an output in isolation. This causes inefficiencies in learning, both in terms of utilizing all of the available information and in terms of the number of parameters that must be trained. We introduce a novel theoretical framework that facilitates better learning in language modeling, and show that our framework leads to tying together the input embedding and the output projection matrices, greatly reducing the number of trainable variables. Our framework leads to state-of-the-art performance on the Penn Treebank with a variety of network models.
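The tying described in the abstract can be illustrated with a minimal sketch (not the authors' implementation; PyTorch, the TiedLanguageModel class name, and the layer sizes are illustrative assumptions): the output classifier's weight matrix is pointed at the input embedding matrix, so a single vocab-by-embedding parameter block serves both roles.

    # Minimal weight-tying sketch, assuming PyTorch; sizes are illustrative.
    import torch
    import torch.nn as nn

    class TiedLanguageModel(nn.Module):
        def __init__(self, vocab_size: int, emb_size: int = 650, num_layers: int = 2):
            super().__init__()
            self.encoder = nn.Embedding(vocab_size, emb_size)   # input word vectors
            self.rnn = nn.LSTM(emb_size, emb_size, num_layers, batch_first=True)
            self.decoder = nn.Linear(emb_size, vocab_size)      # output word classifier
            # Tie the classifier to the embedding: both share one weight matrix,
            # halving the vocabulary-dependent parameters.
            self.decoder.weight = self.encoder.weight

        def forward(self, tokens, hidden=None):
            x = self.encoder(tokens)              # (batch, seq, emb)
            out, hidden = self.rnn(x, hidden)     # (batch, seq, emb)
            logits = self.decoder(out)            # (batch, seq, vocab)
            return logits, hidden

Note that this sharing requires the RNN's output dimension to match the embedding dimension (or an extra projection in between), since the same matrix maps in both directions between word indices and hidden space.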


Related research

09/28/2018 · Adaptive Input Representations for Neural Language Modeling
We introduce adaptive input representations for neural language modeling...

04/26/2019 · Think Again Networks, the Delta Loss, and an Application in Language Modeling
This short paper introduces an abstraction called Think Again Networks (...

08/24/2017 · A Study on Neural Network Language Modeling
An exhaustive study on neural network language modeling (NNLM) is perfor...

12/22/2021 · The Importance of the Current Input in Sequence Modeling
The last advances in sequence modeling are mainly based on deep learning...

01/14/2020 · Block-wise Dynamic Sparseness
Neural networks have achieved state of the art performance across a wide...

08/27/2018 · Predefined Sparseness in Recurrent Sequence Models
Inducing sparseness while training neural networks has been shown to yie...

08/14/2018 · Improved Language Modeling by Decoding the Past
Highly regularized LSTMs that model the auto-regressive conditional fact...
