Predefined Sparseness in Recurrent Sequence Models

08/27/2018
by Thomas Demeester et al.

Inducing sparseness while training neural networks has been shown to yield models with a lower memory footprint but similar effectiveness to dense models. However, sparseness is typically induced starting from a dense model, so this advantage does not hold during training. We propose techniques to enforce sparseness upfront in recurrent sequence models for NLP applications, so that training benefits as well. First, in language modeling, we show how to increase hidden state sizes in recurrent layers without increasing the number of parameters, leading to more expressive models. Second, for sequence labeling, we show that word embeddings with predefined sparseness perform on par with dense embeddings, at a fraction of the number of trainable parameters.
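The abstract only sketches the two constructions. As a rough illustration of the first idea (growing the hidden state without growing the parameter count), here is a minimal PyTorch sketch that fixes a block-diagonal zero pattern on the hidden-to-hidden matrix before training; the class name `BlockSparseRNNCell`, the block-diagonal pattern, and the initialization are illustrative assumptions, not necessarily the paper's exact design.

```python
import torch
import torch.nn as nn


class BlockSparseRNNCell(nn.Module):
    """Elman-style RNN cell whose hidden-to-hidden matrix is fixed,
    before training, to a block-diagonal sparsity pattern. With k
    blocks, the recurrent matrix needs k * (h/k)^2 = h^2 / k trainable
    parameters instead of h^2, so the hidden size h can grow while the
    parameter budget stays constant."""

    def __init__(self, input_size: int, hidden_size: int, num_blocks: int):
        super().__init__()
        assert hidden_size % num_blocks == 0
        self.block_size = hidden_size // num_blocks
        self.input_proj = nn.Linear(input_size, hidden_size)
        # One small dense matrix per diagonal block; entries off the
        # block diagonal are zero by construction, not by pruning.
        self.recurrent_blocks = nn.ParameterList(
            [nn.Parameter(0.01 * torch.randn(self.block_size, self.block_size))
             for _ in range(num_blocks)]
        )

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # Each block only mixes its own slice of the hidden state.
        slices = h.split(self.block_size, dim=-1)
        recurrent = torch.cat(
            [s @ w.t() for s, w in zip(slices, self.recurrent_blocks)],
            dim=-1,
        )
        return torch.tanh(self.input_proj(x) + recurrent)
```

For the second idea, a similar sketch of word embeddings with predefined sparseness, assuming a frequency-sorted vocabulary in which rarer words are assigned fewer nonzero dimensions; the helper name, the linear schedule, and the `min_frac` parameter are likewise hypothetical choices.

```python
import torch
import torch.nn as nn


def predefined_sparse_embedding(vocab_size: int, emb_dim: int,
                                min_frac: float = 0.25) -> nn.Embedding:
    """Embedding table with a fixed per-row sparsity pattern: row 0 (the
    most frequent word, assuming frequency-sorted indices) keeps all
    emb_dim entries, and the kept fraction shrinks linearly to min_frac
    for the rarest word."""
    mask = torch.zeros(vocab_size, emb_dim)
    for i in range(vocab_size):
        frac = 1.0 - (1.0 - min_frac) * i / max(vocab_size - 1, 1)
        mask[i, : max(1, round(frac * emb_dim))] = 1.0
    emb = nn.Embedding(vocab_size, emb_dim)
    emb.weight.data.mul_(mask)
    # Mask the gradient as well, so zeroed entries stay zero during
    # training: the pattern is imposed upfront, not learned or pruned.
    emb.weight.register_hook(lambda grad: grad * mask)
    return emb
```

In both sketches the zero pattern exists before the first update, so the reduced number of trainable parameters applies throughout training, which is the advantage the abstract contrasts with pruning a dense model after the fact.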
