Revisiting Activation Regularization for Language RNNs

08/03/2017
by Stephen Merity, et al.

Recurrent neural networks (RNNs) serve as a fundamental building block for many sequence tasks across natural language processing. Recent research has focused on recurrent dropout techniques or custom RNN cells to improve performance, both of which can require substantial modifications to the machine learning model or to the underlying RNN configuration. We revisit traditional regularization techniques, specifically L2 regularization on RNN activations and slowness regularization over successive hidden states, to improve the performance of RNNs on the task of language modeling. Both techniques require minimal modification to existing RNN architectures and yield performance improvements comparable to or better than more complicated regularization techniques or custom cell architectures. These regularization techniques can also be used without any modification on optimized LSTM implementations such as the NVIDIA cuDNN LSTM.
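As a concrete illustration, below is a minimal sketch (not the authors' released code) of how the two penalties can be added to a standard PyTorch LSTM language-model loss. The function name regularized_loss and the default values of alpha and beta are hypothetical; the paper applies the activation penalty to the dropout-masked outputs, which is omitted here for brevity.

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

def regularized_loss(logits, targets, hidden_states, alpha=2.0, beta=1.0):
    # hidden_states: RNN output activations of shape (seq_len, batch, hidden),
    # e.g. the first return value of torch.nn.LSTM.
    loss = criterion(logits, targets)
    # Activation regularization (AR): L2 penalty on the magnitude
    # of the hidden activations themselves.
    ar = alpha * hidden_states.pow(2).mean()
    # Temporal activation regularization (TAR, "slowness"): L2 penalty on
    # the difference between successive hidden states.
    tar = beta * (hidden_states[1:] - hidden_states[:-1]).pow(2).mean()
    return loss + ar + tar

Because both penalties act only on the RNN's output activations, not on its internal gates or weights, they can be applied unchanged to black-box implementations such as the cuDNN LSTM.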

Related research

05/03/2018 · Noisin: Unbiased Regularization for Recurrent Neural Networks
Recurrent neural networks (RNNs) are powerful models of sequential data....

11/26/2015 · Regularizing RNNs by Stabilizing Activations
We stabilize the activations of Recurrent Neural Networks (RNNs) by pena...

07/11/2018 · Iterative evaluation of LSTM cells
In this work we present a modification in the conventional flow of infor...

07/25/2019 · Adaptive Noise Injection: A Structure-Expanding Regularization for RNN
The vanilla LSTM has become one of the most potential architectures in w...

09/01/2018 · Finding the Answers with Definition Models
Inspired by a previous attempt to answer crossword questions using neura...

03/02/2023 · DeepSeer: Interactive RNN Explanation and Debugging via State Abstraction
Recurrent Neural Networks (RNNs) have been widely used in Natural Langua...

03/20/2023 · Investigating Topological Order using Recurrent Neural Networks
Recurrent neural networks (RNNs), originally developed for natural langu...
