TransfoRNN: Capturing the Sequential Information in Self-Attention Representations for Language Modeling

04/04/2021
by   Tze Yuang Chong, et al.

In this paper, we describe the use of recurrent neural networks to capture sequential information from self-attention representations and thereby improve Transformers. Although the self-attention mechanism provides a means to exploit long context, the sequential information, i.e. the arrangement of tokens, is not explicitly captured. We propose to cascade recurrent neural networks onto the Transformer, a model we refer to as TransfoRNN, to capture this sequential information. We found that a TransfoRNN model consisting of only a shallow Transformer stack suffices to give comparable, if not better, performance than a deeper Transformer model. Evaluated on the Penn Treebank and WikiText-2 corpora, the proposed TransfoRNN model showed lower perplexities with fewer model parameters: on the Penn Treebank corpus, perplexities were reduced by up to 5.5, and on WikiText-2 by up to 10.5. The model also reduced error rates by up to 2.2 on the LibriSpeech speech recognition task, showing results comparable with the Transformer models.
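The cascade described above, an explicit recurrent pass applied on top of self-attention outputs, can be sketched minimally as follows. This is an illustrative sketch only, not the authors' exact architecture: it uses a single attention head, random untrained weights, and a vanilla tanh RNN cell in place of whatever recurrent cell the paper actually employs, and it omits embeddings, feed-forward sublayers, normalization, and the output softmax.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(X, Wq, Wk, Wv):
    # X: (T, d) token representations. Attention mixes positions as a
    # weighted sum, so token order is only implicit in the result.
    T = X.shape[0]
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    scores += np.triu(np.full((T, T), -np.inf), k=1)  # causal mask for LM
    return softmax(scores) @ V

def rnn_over(H, Wx, Wh, b):
    # Vanilla RNN cascaded onto the attention outputs H: (T, d).
    # The left-to-right recurrence re-introduces explicit sequential order.
    h = np.zeros(Wh.shape[0])
    outs = []
    for t in range(H.shape[0]):
        h = np.tanh(H[t] @ Wx + h @ Wh + b)
        outs.append(h)
    return np.stack(outs)

rng = np.random.default_rng(0)
T, d, d_h = 5, 8, 8  # toy sizes, chosen for illustration
X = rng.standard_normal((T, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
H = causal_self_attention(X, Wq, Wk, Wv)
Y = rnn_over(H,
             rng.standard_normal((d, d_h)) * 0.1,
             rng.standard_normal((d_h, d_h)) * 0.1,
             np.zeros(d_h))
print(Y.shape)  # (5, 8)
```

In a full model, `Y` would feed the output projection and softmax of the language model; stacking several Transformer layers before the recurrent pass gives the shallow-stack variants discussed in the abstract.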


Related research:

05/10/2019 · Language Modeling with Deep Transformers
We explore multi-layer autoregressive Transformer models in language mod...

06/08/2021 · Staircase Attention for Recurrent Processing of Sequences
Attention mechanisms have become a standard tool for sequence modeling t...

12/29/2022 · Unsupervised construction of representations for oil wells via Transformers
Determining and predicting reservoir formation properties for newly dril...

04/17/2019 · Dynamic Evaluation of Transformer Language Models
This research note combines two methods that have recently improved the ...

11/10/2019 · Improving Transformer Models by Reordering their Sublayers
Multilayer transformer networks consist of interleaved self-attention an...

07/05/2019 · A Bi-directional Transformer for Musical Chord Recognition
Chord recognition is an important task since chords are highly abstract ...

07/24/2022 · A Cognitive Study on Semantic Similarity Analysis of Large Corpora: A Transformer-based Approach
Semantic similarity analysis and modeling is a fundamentally acclaimed t...
