An Augmented Transformer Architecture for Natural Language Generation Tasks

10/30/2019
by Hailiang Li, et al.

Transformer-based neural networks have shown significant advantages on most evaluations of natural language processing and other sequence-to-sequence tasks, owing to the inherent strengths of their architecture. Although the main architecture of the Transformer has been continuously explored, little attention has been paid to the positional encoding module. In this paper, we enhance the sinusoidal positional encoding algorithm by maximizing the variance between encoded consecutive positions, yielding an additional improvement. Furthermore, we propose an augmented Transformer architecture that encodes additional linguistic knowledge, such as Part-of-Speech (POS) tags, to boost performance on natural language generation tasks, e.g., machine translation and summarization. Experiments show that the proposed architecture consistently outperforms the vanilla Transformer.
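The abstract does not spell out the variance-maximizing modification, so as background only, here is a minimal sketch of the standard sinusoidal positional encoding (from the original Transformer) that the paper takes as its starting point; the function name and shapes are illustrative, not from the paper:

```python
import numpy as np

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Standard sinusoidal positional encoding:
       PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
       PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    Returns an array of shape (max_len, d_model)."""
    positions = np.arange(max_len)[:, None]       # (max_len, 1)
    dims = np.arange(0, d_model, 2)[None, :]      # (1, d_model // 2)
    angles = positions / np.power(10000.0, dims / d_model)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)                  # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)                  # odd dimensions: cosine
    return pe
```

These fixed vectors are added to the token embeddings so the model can distinguish positions; the paper's contribution is a modified encoding that increases the variance between consecutive position vectors.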


