Neutron: An Implementation of the Transformer Translation Model and its Variants

03/18/2019
by Hongfei Xu, et al.

The Transformer translation model is easier to parallelize and provides better performance than recurrent seq2seq models, which makes it popular in both industry and the research community. In this work we present Neutron, an implementation of the Transformer model and several variants from recent research. It is easy to modify, provides performance comparable to existing implementations, and offers interesting features while keeping the code readable.
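
The abstract does not walk through the architecture itself, but the core operation the Transformer is built on is scaled dot-product attention, softmax(QK^T / sqrt(d_k))V. Below is a minimal PyTorch-style sketch of that standard formulation; it is illustration code, not taken from the Neutron repository, and the function name and tensor layout are assumptions.

    import math
    import torch

    def scaled_dot_product_attention(query, key, value, mask=None):
        # query/key/value: (batch, heads, length, d_k); names are illustrative.
        d_k = query.size(-1)
        # Similarity scores between every query and key position,
        # scaled to keep softmax gradients well-behaved.
        scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
        if mask is not None:
            # Block illegal positions (e.g. padding, future tokens) before softmax.
            scores = scores.masked_fill(mask, float("-inf"))
        weights = torch.softmax(scores, dim=-1)
        # Weighted sum of the values gives the attended representation.
        return torch.matmul(weights, value)

Multi-head attention then applies this operation in parallel over several learned projections of the inputs and concatenates the results.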
