Echo State Neural Machine Translation

02/27/2020
by Ankush Garg, et al.

We present neural machine translation (NMT) models inspired by the echo state network (ESN), named Echo State NMT (ESNMT), in which the encoder and decoder layer weights are randomly generated and then fixed throughout training. We show that even with this extremely simple model construction and training procedure, ESNMT can already reach 70-80% of the quality of fully trainable baselines. We also examine how the spectral radius of the reservoir, a key quantity that characterizes the model, determines its behavior. Our findings indicate that randomized networks can work well even for complicated sequence-to-sequence prediction NLP tasks.
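To illustrate the core idea, here is a minimal sketch, not the authors' implementation: a recurrent "reservoir" whose weights are randomly initialized, rescaled to a chosen spectral radius, and then frozen, with only a small readout layer left trainable. The class and parameter names (EchoStateEncoder, spectral_radius, hidden_size) are illustrative assumptions, not names from the paper.

```python
import torch
import torch.nn as nn

class EchoStateEncoder(nn.Module):
    """Toy echo-state-style encoder: fixed random recurrence, trainable readout."""

    def __init__(self, input_size, hidden_size, spectral_radius=0.9):
        super().__init__()
        w_in = torch.randn(hidden_size, input_size) * 0.1
        w_rec = torch.randn(hidden_size, hidden_size)
        # Rescale the recurrent matrix so its largest |eigenvalue| equals the
        # desired spectral radius -- the key quantity studied in the paper.
        eigvals = torch.linalg.eigvals(w_rec)
        w_rec = w_rec * (spectral_radius / eigvals.abs().max())
        # Buffers, not parameters: the reservoir stays fixed during training.
        self.register_buffer("w_in", w_in)
        self.register_buffer("w_rec", w_rec)
        # Only the readout (and, in a full model, embeddings/softmax) is trained.
        self.readout = nn.Linear(hidden_size, hidden_size)

    def forward(self, x):  # x: (seq_len, batch, input_size)
        h = torch.zeros(x.size(1), self.w_rec.size(0), device=x.device)
        states = []
        for x_t in x:  # simple reservoir recurrence with frozen weights
            h = torch.tanh(x_t @ self.w_in.T + h @ self.w_rec.T)
            states.append(self.readout(h))
        return torch.stack(states)
```

In this sketch the spectral radius acts as a knob on how quickly past inputs fade from the reservoir state, which is why the paper treats it as the characteristic quantity of the model.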


Related research

06/09/2020  Universal Vector Neural Machine Translation With Effective Attention
Neural Machine Translation (NMT) leverages one or more trained neural ne...

03/19/2016  Tree-to-Sequence Attentional Neural Machine Translation
Most of the existing Neural Machine Translation (NMT) models focus on th...

11/23/2022  Rank-One Editing of Encoder-Decoder Models
Large sequence to sequence models for tasks such as Neural Machine Trans...

11/04/2019  On Compositionality in Neural Machine Translation
We investigate two specific manifestations of compositionality in Neural...

04/08/2022  C-NMT: A Collaborative Inference Framework for Neural Machine Translation
Collaborative Inference (CI) optimizes the latency and energy consumptio...

03/16/2022  Understanding and Improving Sequence-to-Sequence Pretraining for Neural Machine Translation
In this paper, we present a substantial step in better understanding the...

09/06/2017  Information-Propogation-Enhanced Neural Machine Translation by Relation Model
Even though sequence-to-sequence neural machine translation (NMT) model ...
