RenewNAT: Renewing Potential Translation for Non-Autoregressive Transformer

03/14/2023
by Pei Guo, et al.

Non-autoregressive neural machine translation (NAT) models are proposed to accelerate the inference process while maintaining relatively high performance. However, it is difficult for existing NAT models to achieve the desired efficiency-quality trade-off: fully NAT models with efficient inference perform worse than their autoregressive counterparts, while iterative NAT models achieve comparable performance but forfeit much of the speed advantage. In this paper, we propose RenewNAT, a flexible framework with high efficiency and effectiveness that combines the merits of fully and iterative NAT models. RenewNAT first generates potential translation results and then renews them in a single pass. It achieves significant performance improvements at the same cost as traditional NAT models, introducing no additional model parameters or decoding latency. Experimental results on various translation benchmarks (e.g., four WMT benchmarks) show that our framework consistently improves the performance of strong fully NAT methods (e.g., GLAT and DSLP) without additional speed overhead.
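The abstract describes a two-stage, single-pass decode: first produce a potential (draft) translation, then renew it within the same forward call. Below is a minimal PyTorch sketch of that idea, assuming one shared non-autoregressive decoder stack applied twice; all names (DraftRenewDecoder, the dimensions, the all-placeholder decoder input) are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal sketch of the draft-then-renew idea from the abstract.
# Hypothetical names and sizes; not the paper's actual implementation.
import torch
import torch.nn as nn


class DraftRenewDecoder(nn.Module):
    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        # One decoder stack reused for both passes, so the renewal step adds
        # no parameters beyond a plain NAT decoder (an assumption consistent
        # with the abstract's "no additional model parameters" claim).
        self.decoder = nn.TransformerDecoder(layer, num_layers)
        self.proj = nn.Linear(d_model, vocab_size)

    def forward(self, decoder_inputs, encoder_out):
        # Pass 1: predict a potential (draft) translation for all target
        # positions in parallel, non-autoregressively.
        h = self.decoder(self.embed(decoder_inputs), encoder_out)
        draft = self.proj(h).argmax(-1)
        # Pass 2: renew the draft inside the same forward call by
        # re-decoding, now conditioned on the draft tokens themselves.
        h = self.decoder(self.embed(draft), encoder_out)
        return self.proj(h), draft


# Toy usage: batch of 2, source length 8, target length 6, vocab of 100.
model = DraftRenewDecoder(vocab_size=100)
encoder_out = torch.randn(2, 8, 256)                 # stand-in encoder states
placeholders = torch.zeros(2, 6, dtype=torch.long)   # all-placeholder inputs
logits, draft = model(placeholders, encoder_out)
print(logits.shape, draft.shape)  # torch.Size([2, 6, 100]) torch.Size([2, 6])
```

The sketch only illustrates that both stages fit in one forward pass; the paper may instead split drafting and renewal across the lower and upper halves of a single decoder stack, which would preserve the same parameter and latency budget.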

Related research
10/14/2021

Non-Autoregressive Translation with Layer-Wise Prediction and Deep Supervision

How do we perform efficient inference while retaining high translation q...

12/31/2020

Fully Non-autoregressive Neural Machine Translation: Tricks of the Trade

Fully non-autoregressive neural machine translation (NAT) is proposed to...

01/22/2021

Enriching Non-Autoregressive Transformer with Syntactic and Semantic Structures for Neural Machine Translation

The non-autoregressive models have boosted the efficiency of neural mach...

10/11/2022

Viterbi Decoding of Directed Acyclic Transformer for Non-Autoregressive Machine Translation

Non-autoregressive models achieve significant decoding speedup in neural...

01/27/2023

Candidate Soups: Fusing Candidate Results Improves Translation Quality for Non-Autoregressive Translation

Non-autoregressive translation (NAT) model achieves a much faster infere...

04/28/2022

Neighbors Are Not Strangers: Improving Non-Autoregressive Translation under Low-Frequency Lexical Constraints

However, current autoregressive approaches suffer from high latency. In ...

05/25/2023

Revisiting Non-Autoregressive Translation at Scale

In real-world systems, scaling has been critical for improving the trans...
