Pretrained language models (PLMs) have produced substantial improvements...
Recent pre-trained language models (PLMs) achieve promising results in e...
ChatGPT, a large-scale language model based on the advanced GPT-3.5 architecture...
Fine-grained information on translation errors is helpful for the transl...
The state-of-the-art language model-based automatic metrics, e.g., BARTScore...
Transfer learning is a simple and powerful method that can be used to bo...
In this paper, we present our submission to the sentence-level MQM bench...
In this report, we present our submission to the WMT 2022 Metrics Shared...
The attention mechanism has become the dominant module in natural language processing...
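As a refresher on what this module computes, the following is a minimal NumPy sketch of scaled dot-product self-attention (single head, no learned projections, no masking); it is only an illustration of the mechanism, not code from the paper above.

```python
import numpy as np

def self_attention(x: np.ndarray) -> np.ndarray:
    """Minimal single-head scaled dot-product self-attention.

    x: (seq_len, d_model) token representations. Queries, keys and values
    are the inputs themselves here (no learned projections, no masking).
    """
    d_model = x.shape[-1]
    scores = x @ x.T / np.sqrt(d_model)             # pairwise similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ x                              # weighted mixture of values

tokens = np.random.randn(5, 8)       # 5 tokens, 8-dimensional representations
print(self_attention(tokens).shape)  # (5, 8)
```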
In this paper, we present our submission to the Shared Metrics Task: RoBLEUR...
Translation quality evaluation plays a crucial role in machine translati...
We release 70 small and discriminative test sets for machine translation...
Pre-training (PT) and back-translation (BT) are two simple and powerful ...
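The back-translation step itself can be sketched as a simple data-augmentation loop: translate target-side monolingual text back into the source language with a reverse model and append the synthetic pairs to the real parallel data. The `reverse_translate` callable below is a hypothetical stand-in for any trained target-to-source NMT model; the sketch shows only the data flow, not the setup used in the paper.

```python
from typing import Callable, List, Tuple

def back_translate(
    monolingual_tgt: List[str],
    reverse_translate: Callable[[str], str],   # hypothetical target->source model
    parallel_data: List[Tuple[str, str]],
) -> List[Tuple[str, str]]:
    """Augment parallel data with synthetic (source, target) pairs.

    Each target-side monolingual sentence is translated back into the source
    language; the synthetic pairs are concatenated with the real parallel data
    and the forward model is trained on the union.
    """
    synthetic = [(reverse_translate(tgt), tgt) for tgt in monolingual_tgt]
    return parallel_data + synthetic

# Toy usage with a placeholder reverse model (any trained NMT model would do).
augmented = back_translate(["ein Beispiel"], lambda s: "<back-translated> " + s, [])
print(augmented)  # [('<back-translated> ein Beispiel', 'ein Beispiel')]
```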
The high-quality translation results produced by machine translation (MT...
Previous studies have shown that initializing neural machine translation...
Non-autoregressive translation (NAT) significantly accelerates the infer...
Knowledge distillation (KD) is commonly used to construct synthetic data...
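For reference, the sequence-level variant of this data construction is straightforward: an autoregressive teacher re-translates the source side of the training corpus, and the NAT student is trained on the teacher's outputs in place of the original references. The `teacher_translate` callable in the sketch below is a hypothetical wrapper around such a teacher, not an interface from the paper.

```python
from typing import Callable, List, Tuple

def build_distilled_corpus(
    sources: List[str],
    teacher_translate: Callable[[str], str],   # hypothetical autoregressive teacher
) -> List[Tuple[str, str]]:
    """Sequence-level KD: pair each source sentence with the teacher's output
    rather than the original reference, yielding a simpler, more deterministic
    training corpus for the non-autoregressive student."""
    return [(src, teacher_translate(src)) for src in sources]
```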
Meta-learning has been widely shown to be beneficial for low-r...
Encoder layer fusion (EncoderFusion) is a technique to fuse all the enco...
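One simple way to realize such a fusion, assuming a learned softmax-weighted sum over the per-layer encoder outputs, is sketched in PyTorch below; this is an illustrative variant of layer fusion, not the exact formulation of the paper.

```python
import torch
import torch.nn as nn

class LayerFusion(nn.Module):
    """Softmax-weighted sum over all encoder layer outputs (one scalar per layer)."""

    def __init__(self, num_layers: int):
        super().__init__()
        self.layer_logits = nn.Parameter(torch.zeros(num_layers))

    def forward(self, layer_outputs: torch.Tensor) -> torch.Tensor:
        # layer_outputs: (num_layers, seq_len, d_model)
        weights = torch.softmax(self.layer_logits, dim=0)         # (num_layers,)
        return torch.einsum("l,lsd->sd", weights, layer_outputs)  # (seq_len, d_model)

fusion = LayerFusion(num_layers=6)
print(fusion(torch.randn(6, 10, 512)).shape)  # torch.Size([10, 512])
```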
Knowledge distillation (KD) is essential for training non-autoregressive...
Recent studies have proven that the training of neural machine translati...
System combination is an important technique for combining the hypothese...
A neural machine translation (NMT) system is expensive to train, especia...
As a special machine translation task, dialect translation has two main ...
Word embedding is central to neural machine translation (NMT), which has...
The Transformer is the state-of-the-art model in recent machine translation ...
Self-attention networks (SANs) have attracted a lot of interest due to t...
The self-attention network (SAN) has recently attracted increasing interest ...
Self-attention networks have proven to be of profound value for their stre...
This paper proposes a hierarchical attentional neural translation model ...