Despite remarkable advancements in few-shot generalization in natural la...
Multilingual sequence-to-sequence models perform poorly with increased l...
While prior work has established that the use of parallel data is conduc...
In text generation, models that generate text from scratch one token at ...
We present M2D2, a fine-grained, massively multi-domain corpus for study...
Most existing sequence generation models produce outputs in one pass, us...
Pretrained large language models (LLMs) are widely used in many sub-fiel...
Fine-tuning reinforcement learning (RL) models has been challenging beca...
Reproducible benchmarks are crucial in driving progress of machine trans...
Despite the success of multilingual sequence-to-sequence pretraining, mo...
Many types of text style transfer can be achieved with only small, preci...
The advent of the Transformer can arguably be described as a driving for...
In this paper, we tackle the task of definition modeling, where the goal...
Document editing has become a pervasive component of production of infor...
The contrast between the need for large amounts of data for current Natu...