Controllable Paraphrasing and Translation with a Syntactic Exemplar

10/12/2020
by Mingda Chen et al.

Most prior work on exemplar-based syntactically controlled paraphrase generation relies on automatically constructed, large-scale paraphrase datasets. We sidestep this prerequisite by adapting models from prior work so that they learn solely from bilingual text (bitext). Despite being trained only on bitext, and in near zero-shot conditions, our single proposed model can perform four tasks: controlled paraphrase generation in both languages and controlled machine translation in both language directions. To evaluate these tasks quantitatively, we create three novel evaluation datasets. Our experimental results show that our models achieve competitive results on controlled paraphrase generation and strong performance on controlled machine translation. Analysis shows that our models learn to disentangle semantics and syntax in their latent representations.
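The control mechanism described in the abstract works by encoding the source sentence into a semantic representation and the exemplar into a syntactic representation, then decoding from both, so the output preserves the source's meaning while following the exemplar's syntax. Below is a minimal PyTorch sketch of this idea; it is not the authors' mVGVAE implementation, and the class name, layer sizes, and toy inputs are illustrative assumptions only.

import torch
import torch.nn as nn


class DisentangledSeq2Seq(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Two independent encoders: one intended to capture meaning, one syntax.
        self.semantic_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.syntactic_enc = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # The decoder is conditioned on the concatenation of the two codes.
        self.decoder = nn.GRU(emb_dim, 2 * hid_dim, batch_first=True)
        self.out = nn.Linear(2 * hid_dim, vocab_size)

    def encode(self, tokens):
        emb = self.embed(tokens)
        _, sem = self.semantic_enc(emb)   # final hidden state: (1, batch, hid_dim)
        _, syn = self.syntactic_enc(emb)
        return sem.squeeze(0), syn.squeeze(0)

    def forward(self, source, exemplar, target_in):
        # Semantic code from the source sentence, syntactic code from the exemplar.
        sem, _ = self.encode(source)
        _, syn = self.encode(exemplar)
        state = torch.cat([sem, syn], dim=-1).unsqueeze(0)  # decoder init state
        dec_out, _ = self.decoder(self.embed(target_in), state)
        return self.out(dec_out)  # token logits


# Toy usage with random token ids (hypothetical vocabulary of 1000 types).
model = DisentangledSeq2Seq(vocab_size=1000)
source = torch.randint(0, 1000, (2, 7))     # supplies the meaning
exemplar = torch.randint(0, 1000, (2, 9))   # supplies the syntax
target_in = torch.randint(0, 1000, (2, 8))  # shifted target for teacher forcing
logits = model(source, exemplar, target_in)
print(logits.shape)  # torch.Size([2, 8, 1000])

Note that the repository name mVGVAE suggests a variational (VAE-based) formulation with latent semantic and syntactic variables; the deterministic encoders above are a simplification for illustration.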


Related Research

06/03/2019
Controllable Paraphrase Generation with a Syntactic Exemplar
Prior work on controllable text generation usually assumes that the cont...

06/15/2021
Language Tags Matter for Zero-Shot Neural Machine Translation
Multilingual Neural Machine Translation (MNMT) has aroused widespread in...

09/09/2022
Adapting to Non-Centered Languages for Zero-shot Multilingual Translation
Multilingual neural machine translation can translate unseen language pa...

09/10/2021
Rethinking Zero-shot Neural Machine Translation: From a Perspective of Latent Variables
Zero-shot translation, directly translating between language pairs unsee...

12/30/2020
Improving Zero-Shot Translation by Disentangling Positional Information
Multilingual neural machine translation has shown the capability of dire...

02/01/2022
Examining Scaling and Transfer of Language Model Architectures for Machine Translation
Natural language understanding and generation models follow one of the t...

05/12/2022
Exploiting Inductive Bias in Transformers for Unsupervised Disentanglement of Syntax and Semantics with VAEs
We propose a generative model for text generation, which exhibits disent...

Code Repositories

mVGVAE

Code, data, and pretrained models for the paper "Controllable Paraphrasing and Translation with a Syntactic Exemplar"
