Controllable Paraphrasing and Translation with a Syntactic Exemplar

10/12/2020 ∙ by Mingda Chen, et al.

Most prior work on exemplar-based syntactically controlled paraphrase generation relies on automatically constructed large-scale paraphrase datasets. We sidestep this prerequisite by adapting models from prior work to learn solely from bilingual text (bitext). Despite only using bitext for training, and in near zero-shot conditions, our single proposed model can perform four tasks: controlled paraphrase generation in both languages and controlled machine translation in both language directions. To evaluate these tasks quantitatively, we create three novel evaluation datasets. Our experimental results show that our models achieve competitive results on controlled paraphrase generation and strong performance on controlled machine translation. Analysis shows that our models learn to disentangle semantics and syntax in their latent representations.
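The four tasks follow from a single exemplar-controlled model trained on bitext: the output language either matches the language of the semantic (content) input, giving controlled paraphrasing, or differs from it, giving controlled translation. A minimal sketch of that enumeration, assuming an illustrative language pair (`en`/`fr` here is an assumption, not necessarily the paper's bitext), with a hypothetical helper name:

```python
# Illustrative sketch only -- not the paper's code. With one model whose
# content input and syntactic exemplar can each be in either language of
# the bitext, the four tasks in the abstract are the four combinations
# of content-source language and output language.
def task_name(content_lang: str, output_lang: str) -> str:
    """Same language in and out -> controlled paraphrasing;
    different languages -> controlled translation."""
    if content_lang == output_lang:
        return f"controlled paraphrasing ({output_lang})"
    return f"controlled translation ({content_lang}->{output_lang})"

# Enumerate both languages of a hypothetical English/French bitext:
for src in ("en", "fr"):
    for out in ("en", "fr"):
        print(f"{src} content, {out} output -> {task_name(src, out)}")
```

Running the loop prints two paraphrasing tasks (en→en, fr→fr) and two translation tasks (en→fr, fr→en), matching the four tasks claimed in the abstract.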






Code Repositories


Code, data, and pretrained models for the paper "Controllable Paraphrasing and Translation with a Syntactic Exemplar"
