Using logical form encodings for unsupervised linguistic transformation: Theory and applications

02/25/2019
by Tommi Gröndahl, et al.

We present a novel method for constructing automatic linguistic transformations for a number of tasks, including controlled grammatical or lexical changes, style transfer, text generation, and machine translation. Our approach consists of creating an abstract representation of a sentence's meaning and grammar, which we use as input to an encoder-decoder network trained to reproduce the original sentence. Manipulating the abstract representation allows sentences to be transformed according to user-provided parameters, both grammatically and lexically, in any combination. The same architecture can additionally be used for controlled text generation, and even for unsupervised machine translation, where the network translates between languages using no parallel corpora beyond a lemma-level dictionary. This strategy promises to enable many tasks that were previously outside the scope of NLP techniques for lack of sufficient training data. We provide empirical evidence for the effectiveness of our approach by reproducing and transforming English sentences and evaluating the results both manually and automatically; a single unsupervised model is used for all tasks. We report BLEU scores between 55.29 and 81.82 for sentence reproduction as well as for back-and-forth grammatical transformations between 14 class pairs.
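To illustrate the core idea, the sketch below shows a minimal encoder-decoder that maps an abstract sentence representation back to its surface form. This is an assumption-laden illustration, not the paper's exact architecture: the vocabularies, layer sizes, and the toy lemma-plus-tag encoding are all hypothetical. The training objective is reconstruction of the original sentence from its own encoding, so no parallel data is required; at inference time, editing a grammatical tag before decoding would yield the transformed sentence.

```python
# Minimal sketch (hypothetical, not the authors' exact model): an
# encoder-decoder that decodes a surface sentence from an abstract
# representation of lemmas interleaved with grammatical feature tags.
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb=64, hid=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, tgt_vocab)

    def forward(self, src, tgt):
        _, h = self.encoder(self.src_emb(src))       # encode abstract representation
        dec, _ = self.decoder(self.tgt_emb(tgt), h)  # teacher-forced decoding
        return self.out(dec)                         # logits over surface tokens

# Hypothetical abstract encoding: grammatical tags plus lemma tokens.
# Flipping <TENSE=PAST> to <TENSE=PRES> before decoding would transform
# "the dog barked" into "the dog barks" without task-specific training.
SRC = {"<TENSE=PAST>": 0, "<TENSE=PRES>": 1, "dog": 2, "bark": 3}
TGT = {"<s>": 0, "</s>": 1, "the": 2, "dog": 3, "barked": 4, "barks": 5}

model = Seq2Seq(len(SRC), len(TGT))
src = torch.tensor([[SRC["<TENSE=PAST>"], SRC["dog"], SRC["bark"]]])
tgt_in = torch.tensor([[TGT["<s>"], TGT["the"], TGT["dog"], TGT["barked"]]])
logits = model(src, tgt_in)  # train with cross-entropy against the shifted target
print(logits.shape)          # torch.Size([1, 4, 6])
```

Because the abstract representation is language-neutral at the lemma level, the same reconstruction setup could in principle support the unsupervised translation use case described above, with a lemma-level dictionary bridging the source and target vocabularies.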
