Multi-Task Neural Models for Translating Between Styles Within and Across Languages

06/12/2018
by Xing Niu, et al.

Generating natural language requires conveying content in an appropriate style. We explore two related tasks on generating text of varying formality: monolingual formality transfer and formality-sensitive machine translation. We propose to solve these tasks jointly using multi-task learning, and show that our models achieve state-of-the-art performance for formality transfer and are able to perform formality-sensitive translation without being explicitly trained on style-annotated translation examples.
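The abstract does not spell out how the two tasks are combined, but a common way to train a single sequence-to-sequence model on both formality transfer and translation is to prepend control tokens naming the desired output language and formality to each source sentence. The sketch below illustrates that data-preparation idea only; the tag names, the neutral-formality fallback, and the helper function are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch: building joint training data for formality transfer and
# machine translation by prefixing control tokens to the source side.
# Tag format (<2en>, <formal>, <neutral>) is an assumption for illustration.

def tag_example(src: str, tgt: str, tgt_lang: str, tgt_formality: str) -> tuple[str, str]:
    """Prefix the source with tokens naming the desired output language and formality."""
    control = f"<2{tgt_lang}> <{tgt_formality}>"
    return f"{control} {src}", tgt

# Monolingual formality-transfer pair (informal English -> formal English).
ft_example = tag_example(
    "gotta go, ttyl",
    "I have to leave now; I will talk to you later.",
    tgt_lang="en",
    tgt_formality="formal",
)

# Bilingual translation pair with no formality annotation; a neutral tag is used.
mt_example = tag_example(
    "Je dois partir maintenant.",
    "I have to leave now.",
    tgt_lang="en",
    tgt_formality="neutral",
)

# Mixing both kinds of examples in one training set is what would let a shared
# model produce formality-controlled translations it never saw explicitly paired.
training_data = [ft_example, mt_example]
print(training_data)
```

Under this setup, formality-sensitive translation at inference time amounts to requesting a tag combination (e.g. a formal English output for a French input) that never co-occurred in the training data, which is consistent with the zero-shot claim in the abstract.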

