A Semi-Supervised Approach for Low-Resourced Text Generation

06/03/2019
by Hongyu Zang, et al.

Encoder-decoder neural models have recently achieved great success on text generation tasks. However, one problem with such models is that their performance is usually limited by the scale of well-labeled data, which is expensive to obtain. This low-resource problem (of labeled data) is common across text generation tasks, while unlabeled data are usually abundant. In this paper, we propose a method that exploits unlabeled data to improve the performance of such models in low-resource settings. We use a denoising auto-encoder (DAE) and language-model (LM) based reinforcement learning (RL) to enhance the training of the encoder and decoder with unlabeled data. Our method adapts to different text generation tasks and yields significant improvements over basic text generation models.
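The paper does not give implementation details in this abstract, but the two ingredients it names are standard and can be sketched. A minimal, hypothetical illustration: a DAE corrupts unlabeled sentences (word dropout plus a bounded local shuffle) so the encoder-decoder learns to reconstruct the clean text, and an LM-based RL reward scores generated sequences by their mean token log-probability under a pretrained language model. The function names, noise parameters, and reward definition below are assumptions for illustration, not the authors' exact method.

```python
import random


def corrupt(tokens, drop_prob=0.1, shuffle_k=3, rng=None):
    """Noise function for DAE training (illustrative): randomly drop
    words, then locally shuffle so each surviving token moves at most
    shuffle_k - 1 positions from its original place."""
    rng = rng or random.Random(0)
    kept = [t for t in tokens if rng.random() > drop_prob]
    # Assign each token a jittered sort key; sorting yields a bounded
    # local permutation of the kept tokens.
    keys = [i + rng.uniform(0, shuffle_k) for i in range(len(kept))]
    return [t for _, t in sorted(zip(keys, kept), key=lambda p: p[0])]


def lm_reward(tokens, lm_logprob):
    """LM-based RL reward (illustrative): mean per-token log-probability
    under a pretrained language model. Higher means more fluent output;
    lm_logprob is any callable mapping a token to its log-probability."""
    if not tokens:
        return 0.0
    return sum(lm_logprob(t) for t in tokens) / len(tokens)
```

In a full pipeline, the encoder-decoder would be pretrained to map `corrupt(x)` back to `x` on unlabeled text, and `lm_reward` would serve as the scalar reward in a policy-gradient update (e.g. REINFORCE) on sampled generations.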


