TegTok: Augmenting Text Generation via Task-specific and Open-world Knowledge

03/16/2022
by   Chao-Hong Tan, et al.

Generating natural and informative text has been a long-standing problem in NLP. Much effort has been dedicated to incorporating pre-trained language models (PLMs) with various kinds of open-world knowledge, such as knowledge graphs or wiki pages. However, their ability to access and manipulate task-specific knowledge on downstream tasks is still limited, as this type of knowledge is usually not well covered by PLMs and is hard to acquire. To address this problem, we propose augmenting TExt Generation via Task-specific and Open-world Knowledge (TegTok) in a unified framework. Our model selects knowledge entries from the two types of knowledge sources through dense retrieval and then injects them into the input encoding and output decoding stages, respectively, on the basis of PLMs. With the help of these two types of knowledge, our model can learn what and how to generate. Experiments on two text generation tasks, dialogue generation and question generation, each on its own dataset, show that our method outperforms various baseline models.
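The retrieval step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: TegTok uses trained PLM-based dense encoders, whereas here a toy hashed bag-of-words embedding stands in for the encoder, and the `embed`/`retrieve` names and the example entries are invented for the sketch. The ranking logic — embed the query and every knowledge entry into dense vectors, score by dot product, keep the top-k entries for injection into the generator — is the part that mirrors dense retrieval.

```python
import math
import zlib


def embed(text, dim=64):
    """Toy stand-in for a dense PLM encoder: a hashed bag-of-words
    vector, L2-normalized so that dot product equals cosine similarity."""
    vec = [0.0] * dim
    for token in text.lower().split():
        vec[zlib.crc32(token.strip(".,!?").encode()) % dim] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]


def retrieve(query, knowledge_entries, k=2):
    """Rank knowledge entries by dot-product similarity to the query
    and return the top-k, as a dense retriever would."""
    q = embed(query)
    scored = [(sum(a * b for a, b in zip(q, embed(entry))), entry)
              for entry in knowledge_entries]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [entry for _, entry in scored[:k]]


entries = [
    "Paris is the capital of France.",
    "The Eiffel Tower is in Paris.",
    "Python is a programming language.",
]
print(retrieve("capital of France", entries, k=1))
```

In the full model, the retrieved entries would then be concatenated with the input during encoding (task-specific knowledge) or attended to during decoding (open-world knowledge); this sketch covers only the selection stage.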


