Decontextualization: Making Sentences Stand-Alone

02/09/2021
by Eunsol Choi, et al.

Models for question answering, dialogue agents, and summarization often interpret the meaning of a sentence in a rich context and use that meaning in a new context. Taking excerpts of text can be problematic, as key pieces may not be explicit in a local window. We isolate and define the problem of sentence decontextualization: taking a sentence together with its context and rewriting it to be interpretable out of context, while preserving its meaning. We describe an annotation procedure, collect data on the Wikipedia corpus, and use the data to train models to automatically decontextualize sentences. We present preliminary studies that show the value of sentence decontextualization in a user-facing task, and as preprocessing for systems that perform document understanding. We argue that decontextualization is an important subtask in many downstream applications, and that the definitions and resources provided can benefit tasks that operate on sentences that occur in a richer context.
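
To make the task concrete, the following is a minimal sketch (not the authors' released code) of how decontextualization can be framed as sequence-to-sequence generation with an off-the-shelf encoder-decoder model. The checkpoint name, the [SEP]-delimited input format, and the example sentences are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: decontextualization as seq2seq generation.
# The checkpoint below is a placeholder for a model fine-tuned on
# decontextualization data; the input format is an assumption.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

checkpoint = "your-decontextualization-model"  # hypothetical checkpoint name
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

page_title = "Mount Everest"
context = "Mount Everest is Earth's highest mountain above sea level."
sentence = "It attracts many climbers, some of them highly experienced."

# Give the model the enclosing context together with the target sentence,
# so it can resolve references ("It" -> "Mount Everest") and rewrite the
# sentence to stand on its own.
source = f"{page_title} [SEP] {context} [SEP] {sentence}"
inputs = tokenizer(source, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
# Illustrative target output:
# "Mount Everest attracts many climbers, some of them highly experienced."
```

In this framing, the rewritten sentence preserves the original meaning while making referents explicit, which is what lets it be reused as an excerpt in a new context.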


Related research

09/28/2021 - Text Simplification for Comprehension-based Question-Answering
  Text simplification is the process of splitting and rephrasing a sentenc...

10/10/2021 - What Makes Sentences Semantically Related: A Textual Relatedness Dataset and Empirical Study
  The degree of semantic relatedness (or, closeness in meaning) of two uni...

01/04/2023 - Text sampling strategies for predicting missing bibliographic links
  The paper proposes various strategies for sampling text data when perfor...

06/01/2018 - Some of Them Can be Guessed! Exploring the Effect of Linguistic Context in Predicting Quantifiers
  We study the role of linguistic context in predicting quantifiers (`few'...

12/02/2019 - Fiction Sentence Expansion and Enhancement via Focused Objective and Novelty Curve Sampling
  We describe the task of sentence expansion and enhancement, in which a s...

09/07/2018 - Unsupervised Sentence Compression using Denoising Auto-Encoders
  In sentence compression, the task of shortening sentences while retainin...

12/20/2022 - Measure More, Question More: Experimental Studies on Transformer-based Language Models and Complement Coercion
  Transformer-based language models have shown strong performance on an ar...
