Self-Attention for Incomplete Utterance Rewriting

02/24/2022
by Yong Zhang, et al.

Incomplete utterance rewriting (IUR) has recently become an essential task in NLP, aiming to complete an incomplete utterance with sufficient context information for comprehension. In this paper, we propose a novel method that directly extracts coreference and omission relationships from the self-attention weight matrix of the transformer, rather than from word embeddings, and edits the original text accordingly to generate the complete utterance. Benefiting from the rich information in the self-attention weight matrix, our method achieves competitive results on public IUR datasets.
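The sketch below illustrates the general idea of reading relationships off a transformer's self-attention weight matrix; it is not the authors' implementation. The model choice (bert-base-uncased), the use of the head-averaged last-layer attention, and the simple "link each utterance token to its most-attended context token" heuristic are assumptions made purely for illustration.

```python
# Minimal sketch: inspect the self-attention weight matrix of a pretrained
# transformer and, for each token in the current (incomplete) utterance,
# find the context token it attends to most strongly. Model, layer choice,
# and the argmax-linking heuristic are illustrative assumptions only.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)
model.eval()

context = "Did you watch the new movie last night?"
utterance = "Yes, I loved it."  # "it" should be resolved to "the new movie"

# Encode context and utterance together as a sentence-pair input.
inputs = tokenizer(context, utterance, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions: one tensor per layer of shape (batch, heads, seq, seq).
# Average the heads of the last layer to get a single token-to-token matrix.
attn = outputs.attentions[-1].mean(dim=1)[0]  # (seq_len, seq_len)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
segment = inputs["token_type_ids"][0]  # 0 = context, 1 = current utterance

# Only consider attention from utterance tokens toward non-special context tokens.
ctx_mask = (segment == 0) & torch.tensor(
    [t not in ("[CLS]", "[SEP]") for t in tokens]
)

for i, (tok, seg) in enumerate(zip(tokens, segment)):
    if seg == 1 and tok != "[SEP]":
        scores = attn[i].masked_fill(~ctx_mask, float("-inf"))
        j = int(scores.argmax())
        print(f"{tok!r} attends most to context token {tokens[j]!r} "
              f"(weight {float(attn[i, j]):.3f})")
```

In this toy setting one would hope that a token such as "it" places high attention mass on "movie", suggesting a coreference link that an edit-based rewriter could use to substitute the referent into the utterance.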

Related research

- Mining Clues from Incomplete Utterance: A Query-enhanced Network for Incomplete Utterance Rewriting (07/03/2023)
- Incomplete Utterance Rewriting as Semantic Segmentation (09/28/2020)
- Sumformer: A Linear-Complexity Alternative to Self-Attention for Speech Recognition (07/12/2023)
- Incomplete Utterance Rewriting as Sequential Greedy Tagging (07/08/2023)
- FsaNet: Frequency Self-attention for Semantic Segmentation (11/28/2022)
- Enhancing Out-Of-Domain Utterance Detection with Data Augmentation Based on Word Embeddings (11/24/2019)
- iSarcasm: A Dataset of Intended Sarcasm (11/08/2019)