Self-Attention for Incomplete Utterance Rewriting

02/24/2022
by Yong Zhang, et al.

Incomplete utterance rewriting (IUR) has recently become an essential task in NLP, aiming to complete an incomplete utterance with sufficient context information so that it can be understood on its own. In this paper, we propose a novel method that extracts coreference and omission relationships directly from the self-attention weight matrix of the transformer, rather than from word embeddings, and edits the original text accordingly to generate the complete utterance. Benefiting from the rich information in the self-attention weight matrix, our method achieves competitive results on public IUR datasets.
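As a rough illustration of the core idea, the sketch below pulls the self-attention weight matrix out of an off-the-shelf transformer encoder and aligns each utterance token with its most-attended context token. The HuggingFace `transformers` API, the `bert-base-uncased` checkpoint, and the head-averaged argmax alignment are all illustrative assumptions, not the paper's actual extraction procedure.

```python
# Minimal sketch: recover token-to-token links from self-attention weights.
# Assumes the HuggingFace `transformers` library; the model choice and the
# argmax alignment rule are illustrative, not the authors' method.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_attentions=True)

context = "Do you like the movie Titanic ?"
utterance = "Yes , I watched it last week ."

# Encode context and utterance as one sequence pair:
# [CLS] context [SEP] utterance [SEP]
inputs = tokenizer(context, utterance, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# outputs.attentions is a tuple with one tensor per layer, each of shape
# (batch, num_heads, seq_len, seq_len). Average the last layer's heads
# to get a single token-to-token attention matrix.
attn = outputs.attentions[-1].mean(dim=1)[0]  # (seq_len, seq_len)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
sep = tokens.index("[SEP]")  # boundary between context and utterance

# For each utterance token, find the context token it attends to most --
# a crude proxy for the coreference/omission links the paper extracts.
for i in range(sep + 1, len(tokens) - 1):
    j = int(attn[i, 1:sep].argmax()) + 1  # skip [CLS] at position 0
    print(f"{tokens[i]:>10s} -> {tokens[j]}")
```

In a dialogue like this one, the pronoun "it" in the utterance tends to place relatively high attention mass on "Titanic" in the context, which is the kind of signal an edit-based rewriter can use to substitute or insert the missing span.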
