A Structured Self-attentive Sentence Embedding

03/09/2017
by Zhouhan Lin, et al.

This paper proposes a new model for extracting an interpretable sentence embedding by introducing self-attention. Instead of using a vector, we use a 2-D matrix to represent the embedding, with each row of the matrix attending to a different part of the sentence. We also propose a self-attention mechanism and a special regularization term for the model. As a side effect, the embedding comes with an easy way of visualizing which specific parts of the sentence are encoded into the embedding. We evaluate our model on three different tasks: author profiling, sentiment classification, and textual entailment. Results show that our model yields a significant performance gain over other sentence embedding methods on all three tasks.
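The mechanism the abstract describes can be sketched briefly. Given the LSTM hidden states H (one row per token), the model computes an r-row attention matrix A = softmax(W_s2 tanh(W_s1 H^T)) and the 2-D sentence embedding M = A H, with a Frobenius-norm penalty ||A A^T - I||_F^2 pushing the r attention hops toward different parts of the sentence. The sketch below uses NumPy with random weights purely for illustration; all shapes and variable names (n tokens, hidden size 2u, d_a, r hops) follow the paper's notation, not a released implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def structured_self_attention(H, W_s1, W_s2):
    """Compute the attention matrix A and the 2-D sentence embedding M.

    H    : (n, 2u)   hidden states for a sentence of n tokens
    W_s1 : (d_a, 2u) first projection
    W_s2 : (r, d_a)  one row per attention hop
    """
    A = softmax(W_s2 @ np.tanh(W_s1 @ H.T), axis=1)  # (r, n); each row sums to 1
    M = A @ H                                        # (r, 2u) matrix embedding
    return A, M

def penalty(A):
    """||A A^T - I||_F^2: encourages the r hops to attend to different tokens."""
    r = A.shape[0]
    diff = A @ A.T - np.eye(r)
    return float(np.sum(diff ** 2))

# Illustrative run with random states and weights (shapes only are meaningful).
rng = np.random.default_rng(0)
H = rng.standard_normal((10, 8))          # n=10 tokens, 2u=8
W_s1 = 0.1 * rng.standard_normal((6, 8))  # d_a=6
W_s2 = 0.1 * rng.standard_normal((4, 6))  # r=4 hops
A, M = structured_self_attention(H, W_s1, W_s2)
P = penalty(A)
```

Each of the r rows of A is a separate softmax distribution over tokens, which is what makes the embedding visualizable: plotting a row of A shows which words that hop attended to.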

Related research

08/22/2018 — Dynamic Self-Attention: Computing Attention over Words Dynamically for Sentence Embedding
In this paper, we propose Dynamic Self-Attention (DSA), a new self-atten...

06/26/2018 — Enhancing Sentence Embedding with Generalized Pooling
Pooling is an essential component of a wide variety of sentence represen...

12/02/2017 — Improving Visually Grounded Sentence Representations with Self-Attention
Sentence representation models trained only on language could potentiall...

05/10/2018 — Obligation and Prohibition Extraction Using Hierarchical RNNs
We consider the task of detecting contractual obligations and prohibitio...

10/01/2022 — Construction and Evaluation of a Self-Attention Model for Semantic Understanding of Sentence-Final Particles
Sentence-final particles serve an essential role in spoken Japanese beca...

09/03/2018 — Multi-Level Structured Self-Attentions for Distantly Supervised Relation Extraction
Attention mechanisms are often used in deep neural networks for distantl...

05/22/2017 — A Regularized Framework for Sparse and Structured Neural Attention
Modern neural networks are often augmented with an attention mechanism, ...
