Integrating Dependency Tree Into Self-attention for Sentence Representation

03/11/2022
by Junhua Ma, et al.

Recent progress on parse tree encoders for sentence representation learning is notable. However, these works mainly encode tree structures recursively, which is not conducive to parallelization, and they rarely take into account the labels of arcs in dependency trees. To address both issues, we propose Dependency-Transformer, which applies a relation-attention mechanism that works in concert with the self-attention mechanism. This mechanism encodes the dependency relations and the spatial positional relations between nodes in the dependency tree of a sentence. Using a score-based method, we inject this syntactic information without affecting the Transformer's parallelizability. Our model outperforms or is comparable to state-of-the-art methods on four sentence representation tasks and has clear advantages in computational efficiency.
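The abstract does not spell out the exact formulation, but the core idea of "score-based" syntax injection can be illustrated as an additive bias on the standard attention scores: learned embeddings for dependency-relation labels and tree distances between token pairs are added to the scaled dot-product scores, so the computation stays fully parallel. The sketch below is a minimal, assumed implementation for illustration only; the class and argument names (RelationAttention, rel_ids, dist_ids) are hypothetical and not the authors' released code.

```python
# Minimal sketch (assumed, not the paper's implementation) of score-based syntax
# injection: dependency-relation and tree-distance embeddings contribute an additive
# per-head bias to scaled dot-product attention scores, keeping parallelism intact.

import math
import torch
import torch.nn as nn


class RelationAttention(nn.Module):
    def __init__(self, d_model, n_heads, n_relations, max_tree_dist):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # One learned scalar bias per head for each relation label / tree distance.
        self.rel_bias = nn.Embedding(n_relations, n_heads)
        self.dist_bias = nn.Embedding(max_tree_dist + 1, n_heads)

    def forward(self, x, rel_ids, dist_ids, mask=None):
        # x:        (batch, seq, d_model) token representations
        # rel_ids:  (batch, seq, seq) dependency-relation label between token pairs
        # dist_ids: (batch, seq, seq) distance between token pairs in the parse tree
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(b, n, self.n_heads, self.d_head).transpose(1, 2)

        # Standard content-based attention scores.
        scores = q @ k.transpose(-2, -1) / math.sqrt(self.d_head)

        # Score-based syntax injection: add relation and tree-distance biases.
        syntax = self.rel_bias(rel_ids) + self.dist_bias(dist_ids)  # (b, n, n, heads)
        scores = scores + syntax.permute(0, 3, 1, 2)                # (b, heads, n, n)

        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))

        attn = scores.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(out)
```

Because the syntax term is just an additive bias on the score matrix, it adds no sequential dependency across positions, which is the property the abstract highlights for preserving the Transformer's parallelizability.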


