An Iterative Contextualization Algorithm with Second-Order Attention

03/03/2021
by Diego Maupomé, et al.

Combining the representations of the words that make up a sentence into a cohesive whole is difficult: it requires accounting for the order of the words and establishing how they relate to one another. The solution we propose consists in iteratively adjusting the context. Our algorithm starts from a presumably erroneous value of the context and adjusts it with respect to the tokens at hand. To achieve this, word representations are built by combining their symbolic embeddings with a positional encoding into single vectors. The algorithm then iteratively weights and aggregates these vectors using our novel second-order attention mechanism. Our models achieve strong results on several well-known text classification tasks.
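The abstract does not spell out the update equations, but the mechanism it describes (a context vector repeatedly refined by attending over the token vectors) can be sketched. Below is a minimal NumPy illustration; the bilinear score x_i^T W c for the "second-order" interaction, the sinusoidal positional encodings, the mean-initialized context, and the matrix W are all assumptions made for illustration, not details confirmed by the paper:

```python
import numpy as np

def sinusoidal_positions(n, d):
    """One plausible positional encoding (the paper only says a positional
    encoding is combined with the symbolic embedding)."""
    pos = np.arange(n)[:, None]                      # (n, 1)
    dim = np.arange(d)[None, :]                      # (1, d)
    angle = pos / np.power(10000.0, (2 * (dim // 2)) / d)
    return np.where(dim % 2 == 0, np.sin(angle), np.cos(angle))  # (n, d)

def second_order_attention_step(tokens, context, W):
    """One refinement step: score each token against the current context
    with a bilinear (second-order) form, softmax the scores, and return
    the attention-weighted aggregate as the new context."""
    scores = tokens @ W @ context                    # s_i = x_i^T W c, (n,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                         # softmax over tokens
    return weights @ tokens                          # new context, (d,)

def iterative_contextualization(embeddings, W, n_iters=3):
    """Start from a presumably erroneous context (here: the mean of the
    token vectors, an assumed initialization) and iteratively adjust it
    with respect to the tokens at hand."""
    n, d = embeddings.shape
    tokens = embeddings + sinusoidal_positions(n, d)  # embedding + position
    context = tokens.mean(axis=0)
    for _ in range(n_iters):
        context = second_order_attention_step(tokens, context, W)
    return context

# Usage: a toy "sentence" of 5 tokens with 16-dimensional embeddings.
rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 16))
W = rng.normal(scale=0.1, size=(16, 16))             # learned in practice
sentence_vector = iterative_contextualization(emb, W)
print(sentence_vector.shape)                         # (16,)
```

In this reading, each iteration re-weights the tokens against an improving estimate of the sentence context, so the final vector can serve as the sentence representation fed to a classifier.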
