Improving Attention Mechanism with Query-Value Interaction

10/08/2020
by Chuhan Wu et al.

The attention mechanism plays a critical role in various state-of-the-art NLP models such as Transformer and BERT. It can be formulated as a ternary function that maps input queries, keys, and values to an output: a sum of the values weighted by attention weights derived from query-key interactions. Similar to query-key interactions, there is also inherent relatedness between queries and values, and incorporating query-value interactions has the potential to enhance the output by learning values customized to the characteristics of the queries. However, existing attention methods ignore query-value interactions, which may not be optimal. In this paper, we propose to improve the existing attention mechanism by incorporating query-value interactions. We propose a query-value interaction function that learns query-aware attention values and combines them with the original values and attention weights to form the final output. Extensive experiments on four datasets for different tasks show that incorporating query-value interactions consistently improves the performance of many attention-based models.
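
To make the formulation concrete, here is a minimal NumPy sketch of scaled dot-product attention extended with a query-value interaction. The specific interaction function below (the sigmoid gate and the Wq/Wv projections) is an illustrative assumption of ours, not the paper's actual function, which the abstract does not specify.

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_qv_interaction(Q, K, V, Wq, Wv):
    """Scaled dot-product attention plus a query-value interaction.

    Q: (n_q, d), K: (n_k, d), V: (n_k, d).
    Wq, Wv: (d, d) projections for the hypothetical interaction
    function (stand-ins; the paper's exact function may differ).
    """
    d = Q.shape[-1]
    # Standard query-key attention weights and attended values.
    weights = softmax(Q @ K.T / np.sqrt(d))   # (n_q, n_k)
    attended = weights @ V                    # (n_q, d)

    # Hypothetical query-value interaction: a query-conditioned
    # sigmoid gate blends query-aware values with the original
    # attended values to form the final output.
    gate = 1.0 / (1.0 + np.exp(-(Q @ Wq)))    # (n_q, d)
    query_aware = np.tanh(Q @ Wv) * attended  # query-customized values
    return gate * query_aware + (1.0 - gate) * attended

# Toy usage with random inputs.
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 8))
K = rng.standard_normal((5, 8))
V = rng.standard_normal((5, 8))
Wq = rng.standard_normal((8, 8)) * 0.1
Wv = rng.standard_normal((8, 8)) * 0.1
print(attention_with_qv_interaction(Q, K, V, Wq, Wv).shape)  # (2, 8)

Note that with the gate driven to zero this reduces to standard attention, so the interaction term acts as a learnable, query-dependent correction to the usual weighted sum of values.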

Related research

12/20/2022
EIT: Enhanced Interactive Transformer
In this paper, we propose a novel architecture, the Enhanced Interactive...

05/17/2023
Exploring the Space of Key-Value-Query Models with Intention
Attention-based models have been a key element of many recent breakthrou...

08/22/2021
Guiding Query Position and Performing Similar Attention for Transformer-Based Detection Heads
After DETR was proposed, this novel transformer-based detection paradigm...

08/28/2017
Analyzing Query Performance and Attributing Blame for Contentions in a Cluster Computing Framework
Analyzing contention for resources in a cluster computing environment ac...

06/11/2020
Attention improves concentration when learning node embeddings
We consider the problem of predicting edges in a graph from node attribu...

02/21/2022
Guided Visual Attention Model Based on Interactions Between Top-down and Bottom-up Information for Robot Pose Prediction
Learning to control a robot commonly requires mapping between robot stat...

10/18/2021
Compositional Attention: Disentangling Search and Retrieval
Multi-head, key-value attention is the backbone of the widely successful...
