Exploring the Space of Key-Value-Query Models with Intention

05/17/2023
by Marta Garnelo et al.

Attention-based models have been a key element of many recent breakthroughs in deep learning. Two key components of Attention are the structure of its input (which consists of keys, values and queries) and the computations by which these three are combined. In this paper we explore the space of models that share said input structure but are not restricted to the computations of Attention. We refer to this space as Keys-Values-Queries (KVQ) Space. Our goal is to determine whether there are other stackable models in KVQ Space that Attention cannot efficiently approximate, that we can implement with our current deep learning toolbox, and that solve problems of interest to the community. Perhaps surprisingly, the solution to the standard least squares problem satisfies these properties. A neural network module able to compute this solution not only enriches the set of computations that a neural network can represent but is also provably a strict generalisation of Linear Attention. Even more surprisingly, the computational complexity of this module is exactly the same as that of Attention, making it a suitable drop-in replacement. Having established this novel connection between classical machine learning (least squares) and modern deep learning (Attention), we justify a variation of our model that generalises regular Attention in the same way. Both new modules are put to the test on a wide spectrum of tasks, ranging from few-shot learning to policy distillation, which confirm their real-world applicability.
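To make the least-squares connection concrete, here is a minimal NumPy sketch of what such a KVQ module might look like, assuming a ridge-regularised closed-form solve over the keys and values that is then applied to the queries. The function names and the `lam` regulariser are illustrative assumptions, not the paper's actual implementation. Dropping the inverse Gram term recovers (unnormalised) Linear Attention, which is the sense in which a least-squares module strictly generalises it.

```python
import numpy as np

def intention(queries, keys, values, lam=1e-3):
    # Hypothetical least-squares KVQ module (a sketch, not the paper's code).
    # Solve the ridge-regularised least-squares problem
    #     W* = argmin_W ||K W - V||^2 + lam * ||W||^2
    # in closed form, then apply the solution to the queries.
    # Shapes: queries (m, d), keys (n, d), values (n, d_v).
    d = keys.shape[1]
    gram = keys.T @ keys + lam * np.eye(d)            # (d, d)
    w_star = np.linalg.solve(gram, keys.T @ values)   # W* = (K^T K + lam I)^{-1} K^T V
    return queries @ w_star                           # (m, d_v)

def linear_attention(queries, keys, values):
    # Unnormalised Linear Attention: Q (K^T V).
    # Recovered from `intention` when the inverse Gram term is replaced
    # by the identity, hence the strict-generalisation claim.
    return queries @ (keys.T @ values)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(5, 8))
    K = rng.normal(size=(16, 8))
    V = rng.normal(size=(16, 4))
    print(intention(Q, K, V).shape)         # (5, 4)
    print(linear_attention(Q, K, V).shape)  # (5, 4)
```

Note that the extra cost over Linear Attention is the d x d Gram solve, which scales with the feature dimension rather than the sequence length, consistent with the abstract's claim that the module's complexity matches that of Attention.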
