Set Transformer

10/01/2018
by Juho Lee, et al.

Many machine learning tasks, such as multiple instance learning, 3D shape recognition, and few-shot image classification, are defined on sets of instances. Since solutions to such problems do not depend on the permutation of the elements of the set, models used to address them should be permutation invariant. We present an attention-based neural network module, the Set Transformer, specifically designed to model interactions among elements in the input set. The model consists of an encoder and a decoder, both of which rely on attention mechanisms. To reduce computational complexity, we introduce an attention scheme inspired by inducing point methods from the sparse Gaussian process literature, which reduces the computation time of self-attention from quadratic to linear in the number of elements in the set. We show that our model is theoretically attractive, and we evaluate it on a range of tasks, demonstrating increased performance compared to recent methods for set-structured data.
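The sketch below illustrates the idea behind the inducing-point attention described in the abstract: the set first attends into a small number of learned inducing points and then attends back to the resulting summaries, so the cost grows linearly rather than quadratically in the set size. This is a minimal PyTorch sketch, not the authors' reference implementation; the module and hyperparameter names (dim, num_heads, num_inducing) are illustrative assumptions.

```python
# Minimal sketch of an induced set attention block, assuming PyTorch.
import torch
import torch.nn as nn

class MAB(nn.Module):
    """Multihead attention block: queries X attend to keys/values Y."""
    def __init__(self, dim, num_heads):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.ln0 = nn.LayerNorm(dim)
        self.ln1 = nn.LayerNorm(dim)
        self.ff = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x, y):
        h = self.ln0(x + self.attn(x, y, y, need_weights=False)[0])
        return self.ln1(h + self.ff(h))

class ISAB(nn.Module):
    """Induced set attention block: self-attention routed through m learned
    inducing points, costing O(n*m) instead of O(n^2) for a set of n elements."""
    def __init__(self, dim, num_heads, num_inducing):
        super().__init__()
        self.inducing = nn.Parameter(torch.randn(1, num_inducing, dim))
        self.mab0 = MAB(dim, num_heads)  # inducing points attend to the set
        self.mab1 = MAB(dim, num_heads)  # set attends back to the summaries

    def forward(self, x):  # x: (batch, n, dim)
        h = self.mab0(self.inducing.expand(x.size(0), -1, -1), x)  # (batch, m, dim)
        return self.mab1(x, h)                                     # (batch, n, dim)

# Usage: one permutation-equivariant encoder layer over a set of 100 elements.
x = torch.randn(8, 100, 64)
out = ISAB(dim=64, num_heads=4, num_inducing=16)(x)
print(out.shape)  # torch.Size([8, 100, 64])
```

Because the inducing points are shared across all elements, the output for each element depends on the set only through these summaries, which preserves permutation equivariance while keeping the attention cost linear in the set size.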

Related research

02/28/2023 | Sampled Transformer for Point Sets
The sparse transformer can reduce the computational complexity of the se...

06/08/2022 | Set Interdependence Transformer: Set-to-Sequence Neural Networks for Permutation Learning and Structure Prediction
The task of learning to map an input set onto a permuted sequence of its...

03/05/2021 | Set Representation Learning with Generalized Sliced-Wasserstein Embeddings
An increasing number of machine learning tasks deal with learning repres...

05/17/2023 | Exploring the Space of Key-Value-Query Models with Intention
Attention-based models have been a key element of many recent breakthrou...

04/28/2022 | Attention Based Neural Networks for Wireless Channel Estimation
In this paper, we deploy the self-attention mechanism to achieve improve...

07/17/2021 | PICASO: Permutation-Invariant Cascaded Attentional Set Operator
Set-input deep networks have recently drawn much interest in computer vi...

06/26/2020 | Conditional Set Generation with Transformers
A set is an unordered collection of unique elements, and yet many machine...
