R-SQAIR: Relational Sequential Attend, Infer, Repeat

10/11/2019
by Aleksandar Stanić, et al.

Traditional sequential multi-object attention models rely on a recurrent mechanism to infer object relations. We propose a relational extension (R-SQAIR) of one such attention model (SQAIR), endowing it with a module that has a strong relational inductive bias and computes pairwise interactions between inferred objects in parallel. Two recently proposed relational modules are studied on tasks of unsupervised learning from videos. We demonstrate gains over sequential relational mechanisms, including improved combinatorial generalization.
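The core idea of a parallel pairwise relational module can be illustrated with a Relation-Network-style sketch: every ordered pair of object embeddings is processed by a shared function, and the results are aggregated by a permutation-invariant sum. This is a minimal illustration, not the exact R-SQAIR architecture; the function names, weight shapes, and single-layer MLPs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def pairwise_relations(objects, w_g, w_f):
    """Sketch of a parallel pairwise relational module (Relation-Network
    style). A shared map g is applied to every ordered pair of object
    vectors at once, then summed (permutation-invariant) and projected
    by f. All weights here are random placeholders, not trained values."""
    n, d = objects.shape
    # Build all n*n ordered pairs (i, j) as concatenated vectors: (n*n, 2d)
    left = np.repeat(objects, n, axis=0)   # o_i repeated for each partner
    right = np.tile(objects, (n, 1))       # all partners o_j
    pairs = np.concatenate([left, right], axis=1)
    g = relu(pairs @ w_g)                  # per-pair relation features, in parallel
    agg = g.sum(axis=0)                    # aggregate over all pairs
    return agg @ w_f                       # final relational summary

# Toy example: 3 inferred object embeddings of dimension 4
objs = rng.standard_normal((3, 4))
w_g = rng.standard_normal((8, 16))        # maps a concatenated pair (2*4) to 16 features
w_f = rng.standard_normal((16, 4))
out = pairwise_relations(objs, w_g, w_f)
print(out.shape)  # (4,)
```

Because all pairs are formed and processed in one batched matrix multiply, the interaction computation is parallel over pairs rather than unrolled through a recurrence, which is the contrast with sequential relational mechanisms drawn in the abstract.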
