Abstractors: Transformer Modules for Symbolic Message Passing and Relational Reasoning

04/01/2023
by Awni Altabaa, et al.

A framework is proposed that casts relational learning in terms of transformers, implementing the binding between sensory states and abstract states with relational cross-attention mechanisms.
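To make the abstract concrete, here is a minimal PyTorch sketch of one plausible reading of relational cross-attention: queries and keys are both computed from the sensory states, so the softmax scores form a relation matrix over input objects, while the values are learned, input-independent symbols (the abstract states). The class name, the single-head form, and the one-symbol-per-object-position assumption are illustrative choices, not the paper's exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationalCrossAttention(nn.Module):
    """Single-head relational cross-attention (illustrative sketch).

    Queries and keys are derived from the sensory states x, so the
    attention scores form a relation matrix over input objects. The
    values are learned, input-independent symbol vectors, so the output
    depends on x only through that relation matrix.
    """

    def __init__(self, dim: int, num_symbols: int):
        super().__init__()
        self.w_q = nn.Linear(dim, dim, bias=False)
        self.w_k = nn.Linear(dim, dim, bias=False)
        # One learned symbol per object position (fixed count assumed here).
        self.symbols = nn.Parameter(torch.randn(num_symbols, dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, num_symbols, dim)
        q, k = self.w_q(x), self.w_k(x)
        rel = q @ k.transpose(-2, -1) / (x.shape[-1] ** 0.5)
        attn = F.softmax(rel, dim=-1)  # relation matrix over objects
        return attn @ self.symbols    # bind relations to abstract symbols


layer = RelationalCrossAttention(dim=64, num_symbols=10)
abstract_states = layer(torch.randn(2, 10, 64))  # -> (2, 10, 64)
```

The design point the sketch highlights is the information bottleneck: because the values never see the input, the abstract states can encode only relational structure, not the sensory features themselves.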


Related research

11/01/2018 · Dilated DenseNets for Relational Reasoning
Despite their impressive performance in many tasks, deep neural networks...

05/27/2023 · Graph Inductive Biases in Transformers without Message Passing
Transformers for graph data are increasingly widely studied and successf...

10/11/2019 · R-SQAIR: Relational Sequential Attend, Infer, Repeat
Traditional sequential multi-object attention models rely on a recurrent...

09/29/2020 · Message Passing Neural Processes
Neural Processes (NPs) are powerful and flexible models able to incorpor...

05/19/2021 · Complementary Structure-Learning Neural Networks for Relational Reasoning
The neural mechanisms supporting flexible relational inferences, especia...

04/22/2020 · Graph-based Kinship Reasoning Network
In this paper, we propose a graph-based kinship reasoning (GKR) network ...

01/31/2019 · Learning to Make Analogies by Contrasting Abstract Relational Structure
Analogical reasoning has been a principal focus of various waves of AI r...
