
Logic and the 2-Simplicial Transformer

09/02/2019
by James Clift et al.
The University of Melbourne

We introduce the 2-simplicial Transformer, an extension of the Transformer which includes a form of higher-dimensional attention generalising the dot-product attention, and uses this attention to update entity representations with tensor products of value vectors. We show that this architecture is a useful inductive bias for logical reasoning in the context of deep reinforcement learning.
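As a rough illustration of the idea described in the abstract, the sketch below shows a single 2-simplicial attention head in PyTorch under simplifying assumptions: attention logits are scored over ordered pairs of entities by a plain trilinear product (standing in for the paper's scalar triple product), and each entity is updated with an attention-weighted sum of tensor products of two value vectors, projected back to model width. The function and weight names (two_simplicial_attention, Wq, Wk1, and so on) and the shapes are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def two_simplicial_attention(x, Wq, Wk1, Wk2, Wu, Wv, Wo):
    """Sketch of one 2-simplicial attention head over n entity vectors x (n, d_model)."""
    q  = x @ Wq    # queries,       (n, d_k)
    k1 = x @ Wk1   # first keys,    (n, d_k)
    k2 = x @ Wk2   # second keys,   (n, d_k)
    u  = x @ Wu    # first values,  (n, d_v)
    v  = x @ Wv    # second values, (n, d_v)

    # Trilinear logits over ordered pairs of entities (j, k). An elementwise
    # product summed over the feature dimension is used here in place of the
    # paper's signed scalar triple product (a simplifying assumption).
    logits = torch.einsum('id,jd,kd->ijk', q, k1, k2) / q.shape[-1] ** 0.5

    n = x.shape[0]
    attn = F.softmax(logits.reshape(n, -1), dim=-1).reshape(n, n, n)

    # Attention-weighted sum of tensor products of the two value vectors,
    # flattened and projected back to model width.
    pair_vals = torch.einsum('ijk,ja,kb->iab', attn, u, v)   # (n, d_v, d_v)
    return pair_vals.reshape(n, -1) @ Wo                     # (n, d_model)

# Hypothetical shapes: 8 entities, model width 64, d_k = d_v = 16.
x = torch.randn(8, 64)
Wq, Wk1, Wk2 = (torch.randn(64, 16) for _ in range(3))
Wu, Wv = torch.randn(64, 16), torch.randn(64, 16)
Wo = torch.randn(16 * 16, 64)
out = two_simplicial_attention(x, Wq, Wk1, Wk2, Wu, Wv, Wo)  # (8, 64)
```

The cost of scoring every ordered pair of entities is cubic in the number of entities, which is why the paper restricts this form of attention to a small set of entity representations rather than a full token sequence.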


Related research

10/15/2019
Enhancing the Transformer with Explicit Relational Encoding for Math Problem Solving
We incorporate Tensor-Product Representations within the Transformer in ...

02/19/2023
Learning Language Representations with Logical Inductive Bias
Transformer architectures have achieved great success in solving natural...

04/04/2021
Conversational Question Answering over Knowledge Graphs with Transformer and Graph Attention Networks
This paper addresses the task of (complex) conversational question answe...

07/25/2021
H-Transformer-1D: Fast One-Dimensional Hierarchical Attention for Sequences
We describe an efficient hierarchical method to compute attention in the...

06/02/2021
Enriching Transformers with Structured Tensor-Product Representations for Abstractive Summarization
Abstractive summarization, the task of generating a concise summary of i...

05/02/2022
Logiformer: A Two-Branch Graph Transformer Network for Interpretable Logical Reasoning
Machine reading comprehension has aroused wide concerns, since it explor...

02/15/2021
Translational Equivariance in Kernelizable Attention
While Transformer architectures have shown remarkable success, they are b...