Logic and the 2-Simplicial Transformer

09/02/2019
by James Clift, et al.

We introduce the 2-simplicial Transformer, an extension of the Transformer which includes a form of higher-dimensional attention generalising the dot-product attention, and uses this attention to update entity representations with tensor products of value vectors. We show that this architecture is a useful inductive bias for logical reasoning in the context of deep reinforcement learning.
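To make the abstract's description concrete, below is a minimal sketch of what such higher-order attention could look like. It is an illustrative approximation, not the paper's exact construction: the class name TwoSimplicialAttentionSketch is hypothetical, the trilinear score is assumed to be the simple elementwise triple product sum_d q_d k1_d k2_d, and the softmax is taken over all flattened pairs (j, k). The one element taken directly from the abstract is that each entity's update mixes tensor products of value vectors through a learned map.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TwoSimplicialAttentionSketch(nn.Module):
    """Hypothetical sketch of higher-order ("2-simplicial") attention.

    Each query attends to *pairs* of entities (j, k). The logit for a
    pair is a trilinear form <q_i, k1_j, k2_k>; here we assume the
    elementwise triple product, which is only one possible choice.
    The update mixes the (flattened) tensor product of the two value
    vectors through a learned linear map, per the abstract.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k1 = nn.Linear(dim, dim)
        self.k2 = nn.Linear(dim, dim)
        self.v1 = nn.Linear(dim, dim)
        self.v2 = nn.Linear(dim, dim)
        # Learned map from the flattened tensor product back to dim.
        self.mix = nn.Linear(dim * dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n, dim) entity representations.
        n, d = x.shape
        q, k1, k2 = self.q(x), self.k1(x), self.k2(x)
        v1, v2 = self.v1(x), self.v2(x)

        # Trilinear logits over pairs:
        # logits[i, j, k] = sum_d q[i, d] * k1[j, d] * k2[k, d].
        logits = torch.einsum("id,jd,kd->ijk", q, k1, k2) / d ** 0.5
        probs = F.softmax(logits.reshape(n, n * n), dim=-1).reshape(n, n, n)

        # Tensor product of value vectors for every pair (j, k),
        # flattened to (n, n, d*d), then attention-weighted per query.
        vv = torch.einsum("jd,ke->jkde", v1, v2).reshape(n, n, d * d)
        pooled = torch.einsum("ijk,jkf->if", probs, vv)
        return self.mix(pooled)


# Example usage with 8 entities of dimension 16.
x = torch.randn(8, 16)
out = TwoSimplicialAttentionSketch(16)(x)  # shape (8, 16)
```

Note that enumerating all entity pairs makes this naive version cubic in the number of entities, so any practical variant would need to restrict which pairs are scored.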
