Learning to Reason with Third-Order Tensor Products

11/29/2018
by Imanol Schlag et al.

We combine Recurrent Neural Networks with Tensor Product Representations to learn combinatorial representations of sequential data. This improves symbolic interpretation and systematic generalisation. Our architecture is trained end-to-end through gradient descent on a variety of simple natural language reasoning tasks, significantly outperforming the latest state-of-the-art models in single-task and all-tasks settings. We also augment a subset of the data such that training and test data exhibit large systematic differences and show that our approach generalises better than the previous state-of-the-art.
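The binding mechanism behind Tensor Product Representations can be sketched in plain NumPy (the dimensions and variable names below are illustrative, not taken from the paper): a third-order TPR binds a filler vector to two key vectors via an outer product, superimposes multiple such bindings in one tensor, and retrieves a filler by contracting the tensor with its two keys.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16  # illustrative vector dimension

def orthonormal(n, d, rng):
    """Return n orthonormal d-dimensional row vectors (exact retrieval keys)."""
    q, _ = np.linalg.qr(rng.standard_normal((d, n)))
    return q.T[:n]

fillers = orthonormal(3, d, rng)  # content vectors to store
keys_a = orthonormal(3, d, rng)   # first unbinding key set
keys_b = orthonormal(3, d, rng)   # second unbinding key set

# Bind one (filler, key_a, key_b) triple as a rank-1 third-order tensor.
T = np.einsum('i,j,k->ijk', fillers[0], keys_a[1], keys_b[2])

# Superimpose a second triple in the same tensor.
T += np.einsum('i,j,k->ijk', fillers[1], keys_a[0], keys_b[1])

# Unbind: contract T with both keys; orthonormal keys make this exact.
recovered = np.einsum('ijk,j,k->i', T, keys_a[1], keys_b[2])
assert np.allclose(recovered, fillers[0], atol=1e-6)
```

Because the keys are orthonormal, cross-terms from other stored triples contract to zero, so each filler is recovered exactly; with random (non-orthogonal) keys, retrieval becomes approximate.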


Related research:

12/25/2020 · Logic Tensor Networks
Artificial Intelligence agents are required to learn from their surround...

12/20/2018 · RNNs Implicitly Implement Tensor Product Representations
Recurrent neural networks (RNNs) can learn continuous vector representat...

06/29/2023 · A Hybrid System for Systematic Generalization in Simple Arithmetic Problems
Solving symbolic reasoning problems that require compositionality and sy...

07/06/2017 · Tensor-Train Recurrent Neural Networks for Video Classification
The Recurrent Neural Networks and their variants have shown promising pe...

10/29/2018 · A Simple Recurrent Unit with Reduced Tensor Product Representations
Widely used recurrent units, including Long-short Term Memory (LSTM) and ...

09/11/2020 · Systematic Generalization on gSCAN with Language Conditioned Embedding
Systematic Generalization refers to a learning algorithm's ability to ex...

05/26/2023 · LLMs and the Abstraction and Reasoning Corpus: Successes, Failures, and the Importance of Object-based Representations
Can a Large Language Model (LLM) solve simple abstract reasoning problem...
