Lie Access Neural Turing Machine

02/28/2016
by Greg Yang, et al.

Following the recent trend toward explicit neural memory structures, we present a new design of an external memory in which memories are stored in a Euclidean key space R^n. An LSTM controller performs reads and writes via specialized read and write heads. It can move a head either by providing a new address in the key space (random access) or by moving from its previous position via a Lie group action (Lie access). In this way, the "L" and "R" instructions of a traditional Turing machine are generalized to arbitrary elements of a fixed Lie group action. For this reason, we name this new model the Lie Access Neural Turing Machine, or LANTM. We tested two different configurations of LANTM against an LSTM baseline in several basic experiments. We found the right configuration of LANTM to outperform the baseline in all of our experiments. In particular, we trained LANTM on addition of k-digit numbers for 2 ≤ k ≤ 16, and it generalized almost perfectly to 17 ≤ k ≤ 32, all with two orders of magnitude fewer parameters than the LSTM baseline.
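The core idea — a head position in a continuous key space that is either set directly (random access) or transformed by a group element (Lie access), with reads weighted by key proximity — can be sketched in a toy form. The class below is an illustrative assumption, not the paper's exact equations: the weighting (softmax over negative squared key distance) and the choice of SE(2) acting on R^2 are stand-ins for the general scheme described above.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

class LieAccessMemory:
    """Toy external memory with keys in R^2 (illustrative sketch only)."""

    def __init__(self, temperature=1.0):
        self.keys = []            # memory addresses in the key space R^2
        self.values = []          # stored vectors
        self.head = np.zeros(2)   # current head position in key space
        self.temperature = temperature

    def write(self, value):
        # Store a value at the head's current key-space position.
        self.keys.append(self.head.copy())
        self.values.append(np.asarray(value, dtype=float))

    def read(self):
        # Soft read: weight memories by proximity of their keys to the head.
        K = np.stack(self.keys)
        V = np.stack(self.values)
        d2 = ((K - self.head) ** 2).sum(axis=1)
        w = softmax(-d2 / self.temperature)
        return w @ V

    def move(self, theta=0.0, shift=(0.0, 0.0)):
        # Lie access: act on the head with an element of SE(2),
        # i.e. rotate by theta, then translate by `shift`.
        c, s = np.cos(theta), np.sin(theta)
        R = np.array([[c, -s], [s, c]])
        self.head = R @ self.head + np.asarray(shift, dtype=float)

    def jump(self, position):
        # Random access: place the head at a new key directly.
        self.head = np.asarray(position, dtype=float)
```

A pure translation recovers the classical "L"/"R" tape moves (step by ±1 along one axis), while a general group element lets the controller lay memories out along curves in the key space and retrace them.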


