RNNs Implicitly Implement Tensor Product Representations

12/20/2018
by R. Thomas McCoy et al.

Recurrent neural networks (RNNs) can learn continuous vector representations of symbolic structures such as sequences and sentences, and these representations often exhibit linear regularities (analogies). Such regularities motivate our hypothesis that RNNs exhibiting them implicitly compile symbolic structures into tensor product representations (TPRs; Smolensky, 1990), which additively combine tensor products of vectors representing roles (e.g., sequence positions) with vectors representing fillers (e.g., particular words). To test this hypothesis, we introduce Tensor Product Decomposition Networks (TPDNs), which use TPRs to approximate existing vector representations. Using synthetic data, we demonstrate that TPDNs can successfully approximate linear and tree-based RNN autoencoder representations, suggesting that these representations exhibit interpretable compositional structure; we also explore the settings that lead RNNs to induce such structure-sensitive representations. By contrast, further TPDN experiments show that the representations of four models trained to encode naturally occurring sentences can be largely approximated with a bag-of-words model, with only marginal improvements from more sophisticated structures. We conclude that TPDNs provide a powerful method for interpreting vector representations, and that standard RNNs can induce compositional sequence representations that are remarkably well approximated by TPRs; at the same time, existing training tasks for sentence representation learning may not be sufficient for inducing robust structural representations.
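Since the abstract turns on the TPR construction, a compact illustration may help: a TPR encodes a sequence as the sum, over positions, of the tensor (outer) product of each filler's vector with its role's vector. The NumPy sketch below is illustrative only; the dimensions, vocabulary, and random embeddings are invented for the example and are not taken from the paper. It shows the additive role-filler binding and how, when the role vectors are linearly independent, each filler can be recovered by unbinding with the dual role vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not from the paper): random embeddings for
# fillers (the symbols) and roles (the sequence positions).
FILLER_DIM, ROLE_DIM = 8, 4
fillers = {w: rng.standard_normal(FILLER_DIM) for w in "abc"}
roles = [rng.standard_normal(ROLE_DIM) for _ in range(3)]  # one role per position

def tpr(sequence):
    """Tensor product representation of a sequence:
    the sum over positions i of the outer product filler_i (x) role_i."""
    rep = np.zeros((FILLER_DIM, ROLE_DIM))
    for i, symbol in enumerate(sequence):
        rep += np.outer(fillers[symbol], roles[i])
    return rep

# Binding is additive, but order is still preserved via the role vectors:
print(np.allclose(tpr("ab"), tpr("ba")))  # False: different role bindings

# Unbinding: with linearly independent roles, each filler is recovered
# by projecting onto the dual (pseudo-inverse) role vectors.
R = np.stack(roles, axis=1)        # ROLE_DIM x num_positions
U = np.linalg.pinv(R)              # rows are the dual role vectors
recovered = tpr("cab") @ U.T       # FILLER_DIM x num_positions
print(np.allclose(recovered[:, 0], fillers["c"]))  # True: position 0 held "c"
```

A TPDN, by contrast, does not fix the filler and role embeddings in advance: given a hypothesized role scheme, it learns the embeddings (along with a final linear map) by gradient descent so that the resulting TPR matches a trained network's encodings as closely as possible, and the quality of that fit is what diagnoses the structure of the learned representations.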

Related research

10/21/2019 · Discovering the Compositional Structure of Vector Representations with Role Learning Networks
Neural networks (NNs) are able to perform tasks that rely on composition...

11/29/2018 · Learning to Reason with Third-Order Tensor Products
We combine Recurrent Neural Networks with Tensor Product Representations...

10/29/2018 · A Simple Recurrent Unit with Reduced Tensor Product Representations
Widely used recurrent units, including Long Short-Term Memory (LSTM) and ...

02/18/2018 · Memorize or generalize? Searching for a compositional RNN in a haystack
Neural networks are very powerful learning systems, but they do not read...

07/15/2019 · GLOSS: Generative Latent Optimization of Sentence Representations
We propose a method to learn unsupervised sentence representations in a ...

06/12/2018 · Evaluation of Unsupervised Compositional Representations
We evaluated various compositional models, from bag-of-words representat...

10/24/2021 · Distributed neural encoding of binding to thematic roles
A framework and method are proposed for the study of constituent composi...
