
Still not systematic after all these years: On the compositional skills of sequence-to-sequence recurrent networks

by Brenden M. Lake, et al.

Humans can understand and produce new utterances effortlessly, thanks to their systematic compositional skills. Once a person learns the meaning of a new verb "dax," he or she can immediately understand the meaning of "dax twice" or "sing and dax." In this paper, we introduce the SCAN domain, consisting of a set of simple compositional navigation commands paired with the corresponding action sequences. We then test the zero-shot generalization capabilities of a variety of recurrent neural networks (RNNs) trained on SCAN with sequence-to-sequence methods. We find that RNNs can generalize well when the differences between training and test commands are small, so that they can apply "mix-and-match" strategies to solve the task. However, when generalization requires systematic compositional skills (as in the "dax" example above), RNNs fail spectacularly. We conclude with a proof-of-concept experiment in neural machine translation, supporting the conjecture that lack of systematicity is an important factor explaining why neural networks need very large training sets.
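To make the SCAN task concrete, the sketch below interprets a simplified subset of SCAN-style commands as action sequences (e.g., "jump twice" maps to two JUMP actions). The grammar and action names here are illustrative simplifications of the full SCAN specification, not the dataset's exact generator.

```python
# Minimal illustrative interpreter for a simplified subset of SCAN-style
# commands. Action names and the supported grammar are simplified for
# illustration; the real SCAN dataset defines a richer command language.

PRIMITIVES = {"jump": ["JUMP"], "walk": ["WALK"], "run": ["RUN"], "look": ["LOOK"]}

def interpret(command):
    """Translate a simplified SCAN-style command into its action sequence."""
    # "X and Y" executes X then Y; "X after Y" executes Y then X.
    if " and " in command:
        left, right = command.split(" and ", 1)
        return interpret(left) + interpret(right)
    if " after " in command:
        first, second = command.split(" after ", 1)
        return interpret(second) + interpret(first)
    tokens = command.split()
    # "twice" / "thrice" repeat the preceding phrase.
    if tokens[-1] == "twice":
        return interpret(" ".join(tokens[:-1])) * 2
    if tokens[-1] == "thrice":
        return interpret(" ".join(tokens[:-1])) * 3
    # "left" / "right" prepend a turn action to the preceding verb.
    if tokens[-1] in ("left", "right"):
        turn = ["TURN_" + tokens[-1].upper()]
        return turn + interpret(" ".join(tokens[:-1])) if len(tokens) > 1 else turn
    return PRIMITIVES[command]

print(interpret("jump twice"))           # ['JUMP', 'JUMP']
print(interpret("walk left after run"))  # ['RUN', 'TURN_LEFT', 'WALK']
```

The zero-shot test the paper describes amounts to holding out a primitive (e.g., "jump") from compositional contexts during training and asking whether a learned model, unlike this hand-written interpreter, can still produce the right sequence for "jump twice".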



Compositional generalization through meta sequence-to-sequence learning

People can learn a new concept and use it compositionally, understanding...

Synonymous Generalization in Sequence-to-Sequence Recurrent Networks

When learning a language, people can quickly expand their understanding ...

Memorize or generalize? Searching for a compositional RNN in a haystack

Neural networks are very powerful learning systems, but they do not read...

Jump to better conclusions: SCAN both left and right

Lake and Baroni (2018) recently introduced the SCAN data set, which cons...

Zero-Shot Generalization using Intrinsically Motivated Compositional Emergent Protocols

Human language has been described as a system that makes use of finite m...

Learning compositionally through attentive guidance

In this paper, we introduce Attentive Guidance (AG), a new mechanism to ...

Code Repositories


Simple language-driven navigation tasks for studying compositional learning
