
Compositional generalization through meta sequence-to-sequence learning

06/12/2019
by Brenden M. Lake

People can learn a new concept and use it compositionally, understanding how to "blicket twice" after learning how to "blicket." In contrast, powerful sequence-to-sequence (seq2seq) neural networks fail such tests of compositionality, especially when composing new concepts together with existing concepts. In this paper, I show that neural networks can be trained to generalize compositionally through meta seq2seq learning. In this approach, models train on a series of seq2seq problems to acquire the compositional skills needed to solve new seq2seq problems. Meta seq2seq learning solves several of the SCAN tests for compositional learning and can learn to apply rules to variables.
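To make the episodic setup concrete, here is a minimal sketch of meta-training on a toy SCAN-like problem. This is a hypothetical illustration, not the paper's model: the actual architecture is a memory-augmented seq2seq network that decodes full output sequences with attention over the support set, whereas this toy (the vocabulary and the `make_episode`/`MetaLearner` helpers are invented here) reduces each query to retrieving a primitive's current meaning.

```python
# Hypothetical sketch of episodic meta seq2seq training on a toy task.
# Not the paper's architecture; names and vocabulary are illustrative.
import random
import torch
import torch.nn as nn

PRIMITIVES = ["dax", "wif", "lug", "blicket"]   # pseudo-word inputs
ACTIONS = ["RED", "GREEN", "BLUE", "YELLOW"]    # output symbols

def make_episode():
    """One seq2seq problem: a fresh primitive-to-action assignment.

    Support pairs show each primitive in isolation; query pairs require
    composing it with a fixed rule ("x twice" -> X X), so the model must
    read the primitive's current meaning off the support set.
    """
    assignment = dict(zip(PRIMITIVES, random.sample(ACTIONS, len(ACTIONS))))
    support = [([p], [assignment[p]]) for p in PRIMITIVES]
    query = [([p, "twice"], [assignment[p]] * 2) for p in PRIMITIVES]
    return support, query

class MetaLearner(nn.Module):
    """Toy key-value memory: attend from the query primitive over the
    support inputs and read out the matching support output's embedding."""
    def __init__(self, n_tokens, n_actions, dim=32):
        super().__init__()
        self.tok_emb = nn.Embedding(n_tokens, dim)
        self.act_emb = nn.Embedding(n_actions, dim)
        self.out = nn.Linear(dim, n_actions)

    def forward(self, query_tok, support_toks, support_acts):
        q = self.tok_emb(query_tok)            # (dim,)
        keys = self.tok_emb(support_toks)      # (n_support, dim)
        vals = self.act_emb(support_acts)      # (n_support, dim)
        attn = torch.softmax(keys @ q, dim=0)  # soft match over support
        return self.out(attn @ vals)           # logits over actions

tok2id = {t: i for i, t in enumerate(PRIMITIVES)}
act2id = {a: i for i, a in enumerate(ACTIONS)}
model = MetaLearner(len(tok2id), len(act2id))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)

for step in range(500):                        # one episode per step
    support, query = make_episode()
    s_toks = torch.tensor([tok2id[inp[0]] for inp, _ in support])
    s_acts = torch.tensor([act2id[out[0]] for _, out in support])
    loss = torch.tensor(0.0)
    for q_in, q_out in query:
        # Predict the action underlying "<primitive> twice"; the "twice"
        # rule itself is deterministic duplication in this toy setup.
        logits = model(torch.tensor(tok2id[q_in[0]]), s_toks, s_acts)
        target = torch.tensor([act2id[q_out[0]]])
        loss = loss + nn.functional.cross_entropy(logits.unsqueeze(0), target)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because the primitive-to-action assignment is resampled every episode, memorizing any fixed mapping fails; the only strategy that generalizes is the compositional one of looking the primitive up in the support set and then applying the rule.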

Related research:

Meta Sequence Learning and Its Applications (10/04/2020)
We present a meta-sequence representation of sentences and demonstrate h...

The compositionality of neural networks: integrating symbolism and connectionism (08/22/2019)
Despite a multitude of empirical studies, little consensus exists on whe...

Synonymous Generalization in Sequence-to-Sequence Recurrent Networks (03/14/2020)
When learning a language, people can quickly expand their understanding ...

Jump to better conclusions: SCAN both left and right (09/12/2018)
Lake and Baroni (2018) recently introduced the SCAN data set, which cons...

Real-World Compositional Generalization with Disentangled Sequence-to-Sequence Learning (12/12/2022)
Compositional generalization is a basic mechanism in human language lear...

Learning compositionally through attentive guidance (05/20/2018)
In this paper, we introduce Attentive Guidance (AG), a new mechanism to ...