
Learning compositionally through attentive guidance

05/20/2018
by   Dieuwke Hupkes, et al.

In this paper, we introduce Attentive Guidance (AG), a new mechanism to direct a sequence-to-sequence model equipped with attention towards more compositional solutions that generalise even when the training and testing distributions strongly diverge. We test AG on two tasks devised precisely to assess the compositional capabilities of neural models, and show that vanilla sequence-to-sequence models with attention overfit the training distribution, while the guided versions find compositional solutions that, in some cases, fit the training and testing distributions equally well. AG is a simple and intuitive method for providing a learning bias to a sequence-to-sequence model without adding extra components, and we believe it injects into the training process an ingredient that is also present in human learning: guidance.
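The abstract describes AG as a learning bias imposed on the attention mechanism rather than a new architectural component. One way to realise such a bias is to supervise the decoder's attention distributions with target alignments via an auxiliary loss added to the task loss. The sketch below illustrates this idea in plain NumPy; the function names, the toy alignment, and the weighting factor are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def attention_guidance_loss(attn, target_positions, eps=1e-9):
    """Auxiliary guidance loss: cross-entropy between the model's
    attention distributions (one row per decoder step) and target
    one-hot alignments given as input positions per step."""
    steps = np.arange(len(target_positions))
    # negative log-probability mass placed on the desired input position
    return float(-np.mean(np.log(attn[steps, target_positions] + eps)))

# toy example: 3 decoder steps attending over 4 input positions
attn = np.array([
    [0.7, 0.1, 0.1, 0.1],
    [0.1, 0.8, 0.05, 0.05],
    [0.2, 0.1, 0.6, 0.1],
])
target = np.array([0, 1, 2])  # desired input position per decoder step

guidance = attention_guidance_loss(attn, target)
task_loss = 1.25              # placeholder for the usual task loss
total_loss = task_loss + 1.0 * guidance  # weight 1.0 is an assumed choice
```

Training on `total_loss` pushes the attention weights toward the target alignment while the task loss is optimised as usual, which matches the abstract's claim that no extra model components are needed.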


Related research:

06/04/2019 · On the Realization of Compositionality in Neural Networks
We present a detailed comparison of two types of sequence to sequence mo...

06/04/2019 · Transcoding compositionally: using attention to find more generalizable solutions
While sequence-to-sequence models have shown remarkable generalization p...

06/12/2019 · Compositional generalization through meta sequence-to-sequence learning
People can learn a new concept and use it compositionally, understanding...

09/02/2021 · Sequence-to-Sequence Learning with Latent Neural Grammars
Sequence-to-sequence learning with neural networks has become the de fac...

11/07/2018 · Promising Accurate Prefix Boosting for sequence-to-sequence ASR
In this paper, we present promising accurate prefix boosting (PAPB), a d...

11/28/2017 · Plan, Attend, Generate: Planning for Sequence-to-Sequence Models
We investigate the integration of a planning mechanism into sequence-to-...