
Learning compositionally through attentive guidance

by   Dieuwke Hupkes, et al.

In this paper, we introduce Attentive Guidance (AG), a new mechanism to direct a sequence-to-sequence model equipped with attention to find more compositional solutions that generalise even in cases where the training and testing distributions strongly diverge. We test AG on two tasks devised precisely to assess the compositional capabilities of neural models, and show how vanilla sequence-to-sequence models with attention overfit the training distribution, while the guided versions come up with compositional solutions that, in some cases, fit the training and testing distributions equally well. AG is a simple and intuitive method to provide a learning bias to a sequence-to-sequence model without the need to include extra components. We believe it allows injecting into the training process a component that is also present in human learning: guidance.
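The guidance signal can be realised as an auxiliary loss term that pulls the model's attention distributions toward a target alignment pattern, added to the ordinary task loss during training. A minimal NumPy sketch of this idea (the function names, the cross-entropy formulation, and the `ag_weight` parameter are illustrative assumptions, not the paper's exact implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def guided_loss(task_loss, attn_weights, target_pattern, ag_weight=1.0):
    """Combine the task loss with an attentive-guidance term.

    attn_weights:   (T_out, T_in) attention distributions of the decoder,
                    one row per output step, each row summing to 1.
    target_pattern: (T_out, T_in) gold alignment pattern (rows one-hot or
                    normalised distributions); hypothetical supervision
                    signal of the kind AG assumes is available.
    The guidance term is the mean cross-entropy between the predicted
    attention and the target pattern, so gradient descent pushes the
    attention toward the desired compositional alignment.
    """
    ce = -np.sum(target_pattern * np.log(attn_weights + 1e-12), axis=1)
    return task_loss + ag_weight * ce.mean()

# Usage: attention that matches the gold alignment incurs a smaller
# guidance penalty than diffuse, unaligned attention.
target = np.eye(3)                       # monotone one-to-one alignment
aligned = softmax(5.0 * np.eye(3))       # sharply peaked on the diagonal
diffuse = softmax(np.ones((3, 3)))       # uniform attention
print(guided_loss(0.0, aligned, target) < guided_loss(0.0, diffuse, target))
```

Because the extra term only reshapes the existing attention weights, no new model components are needed, matching the paper's claim that AG biases learning without architectural changes.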




On the Realization of Compositionality in Neural Networks

We present a detailed comparison of two types of sequence to sequence mo...

Transcoding compositionally: using attention to find more generalizable solutions

While sequence-to-sequence models have shown remarkable generalization p...

Compositional generalization through meta sequence-to-sequence learning

People can learn a new concept and use it compositionally, understanding...

Sequence-to-Sequence Learning with Latent Neural Grammars

Sequence-to-sequence learning with neural networks has become the de fac...

Promising Accurate Prefix Boosting for sequence-to-sequence ASR

In this paper, we present promising accurate prefix boosting (PAPB), a d...

Plan, Attend, Generate: Planning for Sequence-to-Sequence Models

We investigate the integration of a planning mechanism into sequence-to-...