Transcoding compositionally: using attention to find more generalizable solutions

06/04/2019
by Kris Korrel et al.

While sequence-to-sequence models have shown remarkable generalization power across several natural language tasks, the solutions they construct are argued to be less compositional than those underlying human-like generalization. In this paper, we present seq2attn, a new architecture that is specifically designed to exploit attention to find compositional patterns in the input. In seq2attn, the two standard components of an encoder-decoder model are connected via a transcoder that modulates the information flow between them. We show that seq2attn can successfully generalize, without requiring any additional supervision, on two tasks which are specifically constructed to challenge the compositional skills of neural networks. The solutions found by the model are highly interpretable, allowing easy analysis of both the types of solutions that are found and potential causes of mistakes. We exploit this opportunity to introduce a new paradigm for testing compositionality, which studies the extent to which a model overgeneralizes when confronted with exceptions. We show that seq2attn exhibits such overgeneralization to a larger degree than a standard sequence-to-sequence model.
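The core mechanism the abstract describes is a transcoder that attends over encoder states and passes only the attention-weighted context on to the decoder. A minimal sketch of such attention-based modulation is shown below; this is an illustration of content-based attention in general, not the authors' implementation, and the function and variable names (`attend`, `transcoder_state`) are ours.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attend(encoder_states, transcoder_state):
    """Compute attention weights over encoder states and the context vector.

    encoder_states:   (T, d) array, one hidden state per input token
    transcoder_state: (d,) query vector from the transcoder

    Returns (weights, context): weights sum to 1 over the T input
    positions; context is the weighted sum of encoder states, which is
    the only information the decoder would receive in this scheme.
    """
    scores = encoder_states @ transcoder_state   # dot-product scores, shape (T,)
    weights = softmax(scores)                    # attention distribution
    context = weights @ encoder_states           # context vector, shape (d,)
    return weights, context

# Tiny example: a sharply peaked query attends almost fully to position 0.
enc = np.array([[1.0, 0.0], [0.0, 1.0]])
w, ctx = attend(enc, np.array([10.0, 0.0]))
```

Because the decoder sees only `context`, a sparse attention distribution forces it to condition on a small, interpretable subset of the input, which is what makes the learned solutions easy to inspect.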


Related research

- 06/19/2021: Improving Compositional Generalization in Classification Tasks via Structure Annotations
- 10/09/2021: Disentangled Sequence to Sequence Learning for Compositional Generalization
- 05/20/2018: Learning compositionally through attentive guidance
- 12/12/2022: Real-World Compositional Generalization with Disentangled Sequence-to-Sequence Learning
- 06/04/2019: On the Realization of Compositionality in Neural Networks
- 03/14/2020: Synonymous Generalization in Sequence-to-Sequence Recurrent Networks
- 07/20/2023: Layer-wise Representation Fusion for Compositional Generalization
