Revisiting the Compositional Generalization Abilities of Neural Sequence Models

03/14/2022
by Arkil Patel, et al.

Compositional generalization is a fundamental human trait, allowing us to effortlessly combine known phrases into novel sentences. Recent work has claimed that standard sequence-to-sequence models severely lack the ability to generalize compositionally. In this paper, we focus on one-shot primitive generalization as introduced by the popular SCAN benchmark. We demonstrate that modifying the training distribution in simple and intuitive ways enables standard sequence-to-sequence models to achieve near-perfect generalization performance, showing that their compositional generalization abilities were previously underestimated. We perform a detailed empirical analysis of this phenomenon. Our results indicate that the generalization performance of models is highly sensitive to the characteristics of the training data, which should be carefully considered when designing such benchmarks in the future.
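To make the SCAN-style setup concrete, the sketch below interprets a small, illustrative subset of SCAN-like commands (the real benchmark has a richer grammar, e.g. "around", "opposite", and conjunctions). Each command maps deterministically to an action sequence, so a model trained on, say, "walk twice" and a single example of "jump" must compose them to handle "jump twice" at test time.

```python
# Minimal, illustrative interpreter for a SCAN-like command subset.
# The mappings below follow the benchmark's conventions, but this is a
# simplified sketch, not the full SCAN grammar.

PRIMITIVES = {"jump": "JUMP", "walk": "WALK", "run": "RUN", "look": "LOOK"}
TURNS = {"left": "LTURN", "right": "RTURN"}


def interpret(command: str) -> list[str]:
    """Translate a simplified command into its action sequence."""
    words = command.split()
    actions = [PRIMITIVES[words[0]]]
    rest = words[1:]
    # A direction prepends a turn action: "walk left" -> LTURN WALK
    if rest and rest[0] in TURNS:
        actions = [TURNS[rest[0]]] + actions
        rest = rest[1:]
    # Repetition modifiers repeat the whole sub-sequence.
    if rest and rest[0] == "twice":
        actions = actions * 2
    elif rest and rest[0] == "thrice":
        actions = actions * 3
    return actions


print(interpret("jump"))             # ['JUMP']
print(interpret("walk left twice"))  # ['LTURN', 'WALK', 'LTURN', 'WALK']
```

One-shot primitive generalization asks whether a model that sees "jump" only in isolation can still produce the correct output for compositions like "jump thrice", which humans find trivial.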

Related research

04/07/2022
Compositional Generalization and Decomposition in Neural Program Synthesis
When writing programs, people have the ability to tackle a new complex t...

02/24/2022
Compositional Generalization Requires Compositional Parsers
A rapidly growing body of research on compositional generalization inves...

06/05/2023
Learning to Substitute Spans towards Improving Compositional Generalization
Despite the rising prevalence of neural sequence models, recent empirica...

10/12/2020
COGS: A Compositional Generalization Challenge Based on Semantic Interpretation
Natural language is characterized by compositionality: the meaning of a ...

05/29/2023
Vector-based Representation is the Key: A Study on Disentanglement and Compositional Generalization
Recognizing elementary underlying concepts from observations (disentangl...

12/08/2020
Revisiting Iterative Back-Translation from the Perspective of Compositional Generalization
Human intelligence exhibits compositional generalization (i.e., the capa...

05/21/2019
CNNs found to jump around more skillfully than RNNs: Compositional generalization in seq2seq convolutional networks
Lake and Baroni (2018) introduced the SCAN dataset probing the ability o...
