
Learning to Generalize Compositionally by Transferring Across Semantic Parsing Tasks

11/09/2021
by   Wang Zhu, et al.

Neural network models often generalize poorly to mismatched domains or distributions. In NLP, this issue arises in particular when models are expected to generalize compositionally, that is, to novel combinations of familiar words and constructions. We investigate learning representations that facilitate transfer learning from one compositional task to another: the representation and the task-specific layers of the model are strategically trained differently on a pre-finetuning task so that they generalize well to mismatched splits that require compositionality. We apply this method to semantic parsing on three very different datasets, COGS, GeoQuery and SCAN, which are used alternately as the pre-finetuning and target tasks. Our method significantly improves compositional generalization over baselines on the test set of the target task, which is held out during fine-tuning. Ablation studies characterize the utility of the major steps in the proposed algorithm and support our hypothesis.
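As a rough illustration of the pre-finetuning-then-transfer setup described in the abstract (not the authors' exact algorithm), the following PyTorch sketch trains a shared representation on a source task and then reinitializes and fine-tunes only a task-specific head on the target task. The architecture, dimensions, and toy data here are illustrative assumptions.

import torch
import torch.nn as nn

class Parser(nn.Module):
    """Toy parser: shared representation layers plus a task-specific head."""
    def __init__(self, vocab_size=1000, hidden=128, num_output_symbols=500):
        super().__init__()
        # Shared representation: the part intended to transfer across tasks.
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        # Task-specific layer: reinitialized for each new task.
        self.head = nn.Linear(hidden, num_output_symbols)

    def forward(self, tokens):
        states, _ = self.encoder(self.embed(tokens))
        return self.head(states)

def train(model, batches, params, epochs=3, lr=1e-3):
    """Run a few epochs of cross-entropy training, updating only `params`."""
    opt = torch.optim.Adam(params, lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for tokens, labels in batches:
            loss = loss_fn(model(tokens).flatten(0, 1), labels.flatten())
            opt.zero_grad()
            loss.backward()
            opt.step()

# Toy stand-ins for the pre-finetuning and target tasks (random data for illustration).
source_batches = [(torch.randint(0, 1000, (8, 6)), torch.randint(0, 500, (8, 6)))]
target_batches = [(torch.randint(0, 1000, (8, 6)), torch.randint(0, 500, (8, 6)))]

model = Parser()

# Stage 1: pre-finetuning on the source task, updating all parameters so the
# shared representation is shaped on a split that requires compositionality.
train(model, source_batches, model.parameters())

# Stage 2: transfer to the target task. Reset the task-specific head and
# fine-tune it while keeping the transferred representation (here frozen).
model.head.reset_parameters()
for p in list(model.embed.parameters()) + list(model.encoder.parameters()):
    p.requires_grad_(False)
train(model, target_batches, model.head.parameters())

In the paper, the representation and the task-specific layers are trained with different strategies during pre-finetuning so that the learned representation transfers to mismatched compositional splits; the freeze-and-reset split above is only one simple way to realize that separation.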


Related research

07/17/2020 · Compositional Generalization in Semantic Parsing: Pre-training vs. Specialized Architectures
While mainstream machine learning methods are known to have limited abil...

08/09/2021 · Making Transformers Solve Compositional Tasks
Several studies have reported the inability of Transformer models to gen...

05/24/2022 · Evaluating the Impact of Model Scale for Compositional Generalization in Semantic Parsing
Despite their strong performance on many tasks, pre-trained language mod...

11/15/2022 · On the Compositional Generalization Gap of In-Context Learning
Pretrained large generative language models have shown great performance...

10/12/2020 · COGS: A Compositional Generalization Challenge Based on Semantic Interpretation
Natural language is characterized by compositionality: the meaning of a ...

06/13/2020 · MetaPerturb: Transferable Regularizer for Heterogeneous Tasks and Architectures
Regularization and transfer learning are two popular techniques to enhan...

03/16/2022 · Structurally Diverse Sampling Reduces Spurious Correlations in Semantic Parsing Datasets
A rapidly growing body of research has demonstrated the inability of NLP...