Making Transformers Solve Compositional Tasks

08/09/2021 · by Santiago Ontañón, et al.

Several studies have reported the inability of Transformer models to generalize compositionally, a key type of generalization required in many NLP tasks such as semantic parsing. In this paper, we explore the design space of Transformer models, showing that the inductive biases given to the model by several design decisions significantly impact compositional generalization. Through this exploration, we identify Transformer configurations that generalize compositionally significantly better than previously reported in the literature on a diverse set of compositional tasks, and that achieve state-of-the-art results on a semantic parsing compositional generalization benchmark (COGS) and a string-edit-operation composition benchmark (PCFG).
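
To make the idea of "exploring the design space" concrete, below is a minimal, hypothetical Python/PyTorch sketch of how one might sweep a grid of Transformer configurations. The specific axes shown (position-encoding type, cross-layer weight sharing, a copy decoder) and the names `TransformerConfig` and `build_model` are illustrative assumptions; the abstract does not enumerate the exact design decisions studied.

```python
# Hypothetical sketch: enumerating Transformer design choices that could act as
# inductive biases for compositional generalization. Axis names are assumptions,
# not taken verbatim from the paper's abstract.
from dataclasses import dataclass
from itertools import product

import torch.nn as nn


@dataclass(frozen=True)
class TransformerConfig:
    d_model: int = 64
    n_heads: int = 4
    n_layers: int = 2
    position_encoding: str = "absolute"  # e.g. "absolute" or "relative" (assumed axis)
    share_layer_weights: bool = False    # tie parameters across layers (assumed axis)
    use_copy_decoder: bool = False       # copy mechanism over the input (assumed axis)


def build_model(cfg: TransformerConfig, vocab_size: int) -> nn.ModuleDict:
    # Plain encoder-decoder skeleton; only the size-related fields are consumed
    # here. The bias-related flags above would select custom encoder/decoder
    # variants in a full implementation.
    return nn.ModuleDict({
        "embed": nn.Embedding(vocab_size, cfg.d_model),
        "core": nn.Transformer(
            d_model=cfg.d_model,
            nhead=cfg.n_heads,
            num_encoder_layers=cfg.n_layers,
            num_decoder_layers=cfg.n_layers,
            batch_first=True,
        ),
        "head": nn.Linear(cfg.d_model, vocab_size),
    })


if __name__ == "__main__":
    # Enumerate a small grid of configurations; in an actual study, each one
    # would be trained and evaluated on compositional splits such as COGS or PCFG.
    for pe, share in product(["absolute", "relative"], [False, True]):
        cfg = TransformerConfig(position_encoding=pe, share_layer_weights=share)
        model = build_model(cfg, vocab_size=100)
        n_params = sum(p.numel() for p in model.parameters())
        print(cfg, f"{n_params} parameters")
```

The grid loop is the point of the sketch: each configuration is one point in the design space, and comparing their compositional-generalization scores is what reveals which inductive biases matter.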

Related research

06/20/2023 · On Evaluating Multilingual Compositional Generalization with Translated Datasets
Compositional generalization allows efficient learning and human-like in...

02/20/2022 · Understanding Robust Generalization in Learning Regular Languages
A key feature of human intelligence is the ability to generalize beyond ...

01/30/2022 · Compositionality as Lexical Symmetry
Standard deep network models lack the inductive biases needed to general...

11/09/2021 · Learning to Generalize Compositionally by Transferring Across Semantic Parsing Tasks
Neural network models often generalize poorly to mismatched domains or d...

02/15/2023 · On graph-based reentrancy-free semantic parsing
We propose a novel graph-based approach for semantic parsing that resolv...

10/08/2021 · Iterative Decoding for Compositional Generalization in Transformers
Deep learning models do well at generalizing to in-distribution data but...

10/10/2020 · Compressing Transformer-Based Semantic Parsing Models using Compositional Code Embeddings
The current state-of-the-art task-oriented semantic parsing models use B...
