Grounded Graph Decoding Improves Compositional Generalization in Question Answering

11/05/2021
by Yu Gai, et al.

Question answering models struggle to generalize to novel compositions of training patterns, such as longer sequences or more complex structures at test time. Current end-to-end models learn a flat input embedding that can lose the input's syntactic context. Prior approaches improve generalization by learning permutation-invariant models, but these methods do not scale to more complex train-test splits. We propose Grounded Graph Decoding, a method to improve compositional generalization of language representations by grounding structured predictions with an attention mechanism. Grounding enables the model to retain syntax information from the input, thereby significantly improving generalization over complex inputs. By predicting a structured graph containing conjunctions of query clauses, we learn a group-invariant representation without making assumptions about the target domain. Our model significantly outperforms state-of-the-art baselines on the Compositional Freebase Questions (CFQ) dataset, a challenging benchmark for compositional generalization in question answering. Moreover, we effectively solve the MCD1 split with 98% accuracy.
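The abstract does not spell out the grounding mechanism in detail, but the core idea — a structured decoder that attends over the encoded input tokens at each prediction step, so every predicted clause stays tied to input syntax — can be sketched as follows. All names (`grounded_attention`, the shapes, the toy data) are hypothetical illustrations, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def grounded_attention(decoder_state, encoder_states):
    """Ground one structured-prediction step in the input.

    decoder_state:  (d,)   current decoder hidden state
    encoder_states: (T, d) one encoding per input token

    Returns a context vector that mixes input-token encodings,
    plus the attention weights showing which tokens grounded it.
    """
    scores = encoder_states @ decoder_state      # (T,) dot-product scores
    weights = softmax(scores)                    # distribution over input tokens
    context = weights @ encoder_states           # (d,) grounded context vector
    return context, weights

# Toy example: 5 input tokens with 8-dimensional encodings.
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))
dec = rng.normal(size=(8,))
ctx, w = grounded_attention(dec, enc)
```

In a full model, the grounded context vector would condition the prediction of each graph node (a query clause), so the decoder cannot discard input syntax the way a flat sequence embedding can.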


Related research

- 07/01/2020: Latent Compositional Representations Improve Systematic Generalization in Grounded Question Answering
- 10/15/2020: Hierarchical Poset Decoding for Compositional Generalization in Language
- 09/22/2021: COVR: A test-bed for Visually Grounded Compositional Generalization with real images
- 12/10/2018: Learning Representations of Sets through Optimized Permutations
- 09/02/2021: Challenges in Generalization in Open Domain Question Answering
- 10/23/2022: When Can Transformers Ground and Compose: Insights from Compositional Generalization Benchmarks
- 11/16/2020: Beyond I.I.D.: Three Levels of Generalization for Question Answering on Knowledge Bases
