
Compositional Generalization for Primitive Substitutions

10/07/2019 · by Yuanpeng Li, et al. · Baidu, Inc.

Compositional generalization is a basic mechanism in human language learning, but current neural networks lack this ability. In this paper, we conduct fundamental research on encoding compositionality in neural networks. Conventional methods use a single representation for the input sentence, making it hard to apply prior knowledge of compositionality. In contrast, our approach leverages such knowledge with two representations: one generates attention maps, and the other maps attended input words to output symbols. We reduce the entropy in each representation to improve generalization. Our experiments demonstrate significant improvements over conventional methods on five NLP tasks, including instruction learning and machine translation. In the SCAN instruction-learning domain, the approach boosts accuracy from 14.0% to 99.7%. We hope the proposed approach can help ease future research toward human-level compositional language learning.
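To make the two-representation idea concrete, below is a minimal PyTorch sketch of one decoding step. It is not the authors' code, and all module and variable names are illustrative assumptions: one embedding of the input words is used only to produce an attention map for the current output position, while a separate embedding is used only to map the attended word to an output symbol.

```python
# Minimal sketch (illustrative, not the authors' implementation) of a decoder step
# that keeps two representations of the input: a "syntactic" one that only drives
# attention, and a "semantic" one that only maps attended words to output symbols.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoRepresentationDecoderStep(nn.Module):
    def __init__(self, vocab_in, vocab_out, d_syn=64, d_sem=64):
        super().__init__()
        self.syn_embed = nn.Embedding(vocab_in, d_syn)  # used only for attention
        self.sem_embed = nn.Embedding(vocab_in, d_sem)  # used only for symbol mapping
        self.query = nn.Linear(d_syn, d_syn)            # decoder state -> attention query
        self.out = nn.Linear(d_sem, vocab_out)          # attended word -> output-symbol logits

    def forward(self, src_tokens, dec_state):
        # src_tokens: (batch, src_len) input word ids
        # dec_state:  (batch, d_syn) decoder state for the current output position
        syn = self.syn_embed(src_tokens)                 # (batch, src_len, d_syn)
        sem = self.sem_embed(src_tokens)                 # (batch, src_len, d_sem)
        scores = torch.einsum('bld,bd->bl', syn, self.query(dec_state))
        attn = F.softmax(scores, dim=-1)                 # attention map from the syntactic side
        context = torch.einsum('bl,bld->bd', attn, sem)  # attend over the semantic side
        return self.out(context), attn                   # output-symbol logits and attention map
```

The entropy reduction mentioned in the abstract would act as an additional regularizer on these two representations; it is omitted from this sketch for brevity.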

Related research:

02/08/2021 · Concepts, Properties and an Approach for Compositional Generalization
Compositional generalization is the capacity to recognize and imagine a ...

07/17/2020 · Compositional Generalization in Semantic Parsing: Pre-training vs. Specialized Architectures
While mainstream machine learning methods are known to have limited abil...

12/12/2022 · Real-World Compositional Generalization with Disentangled Sequence-to-Sequence Learning
Compositional generalization is a basic mechanism in human language lear...

04/22/2019 · Compositional generalization in a deep seq2seq model by separating syntax and semantics
Standard methods in deep learning for natural language processing fail t...

01/19/2020 · Learning Compositional Neural Information Fusion for Human Parsing
This work proposes to combine neural networks with the compositional hie...

07/03/2021 · Can Transformers Jump Around Right in Natural Language? Assessing Performance Transfer from SCAN
Despite their practical success, modern seq2seq architectures are unable...

02/19/2019 · Measuring Compositionality in Representation Learning
Many machine learning algorithms represent input data with vector embedd...