
Compositional Generalization via Neural-Symbolic Stack Machines

by   Xinyun Chen, et al.

Despite achieving tremendous success, existing deep learning models have exposed limitations in compositional generalization, the capability to learn compositional rules and apply them to unseen cases in a systematic manner. To tackle this issue, we propose the Neural-Symbolic Stack Machine (NeSS). It contains a neural network that generates traces, which are then executed by a symbolic stack machine enhanced with sequence manipulation operations. NeSS combines the expressive power of neural sequence models with the recursion supported by the symbolic stack machine. Without training supervision on execution traces, NeSS achieves 100% generalization performance across several domains, including the SCAN benchmark of language-driven navigation tasks, the compositional machine translation benchmark, and context-free grammar parsing tasks.
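To make the architecture concrete, here is a minimal sketch of a symbolic stack machine that executes a trace of sequence-manipulation operations, in the spirit of the NeSS description above. The operation names (PUSH, CONCAT, REPEAT) and the trace format are illustrative assumptions, not the paper's actual instruction set; in NeSS the trace would be produced by the learned neural controller rather than written by hand.

```python
# Hypothetical sketch of a symbolic stack machine with sequence-manipulation
# operations. The op names and trace encoding are assumptions for
# illustration, not the operations defined in the NeSS paper.

class StackMachine:
    def __init__(self):
        self.stack = []  # each entry is a token sequence (list of str)

    def execute(self, trace):
        """Run a trace of (op, *args) tuples and return the top sequence."""
        for op, *args in trace:
            if op == "PUSH":       # push a literal token sequence
                self.stack.append(list(args))
            elif op == "CONCAT":   # join the top two sequences in order
                b, a = self.stack.pop(), self.stack.pop()
                self.stack.append(a + b)
            elif op == "REPEAT":   # repeat the top sequence n times
                n = args[0]
                self.stack.append(self.stack.pop() * n)
            else:
                raise ValueError(f"unknown op: {op}")
        return self.stack[-1]

# A trace a neural controller might emit for an input like "jump twice":
machine = StackMachine()
print(machine.execute([("PUSH", "JUMP"), ("REPEAT", 2)]))  # ['JUMP', 'JUMP']
```

Because the machine itself is symbolic and deterministic, recursion and repetition generalize to counts never seen during training; only the controller that chooses the operations is learned.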



