- Learning to Recombine and Resample Data for Compositional Generalization. Flexible neural models outperform grammar- and automaton-based counterpa...
- Compositional Generalization by Learning Analytical Expressions. Compositional generalization is a basic but essential intellective capab...
- Neural-Symbolic Integration: A Compositional Perspective. Despite significant progress in the development of neural-symbolic frame...
- Still not systematic after all these years: On the compositional skills of sequence-to-sequence recurrent networks. Humans can understand and produce new utterances effortlessly, thanks to...
- The Neural Network Pushdown Automaton: Model, Stack and Learning Simulations. In order for neural networks to learn complex languages or grammars, the...
- Sequence-Level Mixed Sample Data Augmentation. Despite their empirical success, neural networks still have difficulty c...
- Discovering the Compositional Structure of Vector Representations with Role Learning Networks. Neural networks (NNs) are able to perform tasks that rely on composition...
Compositional Generalization via Neural-Symbolic Stack Machines
Despite achieving tremendous success, existing deep learning models have exposed limitations in compositional generalization: the capability to learn compositional rules and apply them to unseen cases in a systematic manner. To tackle this issue, we propose the Neural-Symbolic Stack Machine (NeSS). It contains a neural network that generates execution traces, which are then executed by a symbolic stack machine enhanced with sequence manipulation operations. NeSS combines the expressive power of neural sequence models with the recursion supported by the symbolic stack machine. Without training supervision on execution traces, NeSS achieves 100% generalization performance in these domains: the SCAN benchmark of language-driven navigation tasks, the compositional machine translation benchmark, and context-free grammar parsing tasks.
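To make the neural/symbolic split concrete, here is a minimal sketch of a symbolic stack machine executing a trace. The operation names (PUSH, CONCAT, REPEAT) and their semantics are illustrative assumptions for exposition, not the paper's actual instruction set; in NeSS the trace would be produced by the neural component.

```python
# Minimal sketch of a symbolic stack machine with sequence-manipulation
# operations. Assumption: PUSH/CONCAT/REPEAT are hypothetical ops chosen
# for illustration; they are not NeSS's exact instruction set.

class StackMachine:
    def __init__(self):
        self.stack = []  # each entry is a list of output tokens

    def execute(self, trace):
        """Run a list of (op, *args) instructions and return the result."""
        for op, *args in trace:
            if op == "PUSH":       # push a new token sequence
                self.stack.append(list(args))
            elif op == "CONCAT":   # merge the top two sequences
                b, a = self.stack.pop(), self.stack.pop()
                self.stack.append(a + b)
            elif op == "REPEAT":   # repeat the top sequence n times
                n = args[0]
                self.stack.append(self.stack.pop() * n)
            else:
                raise ValueError(f"unknown op: {op}")
        return self.stack.pop()

# A trace a neural model might emit for a SCAN-style command like
# "jump twice":
machine = StackMachine()
out = machine.execute([("PUSH", "JUMP"), ("REPEAT", 2)])
# out == ["JUMP", "JUMP"]
```

Because the machine itself is purely symbolic, applying a learned rule such as REPEAT to a never-seen primitive generalizes by construction, which is the intuition behind pairing neural trace generation with symbolic execution.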