Natural- to formal-language generation using Tensor Product Representations

10/05/2019
by Kezhen Chen, et al.

Generating a formal language represented by relational tuples, such as Lisp programs or mathematical expressions, from natural-language input is an extremely challenging task because it requires explicitly capturing discrete symbolic structural information from the input in order to generate the output. Most state-of-the-art neural sequence models do not explicitly capture such structural information and thus do not perform well on these tasks. In this paper, we propose a new encoder-decoder model based on Tensor Product Representations (TPRs) for natural- to formal-language generation, called TP-N2F. The encoder of TP-N2F employs TPR 'binding' to encode natural-language symbolic structure in vector space, and the decoder uses TPR 'unbinding' to generate, in symbolic space, a sequence of relational tuples, each consisting of a relation (or operation) and a number of arguments. TP-N2F considerably outperforms LSTM-based Seq2Seq models, setting new state-of-the-art results on two benchmarks: the MathQA dataset for math problem solving and the AlgoLisp dataset for program synthesis. Ablation studies show that the improvements are mainly attributable to the use of TPRs in both the encoder and the decoder to explicitly capture relational structural information for symbolic reasoning.
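The TPR 'binding' and 'unbinding' operations at the heart of TP-N2F can be illustrated with a minimal NumPy sketch. This is an assumption-laden toy, not the paper's architecture: in TP-N2F the filler and role vectors are learned, whereas here we draw random fillers and hand-pick orthonormal roles so that unbinding recovers each filler exactly. Binding sums the outer products of filler (symbol) and role (structural position) vectors; unbinding contracts the resulting tensor with a role's dual vector, which for orthonormal roles is the role vector itself.

```python
import numpy as np

# Toy dimensions (hypothetical; the paper's learned embeddings are larger).
d_filler, d_role = 4, 3
rng = np.random.default_rng(0)

# Two filler vectors, standing in for two symbols to be stored.
fillers = rng.normal(size=(2, d_filler))

# Orthonormal role vectors: rows of a random orthogonal matrix from QR.
Q, _ = np.linalg.qr(rng.normal(size=(d_role, d_role)))
roles = Q[:2]  # two roles, e.g. "first argument" and "second argument"

# Binding: the TPR is the sum of outer products filler_i (x) role_i.
T = sum(np.outer(f, r) for f, r in zip(fillers, roles))

# Unbinding: contracting T with a role's dual vector recovers its filler
# (exactly, because the roles are orthonormal).
recovered = T @ roles[0]
print(np.allclose(recovered, fillers[0], atol=1e-8))
```

Because the whole structure lives in one tensor `T`, a network can encode a symbolic structure as a single continuous vector-space object and later query individual positions by unbinding, which is the mechanism the abstract attributes to the TP-N2F encoder and decoder respectively.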


Related research

- 02/25/2019: Pretraining-Based Natural Language Generation for Text Summarization
- 09/26/2017: Tensor Product Generation Networks
- 11/02/2018: Semantically-Aligned Equation Generation for Solving and Reasoning Math Word Problems
- 10/13/2022: Graph-based Neural Modules to Inspect Attention-based Architectures: A Position Paper
- 02/06/2023: Techniques to Improve Neural Math Word Problem Solvers
- 03/10/2018: Learning and analyzing vector encoding of symbolic representations
- 11/22/2020: Modelling Compositionality and Structure Dependence in Natural Language
