Sequence to Logic with Copy and Cache

07/19/2018
by Javid Dadashkarimi, et al.

Generating the logical-form equivalents of natural-language utterances is an emerging application of neural architectures, where long short-term memory (LSTM) units effectively capture dependencies in both the encoder and the decoder. The logical form usually preserves information from the natural-language side in the form of similar tokens, and a copying mechanism has recently been proposed that increases the probability of emitting tokens from the source input during decoding. In this paper we propose a caching mechanism as a more general form of copying: it weighs all words in the source vocabulary according to their relation to the current decoding context. Our results confirm that the proposed method improves sequence-level and token-level accuracy on sequence-to-logical-form tasks. Further experiments on cross-domain adversarial attacks show substantial improvements when the most influential examples from other domains are used for training.
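As a rough illustration (not the paper's implementation), the sketch below shows one way such a cache could work inside a decoder step: every word in the source vocabulary gets a cached embedding that is scored against the current decoding context, and the resulting distribution is mixed with the ordinary generation distribution through a learned gate, in the spirit of pointer/copy networks. All names here (CacheDecoderStep, ctx_dim, emb_dim) are illustrative assumptions.

```python
# Minimal sketch of a cache mechanism generalizing copying: instead of only
# boosting tokens present in the current source sequence, every word in the
# source vocabulary is scored against the decoder context.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CacheDecoderStep(nn.Module):
    def __init__(self, vocab_size: int, ctx_dim: int, emb_dim: int):
        super().__init__()
        self.cache_emb = nn.Embedding(vocab_size, emb_dim)  # one cached entry per source word
        self.proj = nn.Linear(ctx_dim, emb_dim)             # map context into cache space
        self.gate = nn.Linear(ctx_dim, 1)                   # mix generation vs. cache

    def forward(self, context: torch.Tensor, gen_logits: torch.Tensor) -> torch.Tensor:
        # context: (batch, ctx_dim) decoder state plus attention summary
        # gen_logits: (batch, vocab) ordinary generation scores
        proj = self.proj(context)                           # (batch, emb_dim)
        cache_logits = proj @ self.cache_emb.weight.t()     # score ALL vocabulary words
        p_gen = torch.sigmoid(self.gate(context))           # (batch, 1), broadcasts below
        # Gated mixture of the generator and the cache distribution.
        return (p_gen * F.softmax(gen_logits, dim=-1)
                + (1 - p_gen) * F.softmax(cache_logits, dim=-1))

# Usage: output rows are valid probability distributions over the vocabulary.
step = CacheDecoderStep(vocab_size=5000, ctx_dim=256, emb_dim=128)
probs = step(torch.randn(2, 256), torch.randn(2, 5000))     # (2, 5000), rows sum to 1
```

Scoring the whole vocabulary, rather than only the tokens present in the current input, is what makes the cache a generalization of copying: source tokens can still receive high scores through their cached embeddings, but related words outside the current input can be promoted as well.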


Related research

Long Short-Term Memory-Networks for Machine Reading (01/25/2016)
In this paper we address the question of how to render sequence-level ne...

Encoder-decoder with Focus-mechanism for Sequence Labelling Based Spoken Language Understanding (08/06/2016)
This paper investigates the framework of encoder-decoder with attention ...

Parallel Iterative Edit Models for Local Sequence Transduction (10/07/2019)
We present a Parallel Iterative Edit (PIE) model for the problem of loca...

Translating Mathematical Formula Images to LaTeX Sequences Using Deep Neural Networks with Sequence-level Training (08/29/2019)
In this paper we propose a deep neural network model with an encoder-dec...

Diverse Keyphrase Generation with Neural Unlikelihood Training (10/15/2020)
In this paper, we study sequence-to-sequence (S2S) keyphrase generation ...

BioCopy: A Plug-And-Play Span Copy Mechanism in Seq2Seq Models (09/26/2021)
Copy mechanisms explicitly obtain unchanged tokens from the source (inpu...

Efficient Summarization with Read-Again and Copy Mechanism (11/10/2016)
Encoder-decoder models have been widely used to solve sequence to sequen...
