Attentive Tensor Product Learning for Language Generation and Grammar Parsing

02/20/2018
by Qiuyuan Huang, et al.

This paper proposes a new architecture, Attentive Tensor Product Learning (ATPL), to represent grammatical structures in deep learning models. ATPL bridges the gap between deep learning and explicit language structure by exploiting Tensor Product Representations (TPR), a structured neural-symbolic model developed in cognitive science, with the aim of integrating deep learning with explicit linguistic structures and rules. The key ideas of ATPL are: 1) unsupervised learning of role-unbinding vectors of words via a TPR-based deep neural network; 2) employing attention modules to compute the TPR; and 3) integrating TPR with standard deep learning architectures, including Long Short-Term Memory (LSTM) networks and feedforward neural networks (FFNN). The novelty of our approach lies in its ability to extract the grammatical structure of a sentence using role-unbinding vectors, which are obtained in an unsupervised manner. ATPL is applied to 1) image captioning, 2) part-of-speech (POS) tagging, and 3) constituency parsing of a sentence. Experimental results demonstrate the effectiveness of the proposed approach.
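The binding and unbinding operations that TPR-based models build on can be sketched in a few lines of NumPy. This is a minimal illustration under simplifying assumptions (random filler vectors, orthonormal role vectors), not the paper's learned ATPL embeddings: a sentence is encoded as a sum of filler-role outer products, and a role-unbinding vector recovers the corresponding word (filler) vector.

```python
import numpy as np

rng = np.random.default_rng(0)

d_f, d_r, n = 8, 5, 3                      # filler dim, role dim, number of words
fillers = rng.standard_normal((n, d_f))    # word (filler) vectors, assumed given

# Orthonormal role vectors (rows): with orthonormal roles, the unbinding
# vector for a role is the role vector itself.
q, _ = np.linalg.qr(rng.standard_normal((d_r, n)))
roles = q.T                                # shape (n, d_r)

# Binding: the TPR of the sentence is the sum of filler-role outer products.
S = sum(np.outer(f, r) for f, r in zip(fillers, roles))   # shape (d_f, d_r)

# Unbinding: multiplying by the role-unbinding vector recovers the filler,
# since the cross terms vanish under orthonormality.
recovered = S @ roles[1]
assert np.allclose(recovered, fillers[1])
```

In ATPL the role-unbinding vectors are not fixed like this; they are learned without supervision, which is what lets the model expose a sentence's grammatical structure.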


Related research:

- A Neural-Symbolic Approach to Natural Language Tasks (10/29/2017)
- Tensor Product Generation Networks (09/26/2017)
- Simple, Fast Semantic Parsing with a Tensor Kernel (07/02/2015)
- A Simple Recurrent Unit with Reduced Tensor Product Representations (10/29/2018)
- Tensor Product Generation Networks for Deep NLP Modeling (09/26/2017)
- Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars (09/10/2021)
- Context Aware Machine Learning (01/10/2019)
