A Neural-Symbolic Approach to Natural Language Tasks

10/29/2017
by Qiuyuan Huang, et al.

Deep learning (DL) has in recent years been widely used in natural language processing (NLP) applications due to its superior performance. However, while natural languages are rich in grammatical structure, DL has not been able to explicitly represent and enforce such structure. This paper proposes a new architecture that bridges this gap by exploiting tensor product representations (TPR), a structured neural-symbolic framework developed in cognitive science over the past 20 years, with the aim of integrating DL with explicit language structures and rules. We call this architecture the Tensor Product Generation Network (TPGN) and apply it to 1) image captioning, 2) part-of-speech classification of words, and 3) identification of the phrase structure of sentences. The key ideas of TPGN are: 1) unsupervised learning of role-unbinding vectors of words via a TPR-based deep neural network, and 2) integration of TPR with standard DL architectures, including Long Short-Term Memory (LSTM) models. The novelty of our approach lies in its ability to generate a sentence and extract partial grammatical structure from it using role-unbinding vectors obtained in an unsupervised manner. Experimental results demonstrate the effectiveness of the proposed approach.
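To make the TPR machinery concrete, the sketch below shows binding and unbinding in the tensor product representation framework that TPGN builds on: a symbolic structure is encoded as a superposition of filler-role outer products, and a filler is recovered by contracting the structure tensor with the role's unbinding vector (the role vector itself, when roles are orthonormal). The dimensions and toy data are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d_f, n_roles = 4, 3                                  # illustrative sizes
fillers = rng.normal(size=(n_roles, d_f))            # toy word fillers
# Rows of a square orthogonal matrix give orthonormal role vectors.
roles, _ = np.linalg.qr(rng.normal(size=(n_roles, n_roles)))

# Binding: superpose the outer products of each filler with its role.
# S has shape (d_f, n_roles) and encodes the whole structure in one tensor.
S = sum(np.outer(fillers[i], roles[i]) for i in range(n_roles))

# Unbinding: with orthonormal roles, the unbinding vector of role i is
# the role itself, so S @ roles[i] recovers filler i exactly.
assert np.allclose(S @ roles[0], fillers[0])
```

And here is one way the unbinding operation could drive caption generation with an LSTM, in the spirit of TPGN's second key idea: the LSTM proposes a role-unbinding vector at each step, and the next word's filler is read out by unbinding a structure tensor derived from the image features. This is a minimal sketch; the module names, dimensions, and the linear map from image features to the structure tensor are assumptions for illustration, not the paper's exact design.

```python
import torch
import torch.nn as nn

class TPGNSketch(nn.Module):
    """Hypothetical unbinding-driven caption decoder (illustrative only)."""
    def __init__(self, d_img=2048, d_f=256, d_r=32, d_h=512, vocab_size=10000):
        super().__init__()
        self.to_S = nn.Linear(d_img, d_f * d_r)    # image features -> structure tensor S
        self.lstm = nn.LSTMCell(d_f, d_h)          # consumes the previous filler
        self.to_u = nn.Linear(d_h, d_r)            # hidden state -> unbinding vector u_t
        self.to_word = nn.Linear(d_f, vocab_size)  # filler -> word logits
        self.d_f, self.d_r = d_f, d_r

    def forward(self, img_feat, steps=16):
        batch = img_feat.size(0)
        S = self.to_S(img_feat).view(batch, self.d_f, self.d_r)
        h = img_feat.new_zeros(batch, self.lstm.hidden_size)
        c = torch.zeros_like(h)
        f = img_feat.new_zeros(batch, self.d_f)    # start-of-sentence filler
        logits = []
        for _ in range(steps):
            h, c = self.lstm(f, (h, c))
            u = self.to_u(h)                                # propose u_t
            f = torch.bmm(S, u.unsqueeze(-1)).squeeze(-1)   # unbind: f_t = S u_t
            logits.append(self.to_word(f))
        return torch.stack(logits, dim=1)          # (batch, steps, vocab_size)
```

A greedy decode would take the argmax over the vocabulary dimension at each step; the sequence of u_t vectors is what carries the partial grammatical structure the abstract refers to.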

Related research

02/20/2018 · Attentive Tensor Product Learning for Language Generation and Grammar Parsing
This paper proposes a new architecture - Attentive Tensor Product Learni...

10/29/2018 · Learning Distributed Representations of Symbolic Structure Using Binding and Unbinding Operations
Widely used recurrent units, including Long-short Term Memory (LSTM) and...

10/29/2018 · A Simple Recurrent Unit with Reduced Tensor Product Representations
Widely used recurrent units, including Long-short Term Memory (LSTM) and ...

09/26/2017 · Tensor Product Generation Networks for Deep NLP Modeling
We present a new approach to the design of deep networks for natural lan...

01/11/2017 · De-identification In practice
We report our effort to identify the sensitive information, subset of da...

10/15/2019 · Exploring Overall Contextual Information for Image Captioning in Human-Like Cognitive Style
Image captioning is a research hotspot where encoder-decoder models comb...

10/01/2020 · How LSTM Encodes Syntax: Exploring Context Vectors and Semi-Quantization on Natural Text
Long Short-Term Memory recurrent neural network (LSTM) is widely used an...
