Tensor Product Generation Networks

09/26/2017
by Qiuyuan Huang et al.

We present a new tensor product generation network (TPGN) that generates natural language descriptions for images. The model has a novel architecture that instantiates a general framework for encoding and processing symbolic structure through neural network computation. This framework is built on Tensor Product Representations (TPRs). We evaluate the proposed TPGN on the MS COCO image captioning task. The experimental results show that the TPGN outperforms the LSTM-based state-of-the-art baseline by a significant margin. Further, we show that our caption generation model can be interpreted as generating sequences of grammatical categories and retrieving words by their categories from a plan encoded as a distributed representation.
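The core TPR idea behind this framework is that a symbolic structure can be encoded as a sum of outer products of "filler" vectors (e.g. words) and "role" vectors (e.g. positions or grammatical categories), and a filler can later be retrieved by its role. The sketch below illustrates binding and unbinding with NumPy, assuming orthonormal role vectors; the dimensions and variable names are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

d_filler, d_role, n = 8, 4, 3  # filler dim, role dim, number of bound pairs

# Orthonormal role vectors (columns of a random orthogonal matrix)
# make unbinding exact rather than approximate.
roles = np.linalg.qr(rng.standard_normal((d_role, d_role)))[0][:, :n]
fillers = rng.standard_normal((d_filler, n))  # one filler vector per role

# Bind: the TPR is the sum of outer products filler_i (x) role_i,
# a single distributed representation of the whole structure.
T = sum(np.outer(fillers[:, i], roles[:, i]) for i in range(n))

# Unbind: multiplying the TPR by a role vector retrieves the filler
# that was bound to that role.
recovered = T @ roles[:, 1]
assert np.allclose(recovered, fillers[:, 1])
```

With orthonormal roles the retrieval is exact; with merely linearly independent roles one would unbind using the dual (pseudo-inverse) role vectors instead.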


Related research

09/26/2017: Tensor Product Generation Networks for Deep NLP Modeling
We present a new approach to the design of deep networks for natural lan...

12/17/2018: Feature Fusion Effects of Tensor Product Representation on (De)Compositional Network for Caption Generation for Images
Progress in image captioning is gradually getting complex as researchers...

02/20/2018: Attentive Tensor Product Learning for Language Generation and Grammar Parsing
This paper proposes a new architecture - Attentive Tensor Product Learni...

10/05/2019: Natural- to formal-language generation using Tensor Product Representations
Generating formal-language represented by relational tuples, such as Lis...

11/22/2019: TPsgtR: Neural-Symbolic Tensor Product Scene-Graph-Triplet Representation for Image Captioning
Image captioning can be improved if the structure of the graphical repre...

04/30/2019: PR Product: A Substitute for Inner Product in Neural Networks
In this paper, we analyze the inner product of weight vector and input v...
