
CRUR: Coupled-Recurrent Unit for Unification, Conceptualization and Context Capture for Language Representation – A Generalization of Bi Directional LSTM

by Chiranjib Sur, et al.

In this work we analyze a novel sequential-binding learning network based on the coupling of recurrent units with a Bayesian prior definition. The coupling structure encodes structural representations of the visual features of images and media into efficient tensor representations, which can be decoded to generate sentences describing certain events. We provide an elaborated study of the different types of coupled recurrent structures, along with insights into their performance. Supervised learning performance for natural language processing is typically judged by statistical evaluations; however, such measures are a matter of perspective, and in this case qualitative evaluations reveal the real capabilities of the different architectural variations. The Bayesian prior definition of the different embeddings helps characterize sentences according to natural language structure, such as parts of speech and other semantic-level categorizations, in a machine-interpretable form that inherits the properties of Tensor Product Representation binding and unbinding based on mutual orthogonality. Our approach surpasses some of the existing baseline works on image captioning.
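The binding and unbinding mentioned above can be illustrated with a minimal Tensor Product Representation sketch. This is not the paper's implementation; it is a generic example, with illustrative names, of how mutually orthogonal role vectors make exact unbinding possible: each filler (e.g. a word embedding) is bound to a role (e.g. a part-of-speech slot) via an outer product, the products are summed into one tensor, and a filler is recovered by multiplying the tensor with its role vector.

```python
import numpy as np

rng = np.random.default_rng(0)
d_filler, n_roles = 4, 3  # illustrative sizes, not from the paper

# Orthonormal role vectors (rows of an orthogonal matrix), so that
# roles[i] . roles[j] = 1 if i == j else 0, making unbinding exact.
roles, _ = np.linalg.qr(rng.standard_normal((n_roles, n_roles)))
fillers = rng.standard_normal((n_roles, d_filler))

# Bind: sum of outer products filler_i (x) role_i -> one tensor.
T = sum(np.outer(fillers[i], roles[i]) for i in range(n_roles))

# Unbind role i: T @ roles[i] = sum_j fillers[j] * (roles[j] . roles[i])
# collapses to fillers[i], up to floating-point error.
recovered = T @ roles[1]
assert np.allclose(recovered, fillers[1])
```

With non-orthogonal roles the unbinding would instead return a mixture of fillers weighted by the roles' inner products, which is why mutual orthogonality matters for clean decoding.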
