CRUR: Coupled-Recurrent Unit for Unification, Conceptualization and Context Capture for Language Representation – A Generalization of Bi Directional LSTM

11/22/2019
by Chiranjib Sur, et al.

In this work we analyze a novel network that learns through sequential binding, built on the coupling of recurrent units with Bayesian prior definitions. The coupling structure encodes efficient tensor representations that can be decoded into fluent sentences describing events; these descriptions are derived from structural representations of the visual features of images and media. We study several types of coupled recurrent structures in detail and offer insights into their performance. Supervised learning performance in natural language processing is typically judged by statistical evaluation, but such scores are a matter of perspective; here, qualitative evaluation reveals the real strengths and variations of the different architectures. Bayesian prior definitions over different embeddings help characterize sentences in terms of natural language structure, such as parts of speech and other semantic-level categories, in a machine-interpretable form that inherits the binding and unbinding properties of Tensor Product Representations based on mutual orthogonality. Our approach surpasses several existing baseline works on image captioning.
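The abstract refers to Tensor Product Representation binding and unbinding under mutual orthogonality. As a minimal sketch of that mechanism alone (the dimensions, the QR-based role construction, and the random fillers below are illustrative assumptions, not the authors' CRUR architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: d_f for filler (word) vectors, d_r for role vectors.
d_f, d_r, n_roles = 8, 4, 4

# Mutually orthogonal role vectors: orthonormal columns obtained via the
# QR decomposition of a random matrix.
roles, _ = np.linalg.qr(rng.standard_normal((d_r, n_roles)))

# Random filler vectors standing in for learned word embeddings.
fillers = rng.standard_normal((n_roles, d_f))

# Binding: the sum of outer products filler_i (x) role_i superimposes the
# whole sequence into a single tensor T.
T = sum(np.outer(fillers[i], roles[:, i]) for i in range(n_roles))

# Unbinding: because the roles are orthonormal, T @ role_i recovers
# filler_i exactly (up to floating-point error).
for i in range(n_roles):
    recovered = T @ roles[:, i]
    assert np.allclose(recovered, fillers[i])
print("all fillers recovered from the bound tensor")
```

Note that exact recovery depends on the roles being orthonormal; with merely random role vectors, unbinding yields only an approximation of each filler, which is why the abstract emphasizes mutual orthogonality.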


Related research

12/17/2018 · Feature Fusion Effects of Tensor Product Representation on (De)Compositional Network for Caption Generation for Images
Progress in image captioning is gradually getting complex as researchers...

10/29/2018 · Learning Distributed Representations of Symbolic Structure Using Binding and Unbinding Operations
Widely used recurrent units, including Long-short Term Memory (LSTM) and...

09/26/2017 · Tensor Product Generation Networks
We present a new tensor product generation network (TPGN) that generates...

03/15/2023 · ROSE: A Neurocomputational Architecture for Syntax
A comprehensive model of natural language processing in the brain must a...

10/29/2018 · A Simple Recurrent Unit with Reduced Tensor Product Representations
Widely used recurrent units, including Long-short Term Memory (LSTM) and...

01/27/2020 · aiTPR: Attribute Interaction-Tensor Product Representation for Image Caption
Region visual features enhance the generative capability of the machines...

08/27/2016 · Learning to generalize to new compositions in image understanding
Recurrent neural networks have recently been used for learning to descri...
