Recurrent Graph Tensor Networks

09/18/2020
by   Yao Lei Xu, et al.

Recurrent Neural Networks (RNNs) are among the most successful machine learning models for sequence modelling. In this paper, we show that the modelling of hidden states in RNNs can be approximated through a multi-linear graph filter, which describes the directional flow of temporal information. The derived multi-linear graph filter is then generalized in tensor network form to improve its modelling power, resulting in a novel Recurrent Graph Tensor Network (RGTN). To validate the expressive power of the derived network, several RGTN variants were proposed and applied to the task of time-series forecasting, demonstrating superior properties in terms of convergence, performance, and complexity. Specifically, by leveraging the multi-modal nature of tensor networks, the RGTN models were able to out-perform a simple RNN with only 45 parameters. Therefore, by combining the expressive power of tensor networks with a suitable graph filter, we show that the proposed RGTN can out-perform a classical RNN at a drastically lower parameter complexity, especially in the multi-modal setting.
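The core idea above, that the directional flow of temporal information in an RNN can be expressed as a multi-linear graph filter over the time axis, can be illustrated with a minimal sketch. This is not the authors' code: it assumes a linearised RNN (no nonlinearity) and uses the adjacency matrix `A` of a directed "line graph" connecting each time step `t-1 -> t` as the temporal graph shift.

```python
import numpy as np

T, d_in, d_h = 6, 3, 4                  # time steps, input dim, hidden dim
rng = np.random.default_rng(0)

X = rng.standard_normal((T, d_in))      # input sequence
U = rng.standard_normal((d_in, d_h))    # input weights
W = rng.standard_normal((d_h, d_h))     # recurrent weights

# Directed temporal adjacency: A[t, t-1] = 1 propagates information
# forward in time, so (A @ H)[t] picks out the previous hidden state.
A = np.eye(T, k=-1)

# Linearised RNN recurrence: h_t = h_{t-1} @ W + x_t @ U.
H_rnn = np.zeros((T, d_h))
for t in range(T):
    prev = H_rnn[t - 1] if t > 0 else np.zeros(d_h)
    H_rnn[t] = prev @ W + X[t] @ U

# The same computation as a multi-linear graph filter H = A @ H @ W + X @ U,
# unrolled: since A is nilpotent (A^T = 0), T-1 iterations reach the fixed
# point H = sum_k A^k (X @ U) W^k, i.e. h_t = sum_k x_{t-k} @ U @ W^k.
H_gf = X @ U
for _ in range(T - 1):
    H_gf = A @ H_gf @ W + X @ U

print(np.allclose(H_rnn, H_gf))  # True: the filter reproduces the recurrence
```

The lower-shift matrix `A` encodes the direction of time; replacing this single-mode filter with a tensor-network factorisation over multiple data modes is, per the abstract, what yields the RGTN's parameter savings.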

