Compact Autoregressive Network

09/06/2019
by Di Wang, et al.

Autoregressive networks can achieve promising performance in many sequence modeling tasks with short-range dependence. However, when handling high-dimensional inputs and outputs, the huge number of parameters in the network leads to high computational cost and low learning efficiency. The problem can be alleviated somewhat by introducing one additional narrow hidden layer, but the sample size required to achieve a given training error remains large. To address this challenge, we rearrange the weight matrices of a linear autoregressive network into a tensor form, and then use Tucker decomposition to represent its low-rank structure. This leads to a novel compact autoregressive network, called the Tucker AutoRegressive (TAR) net. Interestingly, the TAR net can be applied to sequences with long-range dependence, since the dimension along the sequential order is reduced. Theoretical studies show that the TAR net improves learning efficiency and requires far fewer samples for model training. Experiments on synthetic and real-world datasets demonstrate the promising performance of the proposed compact network.
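The core idea, stacking the lag coefficient matrices of a linear autoregression into a third-order tensor and compressing it with a Tucker decomposition, can be sketched in a few lines of numpy. This is a minimal illustration of the parameter-count saving, not the paper's estimation procedure; the dimensions, ranks, and variable names (`N`, `P`, `r1`, `r2`, `r3`) are hypothetical choices for the example.

```python
import numpy as np

# Hypothetical dimensions: an N-dimensional series with an order-P
# linear autoregression y_t = sum_{k=1..P} A_k y_{t-k}.
N, P = 20, 10
# Assumed Tucker ranks along the output, input, and lag modes.
r1, r2, r3 = 3, 3, 2

rng = np.random.default_rng(0)

# Tucker form: a small core tensor G and one factor matrix per mode.
G = rng.standard_normal((r1, r2, r3))
U1 = rng.standard_normal((N, r1))   # output (response) mode
U2 = rng.standard_normal((N, r2))   # input (predictor) mode
U3 = rng.standard_normal((P, r3))   # lag mode

# Reconstruct the full coefficient tensor W (N x N x P):
# W = G x_1 U1 x_2 U2 x_3 U3.
W = np.einsum('abc,ia,jb,kc->ijk', G, U1, U2, U3)

# Parameter counts: unconstrained AR vs. Tucker-compressed AR.
full_params = N * N * P
tucker_params = r1 * r2 * r3 + N * r1 + N * r2 + P * r3
print(full_params, tucker_params)  # 4000 vs. 158

# One-step prediction from the P most recent observations
# (row k of `lags` plays the role of y_{t-k}).
lags = rng.standard_normal((P, N))
y_next = np.einsum('ijk,kj->i', W, lags)
print(y_next.shape)  # (N,)
```

Note that the lag mode is compressed from `P` to `r3`, which is why a low Tucker rank along the sequential order lets the model afford a large `P`, i.e., long-range dependence, without the parameter count growing linearly in `P`.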

