Complex Gated Recurrent Neural Networks

06/21/2018
by Moritz Wolter, et al.

Complex numbers have long been favoured for digital signal processing, yet complex representations rarely appear in deep learning architectures. RNNs, widely used to process time series and sequence information, could greatly benefit from complex representations. We present a novel complex gated recurrent cell. When used together with norm-preserving state transition matrices, our complex gated RNN exhibits excellent stability and convergence properties. We demonstrate competitive performance of our complex gated RNN on the synthetic memory and adding tasks, as well as on the real-world task of human motion prediction.
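As a rough illustration of the ideas in the abstract (not the authors' exact formulation), the sketch below implements one step of a complex-valued gated recurrent cell in NumPy. The recurrent matrix is made unitary, and therefore norm-preserving, via a QR decomposition; the candidate state passes through the modReLU nonlinearity, which rescales the magnitude of a complex activation while preserving its phase; and the gates map complex pre-activations to real values in [0, 1] by summing real and imaginary parts before a sigmoid. The class and parameter names, and the specific complex-to-real gate mapping, are illustrative assumptions.

```python
import numpy as np

def random_unitary(n, rng):
    # QR decomposition of a random complex matrix yields a unitary Q,
    # i.e. a norm-preserving state transition matrix.
    a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    q, _ = np.linalg.qr(a)
    return q

def mod_relu(z, b):
    # modReLU: thresholds the magnitude of z, preserves its phase.
    mag = np.abs(z)
    return np.maximum(mag + b, 0.0) * z / (mag + 1e-8)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class ComplexGatedCell:
    """One step of a complex gated recurrent cell (illustrative sketch)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden_size)
        # Unitary (norm-preserving) recurrent matrix for the candidate state.
        self.W = random_unitary(hidden_size, rng)
        self.V = scale * (rng.standard_normal((hidden_size, input_size))
                          + 1j * rng.standard_normal((hidden_size, input_size)))
        self.b = np.zeros(hidden_size)  # real-valued modReLU bias
        # Gate parameters; reset and update gates stacked into one matrix.
        self.Wg = scale * (rng.standard_normal((2 * hidden_size, hidden_size))
                           + 1j * rng.standard_normal((2 * hidden_size, hidden_size)))
        self.Vg = scale * (rng.standard_normal((2 * hidden_size, input_size))
                           + 1j * rng.standard_normal((2 * hidden_size, input_size)))
        self.bg = np.zeros(2 * hidden_size)
        self.hidden_size = hidden_size

    def step(self, x, h):
        n = self.hidden_size
        # Real-valued gates from complex pre-activations; summing real and
        # imaginary parts is one of several plausible complex-to-real maps.
        gz = self.Wg @ h + self.Vg @ x
        g = sigmoid(gz.real + gz.imag + self.bg)
        r, u = g[:n], g[n:]  # reset and update gates
        # Candidate state through the unitary transition and modReLU.
        cand = mod_relu(self.W @ (r * h) + self.V @ x, self.b)
        return u * h + (1.0 - u) * cand

# Usage: run the cell over a short random input sequence.
cell = ComplexGatedCell(input_size=3, hidden_size=8)
h = np.zeros(8, dtype=np.complex128)
for x in np.random.default_rng(1).standard_normal((5, 3)):
    h = cell.step(x.astype(np.complex128), h)
print(np.abs(h))
```

The unitary transition is what gives the stability the abstract refers to: the linear part of the recurrence neither grows nor shrinks the norm of the hidden state, which mitigates exploding and vanishing gradients over long sequences. Note that keeping the recurrent matrix unitary during training requires a constrained parameterization or retraction step, which this sketch omits.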


Related research

03/31/2016 · Minimal Gated Unit for Recurrent Neural Networks
Recently, recurrent neural networks (RNNs) have been very successful in han...

12/30/2018 · Comparison between DeepESNs and gated RNNs on multivariate time-series prediction
We propose an experimental comparison between Deep Echo State Networks (...

01/08/2019 · FastGRNN: A Fast, Accurate, Stable and Tiny Kilobyte Sized Gated Recurrent Neural Network
This paper develops the FastRNN and FastGRNN algorithms to address the t...

12/13/2018 · Fourier RNNs for Sequence Analysis and Prediction
Fourier methods have a long and proven track record as an excellent t...

07/21/2022 · The Neural Race Reduction: Dynamics of Abstraction in Gated Networks
Our theoretical understanding of deep learning has not kept pace with it...

10/25/2018 · Learning with Interpretable Structure from RNN
In structure learning, the output is generally a structure that is used ...

02/03/2020 · Gated Graph Recurrent Neural Networks
Graph processes exhibit a temporal structure determined by the sequence ...
