Entanglement-Embedded Recurrent Network Architecture: Tensorized Latent State Propagation and Chaos Forecasting

06/10/2020
by Xiangyi Meng, et al.

Chaotic time series forecasting remains far less well understood than its theoretical and practical importance warrants. Traditional statistical/ML methods are inefficient at capturing chaos in nonlinear dynamical systems, especially when the time difference Δt between consecutive steps is so large that training would most likely settle into a trivial, ergodic local minimum. Here, we introduce a new long short-term memory (LSTM)-based recurrent architecture that tensorizes the cell-state-to-state propagation, preserving the long-term memory feature of LSTM while simultaneously enhancing the learning of short-term nonlinear complexity. We stress that the global minima of chaos are most efficiently reached through tensorization, in which all nonlinear terms up to some polynomial order are treated explicitly and weighted equally. The efficiency and generality of our architecture are systematically tested and confirmed by theoretical analysis and experimental results. In our design, we explicitly use two different many-body entanglement structures, matrix product states (MPS) and the multiscale entanglement renormalization ansatz (MERA), as physics-inspired tensor decomposition techniques. We find that MERA generally performs better than MPS, and hence conjecture that the learnability of chaos is determined not only by the number of free parameters but also by the tensor complexity, understood as how the entanglement entropy scales under varying matricizations of the tensor.
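The tensorization idea described above, contracting a factorized weight tensor against the order-p outer product of a bias-augmented state vector so that all polynomial terms up to order p appear explicitly, can be sketched with an MPS contraction in NumPy. This is a minimal illustration under assumed shapes, not the authors' implementation; `mps_polynomial_map`, the bond dimension `r`, and all dimensions are hypothetical choices.

```python
import numpy as np

def mps_polynomial_map(v, cores):
    """Contract the order-p outer product v ⊗ ... ⊗ v with an MPS-factorized
    weight tensor, without ever materializing the full d**p tensor.

    cores[k] has shape (r_k, d, r_{k+1}), with r_0 = 1 and r_p = output dim m.
    """
    state = np.ones((1,))  # trivial left bond of the MPS
    for core in cores:
        # Absorb one copy of v through the physical index, then move
        # the running contraction one bond to the right.
        state = np.einsum('a,adb,d->b', state, core, v)
    return state  # shape (m,)

rng = np.random.default_rng(0)
d, p, r, m = 4, 3, 2, 5  # input dim, polynomial order, bond dim, output dim
cores = [rng.normal(size=(1 if k == 0 else r, d, m if k == p - 1 else r)) * 0.1
         for k in range(p)]

h = rng.normal(size=d - 1)        # hypothetical hidden state
v = np.concatenate(([1.0], h))    # prepend 1 so every order <= p is represented
y = mps_polynomial_map(v, cores)
print(y.shape)  # (5,)
```

Because the contraction sweeps the cores one at a time, the cost scales with p·d·r² rather than d**p, which is what makes treating every polynomial term explicitly feasible; a MERA-structured contraction would replace this linear sweep with a layered, tree-like one.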

Related research

- 09/10/2018 · Memristive LSTM network hardware architecture for time-series predictive modeling problem
  Analysis of time-series data allows to identify long-term trends and mak...
- 10/24/2018 · Precipitation Nowcasting: Leveraging bidirectional LSTM and 1D CNN
  Short-term rainfall forecasting, also known as precipitation nowcasting ...
- 05/10/2019 · Large-Scale Spectrum Occupancy Learning via Tensor Decomposition and LSTM Networks
  A new paradigm for large-scale spectrum occupancy learning based on long...
- 06/09/2020 · Tensor train decompositions on recurrent networks
  Recurrent neural networks (RNN) such as long-short-term memory (LSTM) ne...
- 10/31/2019 · A Dynamically Controlled Recurrent Neural Network for Modeling Dynamical Systems
  This work proposes a novel neural network architecture, called the Dynam...
- 06/13/2019 · Comparison of Methods for the Assessment of Nonlinearity in Short-Term Heart Rate Variability under different Physiopathological States
  Despite the widespread diffusion of nonlinear methods for heart rate var...
- 11/13/2018 · Multiscale Information Storage of Linear Long-Range Correlated Stochastic Processes
  Information storage, reflecting the capability of a dynamical system to ...
