Reconstructing common latent input from time series with the mapper-coach network and error backpropagation

05/05/2021
by   Zsigmond Benkő, et al.

A two-module, feedforward neural network architecture called the mapper-coach network has been introduced to reconstruct an unobserved, continuous latent variable that drives two observed dynamical systems. The method has been demonstrated on time series generated by two chaotic logistic maps driven by a hidden third one. The network has been trained by error backpropagation to predict one of the observed time series from its own past and from the other observed time series. It was shown that, once this prediction has been learned successfully, the activity of the bottleneck neuron connecting the mapper and coach modules correlates strongly with the latent common input variable. The method has the potential to reveal hidden components of dynamical systems where experimental intervention is not possible.
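The sketch below is a minimal illustration of this idea, not the authors' implementation. It assumes a particular additive coupling of the hidden logistic map to the two observed maps, an arbitrary embedding length d = 5, small hypothetical hidden-layer widths, and PyTorch as the training framework; the mapper compresses the past of one observed series into a single bottleneck unit, and the coach combines that bottleneck with the past of the other series to predict its next value.

```python
# Hedged sketch of a mapper-coach-style setup; coupling form, embedding length,
# and network sizes are illustrative assumptions, not taken from the paper.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)

# --- hidden logistic map z driving two observed logistic maps x and y ---
T, r, c = 20000, 3.99, 0.15
z = np.empty(T); x = np.empty(T); y = np.empty(T)
z[0], x[0], y[0] = rng.random(3) * 0.5 + 0.25
for t in range(T - 1):
    z[t + 1] = r * z[t] * (1 - z[t])
    x[t + 1] = (1 - c) * r * x[t] * (1 - x[t]) + c * z[t]
    y[t + 1] = (1 - c) * r * y[t] * (1 - y[t]) + c * z[t]

# --- delay-embedded inputs: past of x (mapper input), past of y, next y (target) ---
d = 5  # embedding length (illustrative choice)
idx = np.arange(d, T - 1)
X_past = np.stack([x[idx - k] for k in range(d)], axis=1)
Y_past = np.stack([y[idx - k] for k in range(d)], axis=1)
target = y[idx + 1]
z_true = z[idx]

Xp = torch.tensor(X_past, dtype=torch.float32)
Yp = torch.tensor(Y_past, dtype=torch.float32)
tgt = torch.tensor(target, dtype=torch.float32).unsqueeze(1)

class MapperCoach(nn.Module):
    def __init__(self, d, hidden=32):
        super().__init__()
        # mapper: compresses the past of x into a single bottleneck activation
        self.mapper = nn.Sequential(nn.Linear(d, hidden), nn.Tanh(),
                                    nn.Linear(hidden, 1))
        # coach: predicts the next value of y from y's past plus the bottleneck
        self.coach = nn.Sequential(nn.Linear(d + 1, hidden), nn.Tanh(),
                                   nn.Linear(hidden, 1))

    def forward(self, x_past, y_past):
        b = self.mapper(x_past)  # bottleneck activity
        return self.coach(torch.cat([y_past, b], dim=1)), b

net = MapperCoach(d)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for epoch in range(200):
    opt.zero_grad()
    pred, b = net(Xp, Yp)
    loss = nn.functional.mse_loss(pred, tgt)
    loss.backward()
    opt.step()

# check how strongly the bottleneck activity correlates with the hidden driver
with torch.no_grad():
    _, b = net(Xp, Yp)
corr = np.corrcoef(b.squeeze().numpy(), z_true)[0, 1]
print(f"final loss {loss.item():.4f}, |corr(bottleneck, z)| = {abs(corr):.2f}")
```

If the prediction task is learned, the printed correlation between the bottleneck activation and the hidden driver z should be substantially above chance, which is the qualitative effect the abstract describes; the exact value depends on the assumed coupling strength and embedding length.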


