Lyapunov-Guided Embedding for Hyperparameter Selection in Recurrent Neural Networks

04/11/2022
by Ryan Vogt, et al.

Recurrent Neural Networks (RNNs) are ubiquitous computing systems for sequences and multivariate time series data. While several robust RNN architectures are known, it is unclear how to relate RNN initialization, architecture, and other hyperparameters to accuracy on a given task. In this work, we propose to treat RNNs as dynamical systems and to correlate hyperparameters with accuracy through Lyapunov spectral analysis, a methodology specifically designed for nonlinear dynamical systems. Because RNN features go beyond what existing Lyapunov spectral analysis captures, we propose to infer relevant features from the Lyapunov spectrum with an Autoencoder and an embedding of its latent representation (AeLLE). Our studies of various RNN architectures show that AeLLE successfully correlates the RNN Lyapunov spectrum with accuracy. Furthermore, the latent representation learned by AeLLE generalizes to novel inputs from the same task and is formed early in the process of RNN training. The latter property allows prediction of the accuracy to which an RNN would converge when training is complete. We conclude that representing RNNs through their Lyapunov spectrum, together with AeLLE, assists with hyperparameter selection and provides a novel method for organizing and interpreting variants of RNN architectures.
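A minimal sketch of the pipeline the abstract describes, assuming a vanilla PyTorch RNNCell: the Lyapunov spectrum is estimated by QR re-orthogonalization of Jacobian products along a driven trajectory, and a small autoencoder (standing in for AeLLE) embeds the resulting spectra. The network sizes, the weight-initialization gains used as the hyperparameter sweep, and the autoencoder architecture are illustrative assumptions, not the authors' settings.

# Hedged sketch (not the paper's code): Lyapunov spectrum of an RNN cell plus
# a toy autoencoder embedding of the spectra, loosely mirroring AeLLE.
import torch
import torch.nn as nn

def lyapunov_spectrum(rnn_cell, inputs, h0, k=None):
    """Estimate the k leading Lyapunov exponents of `rnn_cell` driven by `inputs` (T, input_dim)."""
    h = h0.clone()
    n = h.numel()
    k = k or n
    Q = torch.eye(n)[:, :k]          # orthonormal basis of perturbation directions
    log_r = torch.zeros(k)
    for x in inputs:
        # Jacobian of the hidden-state map h_{t+1} = f(x_t, h_t) with respect to h_t
        J = torch.autograd.functional.jacobian(
            lambda hh: rnn_cell(x.unsqueeze(0), hh.unsqueeze(0)).squeeze(0), h)
        Q, R = torch.linalg.qr(J @ Q)                 # re-orthogonalize propagated directions
        log_r += torch.log(torch.abs(torch.diagonal(R)) + 1e-12)
        h = rnn_cell(x.unsqueeze(0), h.unsqueeze(0)).squeeze(0).detach()
    return (log_r / len(inputs)).sort(descending=True).values

class SpectrumAutoencoder(nn.Module):
    """Toy autoencoder over Lyapunov spectra; its latent code plays the role of the AeLLE embedding."""
    def __init__(self, dim, latent=2):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, 32), nn.ReLU(), nn.Linear(32, latent))
        self.dec = nn.Sequential(nn.Linear(latent, 32), nn.ReLU(), nn.Linear(32, dim))
    def forward(self, s):
        z = self.enc(s)
        return self.dec(z), z

if __name__ == "__main__":
    torch.manual_seed(0)
    hidden, input_dim, T = 32, 3, 200
    # Spectra for a few hyperparameter settings (here: recurrent-weight initialization gains).
    spectra = []
    for gain in (0.5, 1.0, 1.5, 2.0):
        cell = nn.RNNCell(input_dim, hidden)
        nn.init.orthogonal_(cell.weight_hh, gain=gain)
        xs = torch.randn(T, input_dim)
        spectra.append(lyapunov_spectrum(cell, xs, torch.zeros(hidden)))
    S = torch.stack(spectra)

    ae = SpectrumAutoencoder(hidden)
    opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
    for _ in range(500):
        recon, _ = ae(S)
        loss = nn.functional.mse_loss(recon, S)
        opt.zero_grad()
        loss.backward()
        opt.step()
    _, latent = ae(S)
    print(latent)  # 2-D embedding of the spectra; in AeLLE this latent code is correlated with task accuracy

In the paper's setting the latent codes would be computed for many trained (or partially trained) RNNs and then related to their final task accuracy; the sketch above only illustrates the spectrum-to-embedding step.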

