Framing RNN as a kernel method: A neural ODE approach

06/02/2021
by Adeline Fermanian et al.

Building on the interpretation of a recurrent neural network (RNN) as a continuous-time neural differential equation, we show, under appropriate conditions, that the solution of an RNN can be viewed as a linear function of a specific feature set of the input sequence, known as the signature. This connection allows us to frame an RNN as a kernel method in a suitable reproducing kernel Hilbert space. As a consequence, we obtain theoretical guarantees on generalization and stability for a large class of recurrent networks. Our results are illustrated on simulated datasets.
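The central claim, that under suitable conditions an RNN's solution is a linear functional of the input path's signature, can be sketched numerically. The following Python snippet is a minimal illustration, not the authors' code: it computes a depth-2 truncated signature of a piecewise-linear path by hand (exact on each linear segment) and fits a ridge regression on those features. The random-walk paths, the scalar target, and the depth-2 truncation are all hypothetical choices made for illustration only.

```python
import numpy as np
from sklearn.linear_model import Ridge

def signature_depth2(path):
    """Depth-2 truncated signature of a piecewise-linear path.

    path: array of shape (n_points, d).
    Returns the d level-1 increments followed by the d*d level-2
    iterated integrals, computed exactly on each linear segment.
    """
    x0 = path[0]
    level1 = path[-1] - x0                  # S^i = x_T^i - x_0^i
    d = path.shape[1]
    level2 = np.zeros((d, d))
    for a, b in zip(path[:-1], path[1:]):
        inc = b - a
        # Exact contribution of one linear segment to
        # the iterated integral int (X^i - x_0^i) dX^j.
        level2 += np.outer(a - x0, inc) + 0.5 * np.outer(inc, inc)
    return np.concatenate([level1, level2.ravel()])

# Hypothetical toy data: random walks in R^3 and an arbitrary scalar target.
rng = np.random.default_rng(0)
paths = rng.standard_normal((200, 50, 3)).cumsum(axis=1)
features = np.stack([signature_depth2(p) for p in paths])
targets = paths[:, -1, 0]

# A linear (ridge) model on signature features: the kernel-method view of
# the paper says a trained RNN's output lives in this linear-in-signature
# function class.
model = Ridge(alpha=1.0).fit(features, targets)
print(model.score(features, targets))
```

In practice the signature is truncated at higher depth and computed with dedicated libraries; the hand-rolled depth-2 version above is only meant to make the "linear function of the signature" statement concrete.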


Related research

06/08/2020 · Theoretical Guarantees for Learning Conditional Expectation using Controlled ODE-RNN
Continuous stochastic processes are widely used to model time series tha...

06/22/2020 · Understanding Recurrent Neural Networks Using Nonequilibrium Response Theory
Recurrent neural networks (RNNs) are brain-inspired models widely used i...

08/23/2023 · Neural oscillators for magnetic hysteresis modeling
Hysteresis is a ubiquitous phenomenon in science and engineering; its mo...

11/26/2021 · On Recurrent Neural Networks for learning-based control: recent results and ideas for future developments
This paper aims to discuss and analyze the potentialities of Recurrent N...

11/18/2019 · Action Anticipation with RBF Kernelized Feature Mapping RNN
We introduce a novel Recurrent Neural Network-based algorithm for future...
03/29/2023 · Learning Flow Functions from Data with Applications to Nonlinear Oscillators
We describe a recurrent neural network (RNN) based architecture to learn...
