Interpolated Adjoint Method for Neural ODEs

03/11/2020
by Talgat Daulbaev et al.

In this paper, we propose a method that alleviates or completely avoids the notorious numerical instability and stiffness of the adjoint method for training neural ODEs. On the backward pass, we use the machinery of smooth function interpolation to restore the trajectory obtained during the forward integration. We show the viability of our approach both in theory and in practice.
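To make the idea concrete, here is a minimal sketch of this style of backward pass in plain NumPy/SciPy: the forward solve stores trajectory samples, a smooth interpolant restores z(t) during the backward pass, and only the adjoint ODE is integrated backward, without re-solving the state equation. A cubic spline stands in for the paper's interpolation scheme, and the dynamics f, the toy loss, and all names below are illustrative assumptions, not the authors' code.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import CubicSpline

rng = np.random.default_rng(0)
d = 4
W = rng.normal(scale=0.5, size=(d, d))  # parameters of the hypothetical dynamics
z0 = rng.normal(size=d)
T = 1.0

def f(t, z):
    # Toy neural-ODE dynamics: dz/dt = tanh(W z)
    return np.tanh(W @ z)

# Forward pass: integrate the ODE and keep trajectory samples.
t_grid = np.linspace(0.0, T, 50)
fwd = solve_ivp(f, (0.0, T), z0, t_eval=t_grid, rtol=1e-8, atol=1e-8)

# Smooth interpolation of the stored trajectory (cubic spline here,
# as an illustrative stand-in for the paper's interpolation scheme).
z_of_t = CubicSpline(fwd.t, fwd.y, axis=1)

zT = fwd.y[:, -1]
aT = zT  # adjoint at time T for the toy loss L = 0.5 * ||z(T)||^2

def backward(t, state):
    # State = [adjoint a(t); flattened accumulator for dL/dW].
    a = state[:d]
    z = z_of_t(t)                     # restore z(t) from the interpolant;
                                      # no backward solve of the state ODE
    s = 1.0 - np.tanh(W @ z) ** 2
    da = -W.T @ (s * a)               # adjoint dynamics: da/dt = -(df/dz)^T a
    dg = -np.outer(s * a, z).ravel()  # integrand of -a^T df/dW, accumulated
                                      # from T down to 0 to give dL/dW
    return np.concatenate([da, dg])

state_T = np.concatenate([aT, np.zeros(d * d)])
bwd = solve_ivp(backward, (T, 0.0), state_T, rtol=1e-8, atol=1e-8)

dL_dz0 = bwd.y[:d, -1]               # gradient w.r.t. the initial state
dL_dW = bwd.y[d:, -1].reshape(d, d)  # gradient w.r.t. the parameters
```

Comparing dL_dz0 and dL_dW against finite-difference estimates of the loss is a quick sanity check for the sign conventions above.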

Related research

04/23/2023 · Efficient Training of Deep Equilibrium Models
Deep equilibrium models (DEQs) have proven to be very powerful for learn...

06/22/2020 · Forward-Backward RRT: Branched Sampled FBSDEs for Stochastic Optimal Control
We propose a numerical method to solve forward-backward stochastic diffe...

02/02/2023 · Energy Efficient Training of SNN using Local Zeroth Order Method
Spiking neural networks are becoming increasingly popular for their low ...

01/22/2022 · The Forward-Backward Envelope for Sampling with the Overdamped Langevin Algorithm
In this paper, we analyse a proximal method based on the idea of forward...

08/18/2022 · Lifted Bregman Training of Neural Networks
We introduce a novel mathematical formulation for the training of feed-f...

06/22/2020 · Bidirectional Self-Normalizing Neural Networks
The problem of exploding and vanishing gradients has been a long-standin...

06/18/2020 · STEER: Simple Temporal Regularization For Neural ODEs
Training Neural Ordinary Differential Equations (ODEs) is often computat...
