ACE-NODE: Attentive Co-Evolving Neural Ordinary Differential Equations

05/31/2021
by Sheo Yon Jhin, et al.

Neural ordinary differential equations (NODEs) present a new paradigm for constructing (continuous-time) neural networks. While they show several good characteristics in terms of the number of parameters and flexibility in constructing neural networks, they also have a couple of well-known limitations: i) theoretically, NODEs learn only homeomorphic mapping functions, and ii) NODEs sometimes show numerical instability when solving their integral problems. Many enhancements have been proposed to address these limitations. To our knowledge, however, integrating attention into NODEs has been overlooked for a while. To this end, we present a novel method of attentive dual co-evolving NODEs (ACE-NODE): one main NODE for a downstream machine learning task and another NODE that provides attention to the main NODE. Our ACE-NODE supports both pairwise and elementwise attention. In our experiments, our method outperforms existing NODE-based and non-NODE-based baselines in almost all cases by non-trivial margins.
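The core idea of the dual co-evolving pair can be sketched with a toy explicit-Euler integration: the main state h and the attention state a are advanced jointly, and an elementwise (sigmoid) gate derived from a modulates the main dynamics. This is only a minimal illustrative sketch, not the paper's implementation: the dynamics functions here are hypothetical closed-form stand-ins for the learned neural networks, and a fixed-step Euler solver replaces the adaptive ODE solvers typically used with NODEs.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical toy dynamics; in ACE-NODE these would be trained neural networks.
def f_main(h, a, t):
    # Main NODE: the attention state gates the dynamics elementwise.
    return sigmoid(a) * np.tanh(h)

def f_attn(h, a, t):
    # Attention NODE: its state co-evolves with the main hidden state.
    return np.tanh(h) - 0.1 * a

def ace_node_euler(h0, a0, t0=0.0, t1=1.0, steps=100):
    """Jointly integrate the co-evolving pair (h, a) with explicit Euler."""
    h, a = h0.copy(), a0.copy()
    dt = (t1 - t0) / steps
    t = t0
    for _ in range(steps):
        dh = f_main(h, a, t)
        da = f_attn(h, a, t)
        h, a = h + dt * dh, a + dt * da
        t += dt
    return h, a

h0 = 0.1 * np.ones(4)
a0 = np.ones(4)
h1, a1 = ace_node_euler(h0, a0)
print(h1.shape, a1.shape)
```

In practice both states would be concatenated into one augmented state and handed to a differentiable ODE solver (e.g. the adjoint method) so that the two dynamics networks are trained end-to-end.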


Related research:

- 09/04/2021, Attentive Neural Controlled Differential Equations for Time-series Classification and Forecasting: "Neural networks inspired by differential equations have proliferated for..."
- 11/25/2021, Characteristic Neural Ordinary Differential Equations: "We propose Characteristic Neural Ordinary Differential Equations (C-NODE..."
- 06/15/2022, On Numerical Integration in Neural Ordinary Differential Equations: "The combination of ordinary differential equations and neural networks, ..."
- 10/12/2019, On Robustness of Neural Ordinary Differential Equations: "Neural ordinary differential equations (ODEs) have been attracting incre..."
- 05/31/2021, OCT-GAN: Neural ODE-based Conditional Tabular GANs: "Synthesizing tabular data is attracting much attention these days for va..."
- 04/18/2023, LTC-SE: Expanding the Potential of Liquid Time-Constant Neural Networks for Scalable AI and Embedded Systems: "We present LTC-SE, an improved version of the Liquid Time-Constant (LTC)..."
- 05/28/2019, Learning Dynamics of Attention: Human Prior for Interpretable Machine Reasoning: "Without relevant human priors, neural networks may learn uninterpretable..."
