Towards Understanding Normalization in Neural ODEs

04/20/2020
by   Julia Gusak, et al.

Normalization is an important and extensively studied technique in deep learning. However, its role in Ordinary Differential Equation based networks (neural ODEs) is still poorly understood. This paper investigates how different normalization techniques affect the performance of neural ODEs. In particular, we show that it is possible to achieve 93% accuracy in the CIFAR-10 classification task, and to the best of our knowledge, this is the highest reported accuracy among neural ODEs tested on this problem.
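The abstract's core idea, that a normalization layer sits inside the ODE dynamics function and can change how the network trains, can be illustrated with a toy sketch. This is not the paper's code: the weights, the fixed-step Euler integrator, and the layer-normalization-style operation are all illustrative assumptions.

```python
import numpy as np

# Toy neural ODE block (illustrative only, not the paper's implementation):
# dh/dt = f(h, t) = W2 @ tanh(norm(W1 @ h)), integrated with fixed-step Euler.
# The paper studies how the choice of "norm" affects training of such models.

rng = np.random.default_rng(0)
D = 4
W1 = rng.standard_normal((D, D)) * 0.1
W2 = rng.standard_normal((D, D)) * 0.1

def layer_norm(x, eps=1e-5):
    # Normalize the feature vector to zero mean / unit variance.
    return (x - x.mean()) / np.sqrt(x.var() + eps)

def dynamics(h, t):
    # ODE right-hand side with normalization applied inside the vector field.
    return W2 @ np.tanh(layer_norm(W1 @ h))

def odeint_euler(f, h0, t0=0.0, t1=1.0, steps=10):
    # Simple explicit-Euler solver; real neural ODEs use adaptive solvers.
    h, t = h0.copy(), t0
    dt = (t1 - t0) / steps
    for _ in range(steps):
        h = h + dt * f(h, t)
        t += dt
    return h

h0 = rng.standard_normal(D)
h1 = odeint_euler(dynamics, h0)
print(h1.shape)
```

Swapping `layer_norm` for batch normalization, weight normalization, or no normalization at all changes the vector field the solver integrates, which is the kind of comparison the paper carries out.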


research
12/29/2022

New Designed Loss Functions to Solve Ordinary Differential Equations with Artificial Neural Network

This paper investigates the use of artificial neural networks (ANNs) to ...
research
07/13/2023

Investigating Normalization in Preference-based Evolutionary Multi-objective Optimization Using a Reference Point

Normalization of objectives plays a crucial role in evolutionary multi-o...
research
08/01/2020

Neural ODE with Temporal Convolution and Time Delay Neural Networks for Small-Footprint Keyword Spotting

In this paper, we propose neural network models based on the neural ordi...
research
09/21/2022

Neural Generalized Ordinary Differential Equations with Layer-varying Parameters

Deep residual networks (ResNets) have shown state-of-the-art performance...
research
09/27/2020

Normalization Techniques in Training DNNs: Methodology, Analysis and Application

Normalization techniques are essential for accelerating the training and...
research
03/19/2020

Exemplar Normalization for Learning Deep Representation

Normalization techniques are important in different advanced neural netw...
research
02/29/2020

Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs

Batch normalization (BatchNorm) has become an indispensable tool for tra...
