Transfer Learning using Neural Ordinary Differential Equations

01/21/2020
by Rajath S, et al.

We introduce the concept of using Neural Ordinary Differential Equations (NODEs) for transfer learning. In this paper we use EfficientNets to explore transfer learning on the CIFAR-10 dataset, employing a NODE block to fine-tune the model. Fine-tuning with NODEs provides greater stability during training and validation, and these continuous-depth blocks also allow a trade-off between numerical precision and speed. Using Neural ODEs for transfer learning results in much more stable convergence of the loss function.
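To illustrate the idea, below is a minimal, hypothetical sketch (not the authors' released code) of fine-tuning a pretrained EfficientNet on CIFAR-10 with a Neural ODE head, using the torchdiffeq library. The specific head architecture, feature dimension, tolerance values, and the torchvision weights API are assumptions for illustration; the solver tolerances (rtol/atol) are where the precision-versus-speed trade-off mentioned above is exposed.

# Hypothetical sketch: EfficientNet backbone + Neural ODE block for fine-tuning on CIFAR-10.
import torch
import torch.nn as nn
from torchdiffeq import odeint  # pip install torchdiffeq
from torchvision import models


class ODEFunc(nn.Module):
    """Dynamics f(t, h) defining the continuous-depth block."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim),
            nn.Tanh(),
            nn.Linear(dim, dim),
        )

    def forward(self, t, h):
        return self.net(h)


class NODEHead(nn.Module):
    """Integrates the ODE from t=0 to t=1; rtol/atol trade numerical precision for speed."""
    def __init__(self, dim, rtol=1e-3, atol=1e-3):
        super().__init__()
        self.func = ODEFunc(dim)
        self.t = torch.tensor([0.0, 1.0])
        self.rtol, self.atol = rtol, atol

    def forward(self, h):
        out = odeint(self.func, h, self.t.to(h.device),
                     rtol=self.rtol, atol=self.atol)
        return out[-1]  # state at t=1


class TransferNODE(nn.Module):
    """Frozen EfficientNet-B0 feature extractor followed by a trainable NODE head and classifier."""
    def __init__(self, num_classes=10):
        super().__init__()
        backbone = models.efficientnet_b0(weights="DEFAULT")  # ImageNet-pretrained
        self.features = backbone.features
        self.pool = nn.AdaptiveAvgPool2d(1)
        feat_dim = 1280  # EfficientNet-B0 feature width
        for p in self.features.parameters():
            p.requires_grad = False  # freeze the backbone; fine-tune only the NODE head
        self.node = NODEHead(feat_dim)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        h = self.pool(self.features(x)).flatten(1)
        h = self.node(h)
        return self.classifier(h)

Lowering rtol/atol makes the adaptive solver take more steps (slower, more precise); raising them speeds up training at the cost of numerical accuracy, which is the trade-off the abstract refers to.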


Related research

11/15/2018  On transfer learning using a MAC model variant
We introduce a variant of the MAC model (Hudson and Manning, CVPR 2018) ...

04/15/2020  Transfer-Learning-Aware Neuro-Evolution for Diseases Detection in Chest X-Ray Images
The neural network needs excessive costs of time because of the complexi...

12/09/2022  Transfer Learning Enhanced DeepONet for Long-Time Prediction of Evolution Equations
Deep operator network (DeepONet) has demonstrated great success in vario...

01/07/2022  Forecasting emissions through Kaya identity using Neural Ordinary Differential Equations
Starting from the Kaya identity, we used a Neural ODE model to predict t...

01/06/2022  An exploratory experiment on Hindi, Bengali hate-speech detection and transfer learning using neural networks
This work presents our approach to train a neural network to detect hate...

03/11/2022  Leveraging universality of jet taggers through transfer learning
A significant challenge in the tagging of boosted objects via machine-le...

05/19/2023  Tune-Mode ConvBN Blocks For Efficient Transfer Learning
Convolution-BatchNorm (ConvBN) blocks are integral components in various...
