Improving Neural ODEs via Knowledge Distillation

03/10/2022
by Haoyu Chu, et al.

Neural Ordinary Differential Equations (Neural ODEs) construct the continuous dynamics of hidden units using ordinary differential equations specified by a neural network, demonstrating promising results on many tasks. However, Neural ODEs still do not perform well on image recognition tasks. A possible reason is that the one-hot encoding vectors commonly used to train Neural ODEs cannot provide enough supervised information. We propose a new training approach based on knowledge distillation to construct more powerful and robust Neural ODEs for image recognition tasks. Specifically, we model the training of Neural ODEs as a teacher-student learning process, in which we adopt ResNets as the teacher model to provide richer supervised information. The experimental results show that the new training manner can improve the classification accuracy of Neural ODEs by 24%. We also quantitatively discuss the effect of both knowledge distillation and the time horizon of Neural ODEs on robustness against adversarial examples. The experimental analysis concludes that introducing knowledge distillation and increasing the time horizon can improve the robustness of Neural ODEs against adversarial examples.
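The teacher-student setup described above can be made concrete with a short sketch. The code below is a minimal, illustrative PyTorch example, not the authors' implementation: it assumes a standard Hinton-style distillation loss (cross-entropy on hard labels plus a temperature-scaled KL term toward the frozen ResNet teacher's soft targets) and a toy Neural ODE student integrated with a fixed-step Euler solver. Names such as ODEClassifier, distillation_loss, train_step, alpha, tau, and time_horizon are assumptions chosen for illustration.

# Minimal sketch of teacher-student training for a Neural ODE classifier.
# Assumed/illustrative names: ODEClassifier, distillation_loss, train_step,
# alpha, tau, time_horizon. Not the paper's official code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ODEFunc(nn.Module):
    """Dynamics f(h, t) of the hidden state, parameterised by a small conv net."""
    def __init__(self, channels=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, t, h):
        return self.net(h)

class ODEClassifier(nn.Module):
    """Neural ODE student: encoder -> fixed-step Euler ODE block -> linear head."""
    def __init__(self, num_classes=10, channels=64, time_horizon=1.0, steps=10):
        super().__init__()
        self.encoder = nn.Conv2d(3, channels, 3, padding=1)
        self.odefunc = ODEFunc(channels)
        self.head = nn.Linear(channels, num_classes)
        self.time_horizon = time_horizon  # length of the integration interval [0, T]
        self.steps = steps

    def forward(self, x):
        h = self.encoder(x)
        dt = self.time_horizon / self.steps
        t = 0.0
        for _ in range(self.steps):  # explicit Euler integration of dh/dt = f(h, t)
            h = h + dt * self.odefunc(t, h)
            t += dt
        h = F.adaptive_avg_pool2d(h, 1).flatten(1)
        return self.head(h)

def distillation_loss(student_logits, teacher_logits, labels, alpha=0.5, tau=4.0):
    """Hard-label cross-entropy plus temperature-scaled KL to the teacher's soft targets."""
    hard = F.cross_entropy(student_logits, labels)
    soft = F.kl_div(
        F.log_softmax(student_logits / tau, dim=1),
        F.softmax(teacher_logits / tau, dim=1),
        reduction="batchmean",
    ) * tau * tau
    return alpha * hard + (1.0 - alpha) * soft

def train_step(student, teacher, optimizer, images, labels):
    """One update: the frozen ResNet teacher supplies soft targets, the ODE student learns."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(images)
    student_logits = student(images)
    loss = distillation_loss(student_logits, teacher_logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In this sketch, alpha balances the hard-label loss against the teacher's soft targets, and time_horizon lengthens the interval over which the ODE block is integrated, which is the quantity the abstract connects to robustness against adversarial examples.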


