KDCTime: Knowledge Distillation with Calibration on InceptionTime for Time-series Classification

12/04/2021
by Xueyuan Gong, et al.

Time-series classification approaches based on deep neural networks are prone to overfitting on the UCR datasets, which is caused by the few-shot nature of those datasets. Therefore, to alleviate overfitting and thereby further improve accuracy, we first propose Label Smoothing for InceptionTime (LSTime), which exploits the information in soft labels rather than hard labels alone. Next, instead of adjusting soft labels manually as in LSTime, Knowledge Distillation for InceptionTime (KDTime) is proposed to generate soft labels automatically via a teacher model. Finally, to rectify incorrectly predicted soft labels from the teacher model, Knowledge Distillation with Calibration for InceptionTime (KDCTime) is proposed; it contains two optional calibrating strategies, i.e., KDC by Translating (KDCT) and KDC by Reordering (KDCR). Experimental results show that the accuracy of KDCTime is promising, while its inference time is two orders of magnitude faster than ROCKET's, with an acceptable training-time overhead.
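The label-smoothing and distillation objectives the abstract builds on are standard techniques. As a rough illustration (a minimal NumPy sketch of generic label smoothing and a Hinton-style distillation loss, not the paper's exact LSTime/KDTime formulation; all function names and hyperparameter values here are illustrative assumptions):

```python
import numpy as np

def smooth_labels(hard_labels, num_classes, eps=0.1):
    # Label smoothing: turn one-hot hard targets into soft targets.
    # The true class keeps 1 - eps; the remaining eps mass is spread
    # uniformly over all classes (illustrative, not LSTime's exact scheme).
    onehot = np.eye(num_classes)[hard_labels]
    return onehot * (1.0 - eps) + eps / num_classes

def softmax(z, T=1.0):
    # Temperature-scaled softmax; higher T gives softer distributions.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, hard_labels, T=2.0, alpha=0.5):
    # Generic knowledge-distillation loss: a weighted sum of
    # (a) cross-entropy with the hard labels and
    # (b) KL divergence to the teacher's temperature-softened soft labels.
    # A sketch of the standard objective, not KDTime's exact loss.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)),
                axis=-1).mean()
    p_hard = softmax(student_logits)
    ce = -np.log(p_hard[np.arange(len(hard_labels)), hard_labels]
                 + 1e-12).mean()
    # The T**2 factor keeps gradient magnitudes comparable across temperatures.
    return alpha * ce + (1.0 - alpha) * (T ** 2) * kl
```

For example, `smooth_labels(np.array([0]), 3, eps=0.3)` yields `[[0.8, 0.1, 0.1]]`: the hard label is softened but still dominated by the true class.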

