On the Query Strategies for Efficient Online Active Distillation

09/04/2023
by   Michele Boldo, et al.

Deep Learning (DL) requires large amounts of time and data, resulting in high computational demands. Recently, researchers have employed Active Learning (AL) and online distillation to improve training efficiency and enable real-time model adaptation. This paper evaluates a set of query strategies to achieve the best training results. It focuses on Human Pose Estimation (HPE) applications, assessing the impact of the frames selected during training using two approaches: a classical offline method and an online evaluation through a continual learning approach employing knowledge distillation, on a popular state-of-the-art HPE dataset. The paper demonstrates the feasibility of training lightweight models at the edge and adapting them effectively to new contexts in real time.
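To make the idea concrete, below is a minimal sketch of one online active-distillation step: an uncertainty-based query strategy picks the incoming frames on which a lightweight student is least confident, and the student is then updated against a frozen teacher's heatmaps. This is an illustration under stated assumptions, not the paper's actual method; the models, the peak-confidence query criterion, the MSE distillation loss, and all names (`query_least_confident`, `distill_step`, etc.) are hypothetical.

```python
# Illustrative sketch only: uncertainty-query + heatmap distillation for HPE.
# All model definitions and function names are assumptions, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def query_least_confident(student, frames, k):
    """Uncertainty query strategy: keep the k frames whose predicted
    keypoint heatmaps have the lowest average peak confidence."""
    with torch.no_grad():
        heatmaps = student(frames)                     # (N, J, H, W)
        peak_conf = heatmaps.amax(dim=(2, 3)).mean(1)  # mean peak over joints
    idx = torch.topk(-peak_conf, k).indices            # least confident first
    return frames[idx]

def distill_step(teacher, student, optimizer, frames):
    """One online distillation update: regress the student's heatmaps
    onto the frozen teacher's heatmaps (MSE is a common choice in HPE KD)."""
    with torch.no_grad():
        target = teacher(frames)
    loss = F.mse_loss(student(frames), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy usage with stand-in convnets producing J=17 joint heatmaps.
if __name__ == "__main__":
    J = 17
    teacher = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                            nn.Conv2d(32, J, 3, padding=1), nn.Sigmoid())
    student = nn.Sequential(nn.Conv2d(3, J, 3, padding=1), nn.Sigmoid())
    for p in teacher.parameters():
        p.requires_grad_(False)                # teacher stays frozen
    opt = torch.optim.Adam(student.parameters(), lr=1e-3)

    stream = torch.rand(32, 3, 64, 64)         # a batch of incoming frames
    selected = query_least_confident(student, stream, k=8)
    print("distill loss:", distill_step(teacher, student, opt, selected))
```

The design point the sketch captures is that only the queried subset of the stream reaches the optimizer, which is what keeps on-device training cheap enough to run in real time.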

