Self-Training and Multi-Task Learning for Limited Data: Evaluation Study on Object Detection

09/12/2023
by   Hoang-An Le, et al.

Self-training allows a network to learn from the predictions of a more complicated model and thus typically requires a well-trained teacher model and a mixture of teacher and student data, while multi-task learning jointly optimizes different targets to learn salient interrelationships and requires multi-task annotations for each training example. Despite being particularly data-demanding, both frameworks offer potential for data exploitation if these assumptions can be relaxed. In this paper, we compare self-training for object detection under a deficiency of teacher training data, where students are trained on examples unseen by the teacher, with multi-task learning on partially annotated data, i.e. a single task annotation per training example. Both scenarios have their own limitations but can be helpful when annotated data are limited. Experimental results show improved performance when a weak teacher with unseen data is used to train a multi-task student. Despite the limited setup, we believe the results demonstrate the potential of multi-task knowledge distillation and self-training, which could be beneficial for future study. Source code is available at https://lhoangan.github.io/multas.
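The two regimes compared in the abstract can be illustrated with a minimal sketch, assuming a PyTorch setup. This is not the authors' implementation: the TinyDetector model, the hard pseudo-labeling, and the specific loss choices are placeholder assumptions. It shows (1) a student distilling from a weak teacher's predictions on images the teacher never saw, and (2) a multi-task loss that is masked per example when only one task's annotation is available.

import torch
import torch.nn as nn

class TinyDetector(nn.Module):
    """Placeholder detector: per-image class scores (task 1) and box offsets (task 2)."""
    def __init__(self, num_classes=20):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())
        self.cls_head = nn.Linear(16, num_classes)  # classification head
        self.box_head = nn.Linear(16, 4)            # box-regression head

    def forward(self, x):
        feat = self.backbone(x)
        return self.cls_head(feat), self.box_head(feat)

teacher = TinyDetector().eval()   # weak teacher, assumed trained on a small split
student = TinyDetector()          # student, trained on data unseen by the teacher
opt = torch.optim.SGD(student.parameters(), lr=1e-3)
cls_loss = nn.CrossEntropyLoss()
box_loss = nn.SmoothL1Loss()

def self_training_step(images):
    """Student learns from the teacher's predictions (pseudo-labels)."""
    with torch.no_grad():
        t_cls, t_box = teacher(images)
        pseudo_labels = t_cls.argmax(dim=1)  # hard pseudo-labels (an assumption)
    s_cls, s_box = student(images)
    loss = cls_loss(s_cls, pseudo_labels) + box_loss(s_box, t_box)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

def partial_multitask_step(images, labels, boxes, has_cls, has_box):
    """Each example carries annotations for only one task; mask the missing losses."""
    s_cls, s_box = student(images)
    loss = torch.zeros((), requires_grad=True)
    if has_cls.any():
        loss = loss + cls_loss(s_cls[has_cls], labels[has_cls])
    if has_box.any():
        loss = loss + box_loss(s_box[has_box], boxes[has_box])
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# Usage with dummy data:
imgs = torch.randn(4, 3, 64, 64)
print(self_training_step(imgs))
print(partial_multitask_step(
    imgs,
    labels=torch.randint(0, 20, (4,)),
    boxes=torch.rand(4, 4),
    has_cls=torch.tensor([1, 1, 0, 0], dtype=torch.bool),
    has_box=torch.tensor([0, 0, 1, 1], dtype=torch.bool)))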


Related research

Attentive Student Meets Multi-Task Teacher: Improved Knowledge Distillation for Pretrained Models (11/09/2019)
Multi-Task Self-Training for Learning General Representations (08/25/2021)
Anomaly Detection in Video via Self-Supervised and Multi-Task Learning (11/15/2020)
Multitask Emotion Recognition Model with Knowledge Distillation and Task Discriminator (03/24/2022)
Neural Multi-Task Learning for Teacher Question Detection in Online Classrooms (05/16/2020)
Gradient Adversarial Training of Neural Networks (06/21/2018)
1Cademy @ Causal News Corpus 2022: Leveraging Self-Training in Causality Classification of Socio-Political Event Data (11/04/2022)
